Meta (Facebook, Inc.) | Report on child safety impacts and actual harm reduction

Status
16.28% votes in favour
AGM date
Previous AGM date
Proposal number
11
Resolution details
Company ticker
FB
Lead filer
Resolution ask
Report on or disclose
ESG theme
  • Social
ESG sub-theme
  • Digital rights
Type of vote
Shareholder proposal
Filer type
Shareholder
Company sector
Technology
Company HQ country
United States
Resolved clause
Resolved: Shareholders request that, within one year, the Board of Directors adopts targets and publishes annually a report (prepared at reasonable expense, excluding proprietary information) that includes quantitative metrics appropriate to assessing whether Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms.
Supporting statement
The internet was not developed with children in mind. Social media impacts children’s brains differently than adult brains.1 It also poses physical and psychological risks that many children and teens are unprepared for, including sextortion and grooming, hate group recruitment, human trafficking (for any means), cyberbullying and harassment, exposure to sexual or violent content, invasion of privacy, self-harm content, and financial scams, among others.
Meta is the world’s largest social media company, with billions of children and teen users. Meta’s platforms, including Facebook, Instagram, Messenger and WhatsApp, have been linked to numerous child safety impacts and social policy challenges, including:
Mental Health: Meta’s own company research showed Instagram’s negative impacts on teens’ self-image, increased rates of depression and anxiety, and a link to suicidal thoughts. The Wall Street Journal concluded that these Instagram documents revealed “Facebook has made minimal efforts to address these issues and plays them down in public.”2
Sexual Exploitation: In 2021, nearly 29 million cases of online child sexual abuse material were reported; nearly 27 million of those (92 percent) stemmed from Meta platforms, including Facebook, WhatsApp, Messenger and Instagram.3 A Forbes report on Instagram pedophiles described Instagram as “a marketplace for sexualized images of children.”4
Cyberbullying: Time Magazine reported that “By one estimate, nearly 80% of teens are on Instagram and more than half of those users have been bullied on the platform.”5 A UK study found that Instagram accounted for 42 percent of online bullying, followed by Facebook with 39 percent.6
Data Privacy: In September 2022, Meta was fined over $400 million for failing to safeguard children’s information on Instagram.7
Legislative Response: There is bipartisan Congressional support for the Kids Online Safety Act, which would require companies to “act in kids’ best interests and prevent or mitigate the risk of certain harms including suicide, eating disorders and substance abuse.”8 The UK Online Safety Bill aims to keep internet users safe from fraudulent and harmful content and to prevent children, in particular, from accessing damaging material.
The European Union’s Digital Services Act will make identifying, reporting and removing child sexual abuse material mandatory.9
Meta is facing increasing regulatory, reputational, and legal risks due to these unabated issues.
Meta states that it has no tolerance for child exploitation or bullying and is developing new child safety features for selected products and age groups. Yet, Meta has no publicly available, company-wide child safety or harm reduction performance targets for investors and stakeholders to judge the effectiveness of Meta’s announced tools, policies and actions.

1 https://www.apa.org/news/apa/2022/social-media-children-teens
2 https://www.wsj.com/articles/facebook-knows-instagram-is-toxic-for-teen-girls-company-documents-show-11631620739
3 https://www.missingkids.org/content/dam/missingkids/pdfs/2021-reports-by-esp.pdf
4 https://www.forbes.com/sites/thomasbrewster/2022/06/25/meta-is-having-a-tough-time-keeping-pedophiles-off-instagram/?sh=7c02cbf45765
5 https://time.com/5619999/instagram-mosseri-bullying-artificial-intelligence/
6 https://techjury.net/blog/cyberbullying-statistics/
7 https://www.cnet.com/news/privacy/meta-fined-400m-for-failing-to-protect-childrens-privacy-on-instagram/
8 https://www.cnbc.com/2022/02/16/new-bill-would-require-facebook-google-and-others-to-protect-children.html
9 https://www.nytimes.com/2022/04/28/opinion/social-media-facebook-transparency.html?smid=em-share

How other organisations have declared their voting intentions

Organisation name: EFG Asset Management
Declared voting intention: For
Rationale: A vote FOR this proposal is warranted, as additional disclosure on how the company measures and tracks metrics related to child safety on the company’s platforms would give shareholders more information on how well the company is managing related risks.

Organisation name: Rothschild & Co Asset Management
Declared voting intention: For

DISCLAIMER: By including a shareholder resolution or management proposal in this database, neither the PRI nor the sponsor of the resolution or proposal is seeking authority to act as proxy for any shareholder; shareholders should vote their proxies in accordance with their own policies and requirements.

Any voting recommendations set forth in the descriptions of the resolutions and management proposals included in this database are made by the sponsors of those resolutions and proposals, and do not represent the views of the PRI.

Information on the shareholder resolutions, management proposals and votes in this database has been obtained from sources believed to be reliable, but the PRI does not represent that it is accurate, complete, or up-to-date, including information relating to resolutions and management proposals, other signatories’ vote pre-declarations (including voting rationales), or the current status of a resolution or proposal. You should consult companies’ proxy statements for complete information on all matters to be voted on at a meeting.