FACEBOOK, INC. | Child Sexual Exploitation Online

AGM date
Proposal number
11
Resolution details
Company ticker
FB
Lead filer
Resolution ask
Conduct due diligence, risk or impact assessment
ESG theme
  • Social
ESG sub-theme
  • Health, safety and well being
  • Human rights & inequality
Company sector
Technology
Company HQ country
United States
Resolved clause
Shareholders request that the Board of Directors issue a report by February 2023 assessing the risk of increased sexual exploitation of children as the Company develops and offers additional privacy tools such as end-to-end encryption. The report should address potential adverse impacts to children (18 years and younger) and to the company’s reputation or social license, assess the impact of limits to detection technologies and strategies, and be prepared at reasonable expense and excluding proprietary/confidential information.
Whereas clause
Child sexual exploitation online (and Child Sexual Abuse Material, or CSAM) is an escalating threat to children worldwide. The exponential growth of CSAM is directly tied to the growth of social media and the increasing number of children online.1 In 2020, the National Center for Missing and Exploited Children (NCMEC) received 21.7 million reports of CSAM. Of these, 20.3 million reports – or 94 percent – stemmed from Facebook and its platforms, including Messenger and Instagram.2 This represents an increase of 28 percent over Facebook's nearly 17 million reports in 2019.

Facebook's plan to apply end-to-end encryption to all of its messaging platforms has set off a storm of criticism. Government agencies, law enforcement, and child protection organizations worldwide claim that it will cloak the actions of child predators, make children more vulnerable, and cause millions of CSAM incidents to go unreported.3 Facebook touts its leadership in combating CSAM, yet NCMEC estimates that Facebook's end-to-end encryption plans could effectively make 70 percent of CSAM cases invisible. Facebook's encryption plan takes on more urgency as COVID has led to a significant increase in CSAM and grooming activity.4 Facebook whistleblower Frances Haugen said Facebook's efforts to remove CSAM were inadequate and under-resourced.5

Monika Bickert, Facebook's Vice President of Global Policy Management, testifying in the British House of Commons, was asked how many CSAM cases would disappear if the company implements end-to-end encryption. Ms. Bickert replied that she did not know, but that if it's content we cannot see, then it's content we cannot report.6 More than 120 child protection organizations wrote to Facebook in a letter saying its encryption plans present an unacceptable risk to children and would arguably make its services unsafe.7

Law enforcement leaders worldwide rely heavily on Facebook's tips to pursue online child predators, and have contacted Facebook with concerns that its encryption plan would leave them unable to track millions of CSAM cases and would make it harder to identify both victims and abusers.8,9 The U.S., UK and other countries have proposed legislation under which companies could lose civil liability protections for CSAM, making it easier to sue platforms that knowingly facilitate child sex trafficking and exploitation.10,11,12 In 2020, 79 percent of U.S. underage sex trafficking victims recruited online were recruited through Facebook or Instagram.13,14

The proponents support online privacy. But, like many others, our concern is that privacy should not come at the cost of child safety, or at the cost of potential regulatory, reputational and legal risk to Facebook.