Meta (FACEBOOK, INC.) | Report on enforcement of community standards and user content at Meta (FACEBOOK, INC.)

Status
7.16% votes in favour
AGM date
Previous AGM date
Proposal number
10
Resolution details
Company ticker
FB
Lead filer
Resolution ask
Report on or disclose
ESG theme
  • Social
ESG sub-theme
  • Conflict and/or violence
  • Digital rights
Type of vote
Shareholder proposal
Filer type
Shareholder
Company sector
Technology
Company HQ country
United States
Resolved clause
Shareholders request the Board, at reasonable expense and excluding proprietary or legally privileged information, prepare and publish a report analyzing why the enforcement of “Community Standards” as described in the “Transparency Center” has proven ineffective at controlling the dissemination of user content that contains or promotes hate speech, disinformation, or content that incites violence and/or causes harm to public health or personal safety.
Whereas clause
The Meta (formerly Facebook) brand has continued to be wracked by management missteps and lack of Board oversight, resulting in continued harm from its platforms, including:
- Millions of high-profile users exempted from its rules,[1] permitting continued widespread incitement of violence and harassment;
- Internal Company research demonstrating that Instagram harms teenage girls;[2]
- Mental health crises among outsourced moderators[3] due to viewing child pornography and animal cruelty;
- Lack of cooperation with authorities to prevent and detect child exploitation and abuse;[4]
- The spread of election misinformation despite clear warnings;[5]
- The amplification of political advertisements containing deliberate lies and mistruths;[6],[7]
- Hate speech that continues to thrive;[8]
- Anti-immigrant violence[9] around the world; and
- Lax enforcement of age requirements in the Company’s metaverse platforms, despite evidence that the metaverse is deeply harmful to children’s cognitive development.[10]

Meta has the technological solutions to stop these types of abuses but chooses not to deploy them. A 2021 whistleblower complaint filed with the Securities and Exchange Commission[11] argues the Company has failed to adequately warn investors about the material risks of dangerous and criminal behavior, terrorist content, hate speech, and misinformation on its sites. The Company’s failure to control these activities reflects a grave lack of oversight by management and the Board. Despite the establishment of an internal Oversight Board, the Company’s platforms continue to harm society and users and create investor risk. An internal review of Company practices highlighting harassment and incitement to violence states, “We are not actually doing what we say we do publicly,” and deems the Company’s actions “a breach of trust.”
Management has attempted to address the material risk of dangerous user content through the creation of its “Transparency Center,”[13] which displays qualitative and quantitative reports on the elimination of posts violating one of the 25 “Community Standards.” Shareholders applaud this action, yet it appears to be ineffective given ongoing harms.
Supporting statement
Proponent suggests the report include, for each of Meta’s products (including Facebook, Messenger, Instagram, WhatsApp, and others with over 100 million users):
- A quantitative and qualitative assessment by external, independent, and qualified experts of the effectiveness of Meta’s algorithms, staff, and contractors to locate and eliminate content violating Community Standards;
- Examination of benefits to users and impact on revenue if the Company voluntarily follows existing legal frameworks established for broadcast networks (e.g., laws governing child pornography and political advertisements); and
- Analysis of the benefits of the Company continuing to conduct technology

How other organisations have declared their voting intentions

Organisation name | Declared voting intentions | Rationale
EFG Asset Management | For | A vote FOR this proposal is warranted. Shareholders would benefit from increased transparency and disclosure on how the company is managing material risks related to misinformation and harmful content.
Rothschild & Co Asset Management | For |

DISCLAIMER: By including a shareholder resolution or management proposal in this database, neither the PRI nor the sponsor of the resolution or proposal is seeking authority to act as proxy for any shareholder; shareholders should vote their proxies in accordance with their own policies and requirements.

Any voting recommendations set forth in the descriptions of the resolutions and management proposals included in this database are made by the sponsors of those resolutions and proposals, and do not represent the views of the PRI.

Information on the shareholder resolutions, management proposals and votes in this database has been obtained from sources that are believed to be reliable, but the PRI does not represent that it is accurate, complete, or up to date, including information relating to resolutions and management proposals, other signatories’ vote pre-declarations (including voting rationales), or the current status of a resolution or proposal. You should consult companies’ proxy statements for complete information on all matters to be voted on at a meeting.