Lack of Investment in Content Moderation in the Global Majority at Meta (FACEBOOK, INC.)

Status
Filed
Previous AGM date
Resolution details
Company ticker
FB
Lead filer
Resolution ask
Report on or disclose
ESG theme
  • Social
ESG sub-theme
  • Digital rights
  • Human rights
Type of vote
Shareholder proposal
Filer type
Shareholder
Company sector
Technology
Company HQ country
United States
Resolved clause
RESOLVED: Shareholders request that Meta Platforms Inc. (“Meta”) report to shareholders on the effectiveness of measures it is taking to prevent and mitigate human rights risks in its five largest non-US markets (based on number of users) relating to the proliferation of hate speech, disinformation, and incitement to violence enabled by its Instagram and Facebook platforms. The report should be issued no later than June 1, 2025, prepared at reasonable cost, omitting proprietary and confidential information (including information specifically relevant to litigation or legal enforcement action).
Whereas clause
WHEREAS: The dissemination of hatred that incites discrimination, hostility or violence violates international human rights standards1. Where content moderation systems have failed to effectively detect divisive content in non-English languages, there has been an associated increase in hate speech2, disinformation3, and incitement to violence. Meta’s stakeholders and the public have repeatedly raised significant concerns regarding what appears to be an obvious lack of proportionate investment in content moderation resources and expertise in Meta’s global majority markets. This issue, repeatedly flagged by reports from international organizations4, its own Oversight Board5 and CSOs6, is critical in Meta’s non-English-speaking countries. This apparent lack of adequate resources and investment in content moderation is increasingly critical with the 2024 super election year and an estimated 2.6 billion people7 taking to the polls globally. Media reports suggest Meta is putting in place advertising-related mitigations8 relating to the US elections. However, Meta has not published any measures to address such issues in non-Western, non-English-speaking markets, which, given the current inadequacy of effective content moderation, are more vulnerable to the proliferation of hate speech, disinformation, and incitement to violence on its platforms.
We commend Meta’s first transparency reports on Instagram and Facebook required under the EU Digital Services Act, providing detailed information on numbers of content moderators in local languages and overall users per EU country. Given Meta now appears to have the required data collection and reporting infrastructure to provide such detailed reporting on individual countries, the company should expand these transparency measures to key markets like India and Brazil9 on a disaggregated basis to demonstrate the actual investment made to build multilingual capacity in content moderation. By doing so, Meta can address the persistent human rights risks which can have, and have had, a negative impact on brand value and, indirectly, on its advertising revenue, as well as on diversified investment portfolios as viewed through a universal ownership lens.
Proponent suggests the report include data on the number of content moderators fluent in local languages in Instagram and Facebook’s five largest non-US markets based on number of users and an assessment by external, independent, and qualified experts of the effectiveness of Meta’s measures taken to meaningfully manage hateful content, disinformation, and incitement to violence on those platforms.
Supporting statement
1 https://www.un.org/en/hate-speech/united-nations-and-hate-speech/international-human-rights-law
2 https://www.bbc.com/news/world-africa-67275219
3 https://digital-strategy.ec.europa.eu/en/news/commission-sends-request-information-meta-under-digital-services-act
4 https://www.ohchr.org/sites/default/files/Documents/Issues/Opinion/Legislation/Case_2021_009-FB-UA.pdf
5 https://www.theguardian.com/technology/2022/dec/06/meta-protecting-business-partners
6 https://www.amnesty.org/en/latest/news/2023/10/meta-failure-contributed-to-abuses-against-tigray-ethiopia/
7 https://www.unesco.org/en/articles/ahead-super-election-year-unesco-appeals-governments-around-world-protect-journalists-rights
8 https://www.reuters.com/technology/meta-bar-political-advertisers-using-generative-ai-ads-tools-2023-11-06/
9 https://www.statista.com/statistics/578364/countries-with-most-instagram-users/ & https://www.digitalmarketingcommunity.com/indicators/instagram-active-users-penetrations-2018/

How other organisations have declared their voting intentions

Organisation name | Declared voting intentions | Rationale
Comgest | For | https://www.comgest.com/-/media/comgest/esg-library/esg-en/2024-proxy-voting-pre-declaration.pdf

DISCLAIMER: By including a shareholder resolution or management proposal in this database, neither the PRI nor the sponsor of the resolution or proposal is seeking authority to act as proxy for any shareholder; shareholders should vote their proxies in accordance with their own policies and requirements.

Any voting recommendations set forth in the descriptions of the resolutions and management proposals included in this database are made by the sponsors of those resolutions and proposals, and do not represent the views of the PRI.

Information on the shareholder resolutions, management proposals and votes in this database has been obtained from sources that are believed to be reliable, but the PRI does not represent that it is accurate, complete, or up-to-date, including information relating to resolutions and management proposals, other signatories’ vote pre-declarations (including voting rationales), or the current status of a resolution or proposal. You should consult companies’ proxy statements for complete information on all matters to be voted on at a meeting.