MICROSOFT CORPORATION | Defense Customer Use of Microsoft Technology

AGM date
Proposal number
Resolution details
Company ticker
Resolution ask
Conduct due diligence, risk or impact assessment
ESG theme
  • Social
ESG sub-theme
  • Conflict
  • Human rights & inequality
  • Product responsibility / Privacy
Company sector
Company HQ country
United States
Resolved clause
Resolved, that the board commission an independent report to assess whether governmental customer use of Microsoft’s technology, including defense contract use, does or can contribute to violations of privacy, civil and human rights, and conflicts with the policies and principles set forth in Microsoft’s CSR Report and other public disclosures.
Whereas clause
Whereas shareholders are concerned about potential harms to the company and society as set forth above, we urge shareholders to vote in favor of the following resolution:
Supporting statement
One-third of managed assets incorporate sustainability criteria, representing $17 trillion in investments.1 More responsible fund dollars are invested in Microsoft than in any other single company.2 Microsoft advances the UN Sustainable Development Goals, established principles for responsible use of artificial intelligence (“AI”), and recognizes privacy as a “fundamental human right.”3

Microsoft demands the same ethics in its supply chain: “When it comes to labor and human rights, we leave no doubt as to the standards we expect. Our standards apply to all our suppliers.”4 However, military customers of Microsoft’s products and services may use the company’s technology in ways that conflict with Microsoft’s policies or otherwise raise concerns. Microsoft has established principles for responsible use of AI; however, these principles do not cover all products and services related to military contracts, leaving unaddressed the company’s operational exposure to human rights violations potentially involved in military operations and missions.

Risks in working with government customers on military contracts include weaponization of the company’s technology, supplanting human decision-making with artificial intelligence, and using its products to gamify warfare and for surveillance. Absence of standards can result in privacy, civil, and human rights violations and can circumvent legal requirements, including international legal standards for warfare. These harms disproportionately impact the rights of people of color, activists, and immigrants, and the mental health and suicide rates of government personnel and contractors, service members, and veterans.

Contracts that raise concerns:

• Microsoft’s HoloLens product recently moved from a prototype tested by the U.S. Army to a $21 billion Integrated Visual Augmentation System (IVAS) production contract for a military version to enable enhanced vision using AI-powered technology. The Army intends its close-combat lethality units to use IVAS in warfare to achieve “overmatch” against enemy forces.

• The prospective, 10-year, $10 billion contract for the Joint Enterprise Defense Infrastructure (“JEDI”) with the Department of Defense will provide cloud services to assist with development of AI capabilities to operationalize warfare.

In addition to brand, reputational, and financial risk, company employees have protested the conversion of their work products into tools of war and surveillance.5 Leaving these concerns unaddressed impacts employee morale, compromises work quality and productivity, and hamstrings recruitment efforts.