The Pentagon has prohibited contractors from using Anthropic technology in projects related to the US military.

Published On Fri, 06 Mar 2026
Siddharth Malhotra

The Pentagon has officially labeled artificial intelligence company Anthropic as a “supply chain risk,” restricting the use of its technology in work related to the US military. According to a source, the company’s AI tools had been used in support of military operations, including activities linked to Iran. Anthropic confirmed the designation on Thursday, saying it takes effect immediately and prevents government contractors from using the company’s technology in projects for the US Department of Defense.

Anthropic’s CEO Dario Amodei clarified that the restriction is limited in scope. Contractors may still use the company’s AI system, Claude, for projects that are not connected to Pentagon contracts. He emphasized that the rule applies only when Claude is used directly as part of agreements with the Defense Department.

The decision follows months of disagreements between Anthropic and the Pentagon regarding the company’s insistence on maintaining strict safeguards on its AI technology. The Defense Department, which the Donald Trump administration refers to as the Department of War, argued that some of these protections were too restrictive. Amodei stated that Anthropic plans to challenge the designation through legal channels.

Amodei also revealed that the company and Pentagon officials recently discussed possible ways to wind down the military's use of Claude under current arrangements, while exploring whether Anthropic could continue cooperating with the military with its safety restrictions intact. Pentagon Chief Technology Officer Emil Michael later wrote on the platform X that the Defense Department is not currently negotiating with Anthropic.

Separately, Amodei apologized for an internal memo published by the tech news outlet The Information, in which he suggested that some Pentagon officials were unhappy with the company partly because it had not offered strong praise for Trump. The memo's release reportedly created concern among Anthropic's investors, who attempted to manage the fallout from the dispute.

The Defense Department did not immediately comment on the situation. The move is seen as a rare and serious action by the US government against a domestic technology company that had previously worked closely with the Pentagon. Despite the designation, sources say the military continues to rely on Anthropic’s AI for certain operational tasks. The company’s Claude system is believed to help analyze intelligence and assist with planning military operations.

A spokesperson for Microsoft said the company's legal team reviewed the decision and concluded that Anthropic's products, including Claude, can still be offered to customers outside the Defense Department through services such as M365, GitHub, and Microsoft's AI Foundry platform. Another major Anthropic investor, Amazon, did not immediately respond to requests for comment. Meanwhile, Maven Smart System, a software platform developed by Palantir Technologies, reportedly uses workflows and prompts built with Claude for intelligence analysis and targeting tasks.

Anthropic had previously been one of the most proactive AI companies seeking collaboration with US national security agencies. However, tensions have grown in recent months over how the military intends to use the company’s technology on the battlefield. The company has refused to remove restrictions that prevent its AI from being used for autonomous weapons or large-scale domestic surveillance. The Pentagon has argued that it should be allowed to use the technology as long as it complies with US law. With the new designation, Anthropic has been placed in a category that Washington has historically applied to foreign rivals. Similar action was previously taken against the Chinese telecommunications giant Huawei, which was removed from Pentagon supply chains over security concerns.
