Anthropic’s AI Security Row: Pentagon May Cut Ties With Firm Over Dispute, Report Says

By Anand Kumar, Senior Journalist, Global India Broadcast News

The Pentagon and San Francisco-based artificial intelligence firm Anthropic are locked in a dispute over the agency’s use of the company’s flagship AI model, Claude. At issue is Anthropic’s refusal to let the agency use Claude for “all lawful purposes” without security restrictions, and a Pentagon official now says the agency may cut ties with the company and declare it a “supply chain risk.”

Dario Amodei, co-founder and CEO of Anthropic, in Bangalore, India, on Monday, February 16. (Bloomberg)

A Pentagon official, speaking anonymously to Axios, revealed that the agency could cut ties and declare the company a “supply chain risk,” a designation that would effectively bar any company doing business with the Pentagon from maintaining a business relationship with Anthropic.

“It’s going to be a huge pain in the ass, and we’re going to make sure they pay the price for forcing our hand in this way,” the official told Axios, adding that the Pentagon is “close” to cutting ties.

However, there are logistical obstacles to such a decision: Claude is the only AI model currently deployed on the US military’s classified systems and has been widely praised for its effectiveness. To replace it, the Pentagon would have to sign new contracts with companies whose models are as capable as Claude.

Indeed, effectiveness appears to be the sticking point. Competing developers such as xAI, OpenAI and Google have already agreed to drop similar security restrictions, but their models have not yet been deployed in the military’s classified systems.

How cutting ties could affect Anthropic

The financial fallout would only marginally affect Anthropic. Axios reports that the deal in question is worth about $200 million in annual revenue, almost negligible next to the company’s roughly $14 billion in annual revenue. However, declaring Anthropic a “supply chain risk” could hurt the firm more broadly, as it would force other companies to sever their ties with it.

By all accounts, War Department officials are showing no signs of backing down, though Anthropic says the talks are moving in a “productive” direction.

“The War Department’s relationship with Anthropic is under review,” a War Department spokesman said. “Our nation needs our partners to be willing to help our warfighters win any fight. Ultimately, this is about the safety of our troops and the American people.”

Also Read: US Uses Anthropic’s Claude AI in Venezuela to Capture Nicolas Maduro: Report

What is the controversy about: Explained

The dispute remains unresolved after months of meetings between Anthropic and Pentagon officials over the conditions under which the military may use Claude. The Pentagon wants unrestricted use for “all lawful purposes,” while Anthropic CEO Dario Amodei has expressed concern about surveillance and privacy violations.

According to Axios, Anthropic wants terms in the deal that would prevent the military from conducting mass surveillance of Americans or developing weapons that can fire without human intervention.

Designating a company a “supply chain risk” is a drastic step usually reserved for foreign firms. For now, though, the Pentagon may have little choice but to wait: officials admit that other AI models are “simply lagging behind” when it comes to specialized military operations.
