Pentagon declares Anthropic a national security risk, banning every military contractor from working with the company immediately

Photo by Anthropic, licensed under CC BY-SA 2.0

Anthropic says it will fight the decision in court.

The Pentagon has declared leading AI company Anthropic a national security threat, blocking all federal agencies and military contractors from doing business with the company. This move by the Trump administration came after a tense week of negotiations between Anthropic and the Defense Department.

President Donald Trump criticized Anthropic on Truth Social, calling the company a “DISASTROUS MISTAKE” for trying to “STRONG-ARM the Department of War” and prioritize their terms of service over the Constitution. According to The Washington Post, Trump stated that Anthropic’s “selfishness is putting AMERICAN LIVES at risk, our Troops in danger, and our National Security in JEOPARDY.”

Defense Secretary Pete Hegseth then took to social media to declare Anthropic a “supply-chain risk,” stating: “Effective immediately, no contractor, supplier, or partner that does business with the United States military may conduct any commercial activity with Anthropic.”

The dispute comes down to Anthropic’s refusal to let the military use its AI without limits

At the heart of the conflict is Anthropic's refusal to grant the Pentagon unrestricted use of its AI system, Claude, for any purpose permitted by law. Anthropic insisted on protections against its technology being used for fully autonomous weapons or large-scale domestic surveillance.

Anthropic responded by saying in a blog post that it plans to fight the ban in court. The company believes the ban is not permitted by federal law and that the “supply-chain risk” label is “legally unsound.” Anthropic called it an “unprecedented action,” usually reserved for foreign adversaries, adding, “We are deeply saddened by these developments.”

The conflict escalated after Anthropic CEO Dario Amodei stated in a blog post that he would not give in to the Pentagon's demands. The Defense Department's technology chief, Emil Michael, responded by publicly attacking Amodei on social media, calling him a "liar with a god complex." The ban places a leading American AI company in the same category as Chinese and Russian firms often seen as threats to the United States.

President Trump said federal agencies have a six-month period to phase out Anthropic’s technology, and vaguely threatened “major civil and criminal consequences” if Anthropic doesn’t “get their act together.” However, the Pentagon has not fully explained the legal basis for compelling military contractors to immediately cut ties with Anthropic. Trump has previously used executive orders to reshape federal energy policy in similarly sweeping ways.

More than 550 employees at Google and OpenAI have signed an open letter supporting Anthropic's stance, urging their companies to stand up to the Pentagon. Meanwhile, OpenAI CEO Sam Altman announced that his company had reached a deal with the Defense Department.

Altman stated that OpenAI’s safety principles include “prohibitions on domestic mass surveillance and human responsibility for the use of force, including for autonomous weapon systems,” and that the Pentagon “agrees with these principles.”

Despite its focus on safety, Anthropic had been actively working with the military and intelligence agencies. The company was the first to work on classified government systems through a partnership with Amazon and Palantir, secured a $200 million Pentagon contract under Trump, and agreed to supply Claude to civilian agencies for just a dollar.

Experts warn this conflict could complicate the administration’s relationships with other AI developers, especially in defense. Gregory Allen of the Wadhwani AI Center noted that a previous fallout between the Pentagon and Silicon Valley over Google’s Project Maven took years to repair, warning, “You don’t want to pointlessly light that on fire.”
