Sun, Mar 2026

OpenAI’s Sam Altman announces Pentagon deal

OpenAI strikes a deal with the Pentagon just hours after Trump orders the end of Anthropic contracts, and hours after a staff all-hands

In an X post on Friday night, Sam Altman said that his company, OpenAI, had "reached an agreement with the Department of War to deploy our models in their classified network." The timing, both shocking and consequential, makes Altman something of a poster boy for AI at war.

This comes after a high-profile confrontation between OpenAI's rival Anthropic and the Department of Defense, renamed the Department of War under the Trump administration. The Pentagon pressed AI companies, including Anthropic, to permit the use of their models for "any lawful purposes," while Anthropic sought to draw a boundary at fully autonomous weaponry and widespread domestic surveillance.

Anthropic had "never raised objections to particular military operations nor attempted to limit use of our technology in an ad hoc manner," CEO Dario Amodei said in a lengthy statement issued Thursday. However, he contended that "in a narrow set of cases, we believe AI can undermine, rather than defend, democratic values."

Just hours earlier, Anthropic had been informed that the Pentagon was classifying its products as a "supply-chain risk to national security," a kind of blacklisting. The Pentagon deemed intolerable Anthropic's declared "red lines" against the use of its technology for fully autonomous weapons and mass surveillance. As a result, no business that collaborates with the Pentagon in any way "may conduct any commercial activity with Anthropic," according to Secretary of War Pete Hegseth.

Anthropic said on Friday that it would "challenge any supply chain risk designation in court" but that it had "not yet received direct communication from the Department of War or the White House on the status of our negotiations."

In an unexpected post on X, Altman asserted that OpenAI's new defense contract contains safeguards that deal with the same problems that Anthropic found problematic. 


"Prohibitions on domestic mass surveillance and human responsibility for the use of force, including for autonomous weapon systems, are two of our most important safety principles," Altman stated. "The DoW upholds these values, incorporates them into our agreement, and reflects them in law and policy."

In addition to deploying engineers with the Pentagon "to help with our models and to ensure their safety," Altman stated that OpenAI "will build technical safeguards to ensure our models behave as they should, which the DoW also wanted." 

Altman continued, "We are asking the DoW to offer these same terms to all AI companies, which we think everyone should be willing to accept." "We have made it clear that we strongly want to see things de-escalate away from governmental and legal actions and toward reasonable agreements."

In theory, Anthropic's loss is Sam Altman's gain. For context, the sheer fact that Anthropic was founded essentially as a spin-off from OpenAI, purportedly committed to safety and ethical norms that Amodei and his colleagues believed OpenAI had failed to uphold, is itself a slight to Altman. The Super Bowl commercials in which Anthropic disparaged OpenAI hardly suggest a friendly rivalry. The enmity between Altman and Dario Amodei, Anthropic's founder and CEO, is barely disguised: the two conspicuously refused to shake hands at a photo opportunity for AI leaders in India earlier this month.
