Pentagon ban on Anthropic faces judge; Claude AI maker seeks injunction
Anthropic seeks legal relief from Pentagon's AI ban

Anthropic is challenging the Pentagon's recent ban on its Claude AI models in federal court, seeking a preliminary injunction to block enforcement of a supply chain risk designation and a directive from former President Trump that bars federal agencies from using its technology. The company argues that without this relief it risks losing billions of dollars in contracts and suffering reputational damage, while the Pentagon maintains it does not use the AI for controversial purposes.
Key Takeaways
1. Anthropic claims it could lose billions in business without the injunction.
2. The Pentagon's designation of Anthropic as a supply chain risk is unprecedented for a U.S. company.
3. Former President Trump ordered federal agencies to cease using Anthropic's technology.