SAIL Media

Dean Ball on open models and government control

Subtle precedents on the future of open models set by the unfolding Anthropic v. Department of War case.

Nathan Lambert and Dean W. Ball
Mar 06, 2026
∙ Paid
This post originally appeared in Interconnects.

Watching history unfold between Anthropic and the Department of War (DoW), it has been obvious to me that this could be a major turning point in perspectives on open models, though one that will take years to become clear. As AI becomes more powerful, existing power structures will grapple with their roles relative to the companies that build it. Some in the open-model community frame this as “not your weights, not your brain,” but the issue becomes much bigger once governments reach the same realization.

If AI is the most powerful technology, why would any global entity let a single U.S. company (or government) control their relationship to it?

I got Dean W. Ball of the great Hyperdimensional newsletter onto the SAIL Media weekly Substack live to discuss this. In the end, we agree that the DoW’s recent actions, especially its designation of Anthropic as a supply chain risk (a designation Dean and I both vehemently disagree with), point to open models being the stable equilibrium for power centers over the next 5-10 years.

A guest post by
Dean W. Ball
I write about AI, emerging technology, and the future of governance.