SAIL Media

Arcee AI goes all-in on open models built in the U.S.

Interconnects interview #16 to celebrate the release of Trinity Large.

Nathan Lambert
Jan 29, 2026
This post originally appeared in Interconnects.

“Trinity Large is the first publicly shared training run at this scale on Nvidia Blackwell—400B total parameters, 13B active, trained to 17 trillion tokens.”

Arcee AI is the startup I’ve found taking the most realistic approach to monetizing open models. With plenty of past experience (and revenue) post-training open models for specific customer domains, they realized they needed to both prove themselves and fill a niche by pretraining larger, higher-performance open models built in the U.S.A. They’re among the people most eagerly answering my call to action for The ATOM Project, and I’ve quickly become friends with them.

Now, they’re releasing their flagship model — Trinity Large — as the culmination of this pivot. In anticipation of this release, I sat down with their CEO Mark McQuade, CTO Lucas Atkins, and pretraining lead Varun Singh for a wide-ranging conversation.
