Anthropic targets up to one million Google TPUs by 2026 for AI expansion
Anthropic is gearing up for a hardware push that is anything but small. By 2026 the company says it wants to lock in access to up to one million of Google's Tensor Processing Units, the custom chips that run many large language models today. The plan isn't just an upgrade; Anthropic talks about spending "tens of billions" of dollars, a sum the company expects to bring more than one gigawatt of new computing capacity online.
The timeline is straightforward: secure the TPUs, scale the racks, and have the capacity online in 2026. The sheer chip count points to a deeper reliance on Google's silicon, though it remains unclear how the money will be split between research, production, and safety work.
What we do know is that Anthropic's roadmap now leans on a hardware haul that dwarfs most AI expansions to date, and that alone could reshape how quickly the company moves.
Anthropic is preparing for a major expansion of its AI infrastructure, aiming to secure access to up to one million of Google's TPUs (Tensor Processing Units) by 2026. The company plans to invest "tens of billions" of dollars in this effort, which would add more than one gigawatt of new computing capacity. Anthropic reports serving over 300,000 business customers, and the number of large accounts—those spending more than $100,000 per year—has grown nearly sevenfold in the past year.
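For rough scale, the chip count and the power figure do line up under a simple back-of-envelope assumption. The sketch below assumes roughly 1 kW of facility power per deployed TPU, including cooling and overhead; that per-chip figure is an illustrative assumption, not a number Anthropic or Google has published.

```python
# Back-of-envelope check: how one million TPUs relates to "more than one gigawatt".
# The per-chip power draw is an illustrative assumption, not a published figure.

TPU_COUNT = 1_000_000      # stated target: access to up to one million TPUs by 2026
WATTS_PER_TPU = 1_000      # assumed ~1 kW per deployed chip, incl. cooling and overhead

total_gigawatts = TPU_COUNT * WATTS_PER_TPU / 1e9
print(f"Implied capacity: about {total_gigawatts:.1f} GW")  # -> about 1.0 GW
```

A higher or lower per-chip estimate shifts the total accordingly, but anything in the hundreds-of-watts-to-low-kilowatts range lands the fleet near the gigawatt mark the company cites.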
Krishna Rao, Anthropic's CFO, calls the relationship with Google a "longstanding partnership" that will deepen as the company expands its footprint. As the demand for Claude increases, so does the need for more computing power. "This expansion will help us serve this rapidly growing customer demand," Anthropic writes.
Chip partnerships with Amazon and Google
Even with this new capacity from Google, Amazon remains Anthropic's "primary training partner and cloud provider." Through Project Rainier, Anthropic already has access to a large compute cluster made up of hundreds of thousands of AI chips across multiple US data centers. The company also plans to invest in additional compute resources beyond Google.
Anthropic says it will pour tens of billions of dollars into securing up to a million Google TPUs by 2026, enough to push its compute capacity past the one-gigawatt mark. Today it serves more than 300,000 business customers, and its large-account tier (those spending over $100,000 a year) has reportedly grown almost sevenfold in the last twelve months. That growth is impressive, but the sheer size of the spend raises questions.
It's hard to tell whether the extra TPU capacity will translate into proportionally larger revenue or market share. The plan also depends on a steady supply of Google hardware and predictable pricing, neither of which is guaranteed. So while the ambition is clear, the financial and operational payoff of a gigawatt-scale TPU fleet remains an open question.
We'll need more data before we can say whether this gamble pays off.
Common Questions Answered
What is the specific timeline for Anthropic's plan to secure Google TPUs?
Anthropic aims to secure access to up to one million Google Tensor Processing Units by the year 2026. This timeline is a core part of their strategy for a major AI infrastructure expansion.
How much new computing capacity will Anthropic's investment in TPUs add?
The investment of tens of billions of dollars is projected to bring more than one gigawatt of new computing capacity online. This significant power increase is intended to support the scaling of their AI operations.
What growth has Anthropic reported in its large-account customer base?
Anthropic has reported that the number of large accounts, defined as those spending over $100,000 per year, has grown nearly sevenfold in the past year. This growth is part of the context for their massive infrastructure investment, as they currently serve over 300,000 business customers.