
Amazon Just Put Another $25 Billion Into Anthropic. The Repayment Is Four Times Larger.

LDS Team · Let's Data Science
The initial $5 billion check landed Monday, April 20. Up to $20 billion more is tied to commercial milestones. Over the next ten years, Anthropic will spend over $100 billion on AWS and run Claude on up to 5 gigawatts of Amazon silicon.

On Monday morning, Amazon and Anthropic posted near-identical blog entries about ninety seconds apart.

Andy Jassy's was titled "Amazon and Anthropic expand strategic collaboration." Dario Amodei's was titled "Anthropic and Amazon expand collaboration for up to 5 gigawatts of new compute." Both disclosed the same numbers. Amazon would wire an additional $5 billion into Anthropic at a $380 billion valuation, with up to $20 billion more tied to unspecified "commercial milestones."

Anthropic would spend over $100 billion on Amazon Web Services over the next decade and run Claude on nearly 1 gigawatt of Trainium2 and Trainium3 capacity by the end of 2026, with up to 5 gigawatts of total capacity secured.

By the next morning's premarket session, Amazon's stock was up 2.8%, pushing shares past $255. Bank of America analysts told clients that the initial Anthropic tranche alone could add about $1.3 billion in AWS revenue in the first quarter.

The Deal Runs Both Ways

The headline number is $25 billion going into Anthropic. The number that matters more is the $100 billion going back out.

Amazon's prior investment in Anthropic stood at $8 billion, spread across three earlier rounds. The new commitment, if fully triggered, brings the cumulative total to roughly $33 billion and makes Amazon one of the largest non-founder shareholders in a foundation model lab.

At Anthropic's latest valuation of $380 billion, the initial $5 billion buys about 1.3% of the company.
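The stake math is a one-liner. The sketch below assumes the $5 billion tranche prices in at the reported $380 billion post-money valuation; the actual round structure was not disclosed.

```python
# Back-of-envelope check on the implied stake, assuming the full $5B
# tranche converts to equity at the reported $380B valuation.
investment = 5e9    # initial cash tranche, USD
valuation = 380e9   # reported post-money valuation, USD

stake = investment / valuation
print(f"Implied stake: {stake:.1%}")  # → Implied stake: 1.3%
```

If the milestone-triggered $20 billion lands at the same price, the total stake would sit closer to 6.6%, though later rounds would likely reprice it.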

The counter-commitment is four times larger. Over ten years, Anthropic has agreed to spend more than $100 billion on AWS technology, including Trainium2, Trainium3, and the still-unreleased Trainium4 chips, along with Graviton CPUs. Anthropic currently operates over one million Trainium2 chips to train and serve Claude, according to its own announcement.

The structure is familiar. Amazon did something similar in the November 2024 Anthropic round: write the check, take the equity, lock the customer into a multi-year AWS obligation that dwarfs the original investment. The difference this time is scale. The $100 billion compute floor alone is larger than any previous cloud commitment disclosed between a hyperscaler and an AI lab.

For Amazon's first quarter call, the optics are unambiguous. AWS growth decelerated to the low 20s through 2025 as customers complained they could not get enough compute. Anthropic's revenue run-rate now sits above $30 billion, up from roughly $9 billion at the end of 2025, per Anthropic's own disclosure. The compute spend that Anthropic has locked in will show up on AWS's income statement at a scale Wall Street can chart.

What Anthropic Is Actually Buying

The phrase "up to 5 gigawatts" is the part practitioners should read twice.

One gigawatt of AI compute is roughly what a small city draws from the grid. It translates to hundreds of thousands of accelerators running continuously. Anthropic's existing footprint with Amazon, a project Amazon code-named Rainier, contains nearly half a million Trainium2 chips distributed across multiple data centers. The new commitment adds roughly another half gigawatt of Trainium2 and Trainium3 by year-end, with the remainder rolling out through the early 2030s as Trainium3 and Trainium4 ramp.
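The gigawatts-to-chips translation is rough but easy to sanity-check. The per-accelerator draw below is an illustrative assumption (chip plus host, networking, and cooling overhead), not a published Trainium2 figure.

```python
# Rough translation of "1 gigawatt" into accelerator counts.
# The all-in draw per accelerator slot is an assumed round number,
# not a published Trainium2 specification.
facility_power_w = 1e9          # 1 GW of data-center capacity
watts_per_accelerator = 2_000   # assumed all-in draw per slot, W

accelerators = facility_power_w / watts_per_accelerator
print(f"~{accelerators:,.0f} accelerators per gigawatt")  # → ~500,000
```

At that assumed draw, one gigawatt hosts roughly half a million accelerators, which lines up with Project Rainier's reported chip count, and 5 gigawatts implies an installation in the low millions.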

Trainium is Amazon's answer to Nvidia's GPU business. The chips are designed in-house at Annapurna Labs, the Israeli silicon firm Amazon acquired in 2015. They compete directly with Nvidia's Hopper and Blackwell families on a cost-per-token basis. Anthropic engineers have spent the last two years rewriting Claude's training stack to target Trainium, an effort that began under a smaller 2024 partnership and has now become the default path for every new Claude model.

The partnership also makes Anthropic the first foundation model lab to run at scale on all three major public clouds. Claude is live on AWS (Bedrock and the new Claude Platform on AWS), Google Cloud (Vertex AI), and Microsoft Azure, which Claude reached earlier this year. Over 100,000 customers are running Claude on Bedrock alone.

| Commitment | Figure | Source |
| --- | --- | --- |
| Amazon initial cash investment | $5 billion | Amazon blog, April 20 |
| Amazon milestone-triggered investment | Up to $20 billion | Amazon blog, April 20 |
| Amazon prior investment (cumulative) | $8 billion | Anthropic blog, April 20 |
| Anthropic AWS spend commitment | Over $100 billion, 10 years | Amazon + Anthropic blogs |
| Capacity secured | Up to 5 gigawatts | Anthropic blog |
| Trainium2 chips in use today | Over 1 million | Anthropic blog |
| Chips in Project Rainier | Nearly 500,000 Trainium2 | Amazon blog |
| Anthropic valuation | $380 billion | CNBC, April 20 |
| Anthropic run-rate revenue | Over $30 billion | Anthropic blog |

How It Unfolded

SEP 2023
First Amazon investment in Anthropic
Amazon invests $1.25 billion, with a commitment of up to $4 billion. AWS becomes Anthropic's primary cloud provider.
MAR 2024
Amazon completes the $4 billion commitment
An additional $2.75 billion closes out the first tranche. Claude goes live on AWS Bedrock. Trainium becomes a joint engineering project.
NOV 2024
Second $4 billion tranche
Amazon's total investment reaches $8 billion. Project Rainier is announced.
OCT 2025
Project Rainier comes online
Nearly half a million Trainium2 chips go live across AWS regions.
APR 16, 2026
Anthropic ships Claude Opus 4.7
Trained primarily on Trainium hardware. Anthropic discloses run-rate revenue above $30 billion.
APR 20, 2026
Expanded deal announced
$5 billion now, up to $20 billion more. Anthropic commits over $100 billion back to AWS. 5 gigawatts of capacity secured.
APR 21, 2026
Amazon stock up 2.8% premarket
Shares touch $255.26. Bank of America flags a $1.3 billion Q1 AWS revenue lift.

The Quotes That Landed Together

Andy Jassy, in Amazon's corporate announcement:

"Our custom AI silicon offers high performance at significantly lower cost for customers, which is why it's in such hot demand." — Andy Jassy, CEO of Amazon (About Amazon, April 20, 2026)

In Anthropic's own post, Jassy added a ten-year framing:

"Anthropic's commitment to run its large language models on AWS Trainium for the next decade reflects the progress we've made together on custom silicon." — Andy Jassy, CEO of Amazon (Anthropic News, April 20, 2026)

Dario Amodei did not use the word "partnership." He used the word "infrastructure":

"Our users tell us Claude is increasingly essential to how they work, and we need to build the infrastructure to keep pace." — Dario Amodei, CEO and Co-founder of Anthropic (About Amazon, April 20, 2026)

That choice of word matters. For the past year, every frontier lab has explained its capital needs the same way: the model is the product, compute is the raw material, and there is not enough raw material. Amodei's sentence fits the pattern. Compute is the constraint. Amazon is now contractually on the hook to supply it.

The Other Side

Not everyone reads a $125 billion mutual commitment as a vote of confidence. Some analysts read it as forced dependence.

Anthropic's prior investors include Google, which has put in roughly $3 billion of its own across multiple rounds, along with a long list of sovereign wealth funds and growth-stage venture firms. Adding Amazon at the size now on the table changes the shareholder register.

The filings that will eventually surface ahead of Anthropic's long-rumored $60 billion IPO in October 2026 will have to explain what "commercial milestones" means and whether any of them can be reached without AWS's cooperation.

There is also a structural concern about chip concentration. Trainium is impressive on paper, but it has a small installed base compared with Nvidia's data center footprint, where revenue hit $51.2 billion last quarter alone. If Trainium3 or Trainium4 ships late, or if the chips underperform at the inference workloads that dominate Claude's usage, Anthropic has locked itself into a compute floor it cannot easily walk back from.

Stifel's equity analysts wrote on April 20 that Amazon "remains one of our top picks," citing the "elevated" sentiment heading into first quarter earnings. The Stifel note did not push back on the deal's size. Bank of America's was similar in tone. TipRanks tracked the consensus at "Strong Buy" with 42 of 45 covering analysts recommending the stock and none recommending a sell. That is overwhelmingly positive sell-side coverage for a deal of this scale.

The skeptical reads live further from Wall Street. CNBC columnist Deirdre Bosa, who has been tracking the "circular financing" pattern across AI infrastructure deals, has written repeatedly that these arrangements inflate reported revenues at hyperscalers while the underlying cash moves in circles between investor and customer. The Anthropic-Amazon deal is the largest example of that pattern to date.

What This Means for Practitioners

For ML engineers and AI developers, the practical takeaway is about chip availability and pricing.

Claude has been one of the hardest frontier models to get reliable inference on. Rate limits, long-context slowdowns, and capacity errors have been routine complaints on developer forums throughout 2025 and early 2026. The 5-gigawatt commitment, paired with near-term expansions of Trainium2 and Trainium3 capacity, is the first concrete answer to those complaints.

The second shift is platform exposure. Claude is now natively on AWS as a first-class service, not just through Bedrock but through a newly announced Claude Platform on AWS that removes the need for separate Anthropic credentials. For teams already building on AWS, this collapses a procurement step. For teams on Azure or Google Cloud, it raises the question of whether they are running on the best-tuned Anthropic stack.
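For teams weighing that procurement change, the request shape is the main thing that moves: Bedrock's Converse-style API wraps the prompt in AWS's own message format rather than Anthropic's. A minimal sketch, with the model ID as a placeholder assumption (check the Bedrock model catalog for real identifiers):

```python
# Sketch of a Bedrock Converse-style request for Claude, for teams moving
# from direct Anthropic credentials to AWS-native access.

def build_converse_request(prompt: str, model_id: str, max_tokens: int = 512) -> dict:
    """Assemble keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

request = build_converse_request(
    "Summarize our Q1 AWS spend drivers.",
    model_id="anthropic.claude-example-v1",  # placeholder, not a real ID
)

# With AWS credentials configured, the call itself would look like:
#   import boto3
#   client = boto3.client("bedrock-runtime")
#   response = client.converse(**request)
#   print(response["output"]["message"]["content"][0]["text"])
print(request["modelId"])
```

Billing, IAM, and quota management then ride on the AWS account, which is exactly the procurement step the Claude Platform on AWS is meant to collapse.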

LDS covered the backstory to this expansion earlier this month in "Anthropic Tripled Its Revenue in 90 Days. Then It Signed for 3.5 Gigawatts." and "Anthropic Shipped Claude Opus 4.7. It Beats GPT-5.4 on Code and Trails Its Own Unreleased Model." The pattern is consistent. Anthropic's revenue is compounding faster than its compute, and every new announcement closes some of the gap.

It also has a competitive effect. Amazon's parallel OpenAI cloud deal, announced earlier this year, put OpenAI onto AWS infrastructure at a similar commitment level. Between the two labs, Amazon is now a financial backer of the two leading frontier AI efforts, with equity in one and a large supply contract with the other. For developers picking between Claude and GPT, the underlying silicon is increasingly the same silicon.

The Bottom Line

Amazon did not invest $25 billion in Anthropic. Amazon lent $25 billion worth of equity credit against $100 billion in future purchase orders.

Anthropic did not agree to spend a hundred billion on AWS. Anthropic agreed that without that hundred billion worth of AWS, its product cannot exist at the size it is already selling.

The deal is less a milestone than a floor. Whatever happens to the AI race from here, the compute bill through 2036 is now a fixed cost for one of the two labs most likely to reach general intelligence first, and the ledger for that cost sits at Amazon.

The Takeaway

Amazon has made a $33 billion bet that Anthropic remains the second-most-important AI lab in the world. Anthropic has made a $100 billion bet that Trainium chips can carry Claude for the next ten years. If either side is wrong, the other side's AI strategy is in trouble. If both are right, the rest of the industry just watched the compute war get two more contenders.
