OpenAI’s $38B cloud deal with Amazon takes ChatGPT maker further beyond Microsoft

ChatGPT maker OpenAI, exercising newfound freedom under its renegotiated Microsoft partnership, will expand its cloud footprint for training and running AI models to Amazon’s infrastructure under a new seven-year, $38 billion agreement.
The deal, announced Monday, positions Amazon as a major infrastructure provider for Microsoft’s flagship AI partner, highlighting seemingly insatiable demand for computing power and increasingly complex alliances among big companies seeking to capitalize on AI.
It comes as Microsoft, Amazon, and other big tech companies attempt to reassure investors who’ve grown concerned about a possible bubble in AI spending and infrastructure investment.
Under its new Amazon deal, OpenAI is slated to begin running AI workloads on Amazon Web Services’ new EC2 UltraServers, which use hundreds of thousands of Nvidia GPUs. Amazon says the infrastructure will help to run ChatGPT and train future OpenAI models.
Amazon shares rose nearly 5% in early trading after the announcement.
“Scaling frontier AI requires massive, reliable compute,” said OpenAI CEO Sam Altman in the press release announcing the deal. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
Matt Garman, the AWS CEO, said in the release that Amazon’s cloud infrastructure will serve as “a backbone” for OpenAI’s ambitions.
In an interview with CNBC, Dave Brown, Amazon’s vice president of compute and machine learning services, said the new agreement represents “completely separate capacity” that AWS is building out for OpenAI. “Some of that capacity is already available, and OpenAI is making use of that,” Brown told CNBC.
Amazon has also been deepening its investment in AI infrastructure for Anthropic, the rival startup behind the Claude chatbot. Amazon has committed a total of $8 billion in investments to Anthropic and recently opened Project Rainier, an $11 billion data center complex for Anthropic’s workloads, running on hundreds of thousands of its custom Trainium 2 chips.
Microsoft has been expanding its own relationship with Anthropic, adding the startup’s Claude models to Microsoft 365 Copilot, GitHub Copilot, and its Azure AI Foundry platform.
Up to this point, OpenAI has relied almost exclusively on Microsoft Azure for the computing infrastructure behind its large language models. The new deal announced by Microsoft and OpenAI last week revised that relationship, giving OpenAI more flexibility to use other cloud providers — removing Microsoft’s right of first refusal on new OpenAI workloads.
At the same time, OpenAI committed to purchase an additional $250 billion in Microsoft services. Microsoft still holds specific IP rights to OpenAI’s models and products through 2032, including the exclusive ability among major cloud platforms to offer OpenAI’s technology through its Azure OpenAI Service.
OpenAI’s new $38 billion deal with Amazon builds on a relationship that began earlier this year, when Amazon added OpenAI’s first open-weight models in five years to its Bedrock and SageMaker services. Released under an open-source license, those models weren’t bound by OpenAI’s exclusive API agreement with Microsoft, letting Amazon offer them on its platforms.
The latest announcement is part of a series of deals by OpenAI in recent months with companies including Oracle and Google — committing hundreds of billions of dollars overall for AI computing capacity, and raising questions about the long-term economics of the AI boom.