Even for the biggest ever public company and the most valuable start-up in history, this week’s artificial intelligence data centres deal between Nvidia and OpenAI was a blockbuster.
Nvidia, valued at $4.3tn, pledged to make the tech industry’s largest private investment into OpenAI, spending up to $100bn to fund new computing power.
As the last remaining founder-chief executive of a major tech company from before the dotcom era, Nvidia’s Jensen Huang is leveraging his commanding position in Silicon Valley like never before to ensure the AI boom endures — and his chipmaker remains at its centre.
“[$100bn] is a huge number but we are talking about a company with a market value of nearly $4.5tn,” said Michael Cusumano, professor of technological innovation and entrepreneurship at MIT’s Sloan School of Management. “That’s also unprecedented.”
The deal follows a whirlwind of big moves by Nvidia, including a $5bn investment in its rival Intel last week.
Despite all the superlatives, Nvidia and OpenAI’s announcement left big uncertainties around the proposed $100bn investment.
It is unclear how quickly such huge facilities could be built and where the companies can source enough energy to run them.
OpenAI plans to lease chips from Nvidia as part of the deal, according to people with knowledge of the matter, but details of the arrangement have not been announced.
Nvidia’s decision to pump money into OpenAI to fund its need for the chipmaker’s hardware has raised concerns over the agreement’s circular structure.
Still, analysts concede Nvidia’s investment can be comfortably funded from the chipmaker’s rapidly growing cash flows — and if fully consummated the deal could drive hundreds of billions of dollars in revenue for the company.
It will also help fortify Nvidia’s position as an indispensable player in the infrastructure underpinning AI models such as OpenAI’s ChatGPT. Nvidia’s share price has surged roughly 1,000 per cent since the chatbot launched in late 2022.
However, OpenAI has recently moved to diversify its semiconductor supply chain, striking a deal with Broadcom to produce custom chips.
One senior Big Tech executive said the deal highlights Nvidia’s “reliance” on OpenAI, and Huang’s desire to “head off the threat of his biggest customer building its own chip with Broadcom”.
Nvidia pushed back against any such suggestion, saying its AI infrastructure provided “an unparalleled combination of performance, versatility and value, and is available to every AI lab, cloud and enterprise”.
The relationship between Nvidia and OpenAI dates back to 2016, when Huang delivered a device he has dubbed “the first AI supercomputer the world ever made” to the AI lab when it was barely a year old.
Nine years later, Huang negotiated this week’s deal directly with Sam Altman, OpenAI’s co-founder and chief executive, said a person familiar with the matter.
The two founders worked largely without formal advice from the bankers who would normally act as intermediaries in such deals, putting the finishing touches to the agreement last week in the UK during President Donald Trump’s state visit.
The data centres — Huang has called them “AI factories” — allow OpenAI to train its AI systems and produce answers, charging customers for the output.
Huang has said that for every 1 gigawatt of AI infrastructure deployed, as much as $50bn is spent on the computing hardware, including Nvidia’s specialised processors and its own networking technology, as well as the server racks that are produced by the likes of Foxconn, HP, Dell and Super Micro.
“These are gigantic factory investments,” Huang said at an event in Taiwan in May.
Nvidia’s OpenAI deal calls for “at least” 10GW of computing power to be built, over an unspecified period. The International Energy Agency estimates 10GW of AI data centres would consume as much energy in a year as 10mn typical US households.
OpenAI said the deal is separate from the extravagant plans for Stargate, its global infrastructure project with Japan’s SoftBank and US tech group Oracle, which includes a recent $300bn contract with Oracle.
Morgan Stanley has estimated deploying 10GW of AI computing power could cost as much as $600bn, of which $350bn “potentially” goes to Nvidia.
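As a rough sanity check on the figures cited above, the arithmetic can be sketched as follows (assumptions: Huang's "$50bn per gigawatt" is treated as a flat rate, and Morgan Stanley's $600bn estimate is taken at face value; real costs would vary):

```python
# Back-of-envelope check of the figures cited in the article.
COST_PER_GW_BN = 50        # Huang: as much as $50bn of hardware per 1GW deployed
DEAL_GW = 10               # "at least" 10GW in the Nvidia-OpenAI deal

hardware_cost_bn = COST_PER_GW_BN * DEAL_GW
print(f"Implied hardware spend: ${hardware_cost_bn}bn")   # $500bn

# Morgan Stanley's higher $600bn estimate covers the full build-out;
# of that, $350bn "potentially" goes to Nvidia.
nvidia_share = 350 / 600
print(f"Nvidia's potential share: {nvidia_share:.0%}")    # 58%
```

Huang's per-gigawatt figure thus implies about $500bn of hardware for the deal, broadly consistent with Morgan Stanley's $600bn estimate for the wider build-out.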

Morgan Stanley’s analysts wrote in a note to clients: “That’s a very large number so we are not viewing this level of investment as a certainty but part of the framing of the longer-term bull case [for Nvidia stock].”
Still, the investment will only fuel the “arms race” to develop advanced AI, the analysts added. “The scale and scope of OpenAI investment is starting to dwarf all peers, but the desire to build intelligence compute remains intense.”
OpenAI on Tuesday said it has struck agreements to develop five new US data centres, pushing the cost of Stargate to about $400bn.
“Our vision is simple: we want to create a factory that can produce a gigawatt of new AI infrastructure every week,” Altman said in a blog post on Tuesday.
OpenAI is competing with Google, Meta, Elon Musk’s xAI and Anthropic in the US, as well as with Chinese rivals including DeepSeek and Alibaba.
Their infrastructure arms race continues despite persistent warning signs that the industry’s vast capital outlay is far outpacing the revenue that AI is delivering.
A report by consultancy Bain, released just hours after the Nvidia-OpenAI deal was announced, estimates AI companies need to spend $500bn on capital investment each year to meet anticipated demand by 2030.
Funding that huge outlay sustainably would require $2tn in annual revenues, Bain projected, but the industry is on pace to miss that target by some $800bn.
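The scale of the gap Bain describes can be sketched in a short worked example (assumption: the $800bn shortfall is measured against the $2tn annual revenue requirement, as reported):

```python
# Back-of-envelope using Bain's projections as reported.
required_revenue_tn = 2.0   # annual revenue needed to sustainably fund the capex
shortfall_tn = 0.8          # projected gap versus that requirement

implied_revenue_tn = required_revenue_tn - shortfall_tn
print(f"Implied industry AI revenue by 2030: ${implied_revenue_tn:.1f}tn")  # $1.2tn

annual_capex_bn = 500       # Bain: capital investment needed each year by 2030
ratio = required_revenue_tn * 1000 / annual_capex_bn
print(f"Revenue needed per $1 of annual capex: ${ratio:.0f}")               # $4
```

In other words, on Bain's numbers the industry would need to earn roughly four dollars of revenue for every dollar of annual capital spending, but is on track to generate only about $1.2tn of the $2tn required.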
With such uncertainty over AI’s returns, OpenAI’s future has been clouded by questions over which backers it could persuade to fund its vast infrastructure projects.
Nvidia’s $100bn pledge goes some way to answering those questions, providing capital that OpenAI will receive in increments as its data centre construction progresses, and making it cheaper to finance the hundreds of billions more it needs to execute its plans.
Altman on Tuesday said he would “talk about how we are financing” OpenAI’s infrastructure ambitions “later this year”.
The deal marks a “new financing model . . . where we can pay over time, instead of buying them up front”, he added. “The chips and the systems are a humungous [percentage] of the cost and it’s hard to pay that upfront.”
With more than 700mn people using ChatGPT every week, OpenAI executives have said privately for months that they are already starved of the compute capacity they need to deliver such a complex product on a massive scale.
Altman is betting that “innovation is increasingly gated by access to infrastructure rather than ideas”, said Dimitri Zabelin, AI analyst at PitchBook, which tracks venture capital deals.
Huang’s move to ensure Nvidia is OpenAI’s “preferred strategic compute and networking partner”, as Monday’s announcement put it, will make it harder for AI developers to move away to rival processors.
Nvidia’s Cuda software platform, which has become the default way to write the AI software that runs on its chips, adds to the company’s grip on the industry.
The deal comes at a time when many of the chipmaker’s biggest customers — including Google, Meta, Amazon and Microsoft — are racing to develop their own custom processors as an alternative to Nvidia.
Cusumano likens Nvidia’s use of Cuda to extend its dominance to the way Microsoft and Apple gave away the tools needed to build apps for their Windows and iOS operating systems, allowing their platforms to dominate the personal computer and smartphone eras.
“The difference with Nvidia is it’s like combining Microsoft and Intel at their peak into one company,” he said. “It’s like a drug — software developers will use Nvidia’s tools and they have to use [Nvidia’s] hardware.”
Huang has continued a strategy that can be traced back directly to that first supercomputer delivery to a fledgling AI lab in 2016.
By keeping AI developers hooked on its product, Huang’s investment into OpenAI — as well as dozens of other start-ups involved in AI applications, cloud computing, robotics and healthcare — “will pay off multifold in the future for Nvidia”, Cusumano added.
Additional reporting by Michael Acton in San Francisco