- AI data centers overwhelm national grids and drive up energy costs
- Enterprises are turning to nuclear options to sustain power-hungry AI workloads
- OpenAI calls on the government to massively expand national energy production capacity
Microsoft CEO Satya Nadella has drawn attention to a less-discussed obstacle in the AI race — a shortage not of processors, but of power.
Speaking on a podcast with OpenAI CEO Sam Altman, Nadella said Microsoft has “a bunch of chips in stock that I can’t plug in.”
“The biggest problem we have now is not an abundance of computers, but it’s power — it’s kind of the ability to get builds done fast enough close to power,” Nadella added. “In fact, that’s my problem today. It’s not a supply problem of chips; it’s actually the fact that I don’t have warm shells to plug in.”
Energy constraints are reshaping the AI landscape
Nadella explained that while the supply of GPUs is currently sufficient, the lack of adequate facilities to run them has become a critical issue.
In this context, he described “warm shells” as empty data center buildings ready to house hardware but dependent on access to adequate energy infrastructure.
The explosive growth of AI tools has exposed a clear bottleneck: demand for computing capacity has outstripped the industry’s ability to construct and power new data center sites.
Energy planning has become a major issue across the tech industry, and even a company with Microsoft’s resources is struggling to keep up.
In response, some companies, including large cloud providers, are now exploring nuclear power to sustain their rapid expansion.
Nadella’s comments reflect broader concerns that AI infrastructure is stretching national power grids to their limits.
As data center construction accelerates across the U.S., power-intensive AI workloads are already starting to impact consumer electricity rates.
OpenAI has even called on the US government to commit to building 100 gigawatts of new electricity generation capacity annually.
It argues that energy security is becoming as important as semiconductor access in the competition with China.
Analysts have pointed out that Beijing’s head start in hydropower and nuclear development could give it an advantage in maintaining large-scale AI infrastructure.
Altman also hinted at a potential shift towards more capable consumer devices that could one day run high-end models such as GPT-5 or GPT-6 locally.
If advances in chips and power efficiency make such on-device systems viable, much of the projected demand for cloud-based AI processing could disappear.
This possibility poses a long-term risk to companies that invest heavily in massive data center networks.
Some experts believe such a shift could even hasten the eventual bursting of what they describe as an AI-driven economic bubble, which could threaten trillions of dollars in market capitalization if expectations were to collapse.
Via TomsHardware