- OpenAI invests $500 billion in Stargate, funding massive AI data centers
- Each Stargate site receives a community plan tailored to local needs
- Cloud hosting and web hosting can benefit from predictable energy costs
OpenAI has unveiled a plan aimed at limiting the impact of its Stargate data centers on local electricity costs.
The new guidelines will see each site operate under a community plan developed with input from residents and regulators.
This approach includes financing new power and storage infrastructure directly or investing in energy generation and transmission resources as needed.
Electricity investments aim to ease the local energy load
The goal is to ensure that local utility bills do not increase due to the operation of these large data centers.
The Stargate initiative is a multi-year, $500 billion program to build AI data centers across the United States, supporting both AI training and inference workloads and handling some of the most demanding computational tasks in the industry.
OpenAI’s efforts mirror those of other tech companies, such as Microsoft, which recently announced measures to reduce water use and limit the impact of its own data centers on local electricity costs.
By financing energy infrastructure and working closely with local utilities, these companies aim to prevent additional financial burdens on surrounding communities.
Each Stargate site will have a customized plan that reflects the specific energy requirements of its location.
This could involve financing the installation of additional energy storage systems or expanding local production capacity.
OpenAI claims it will fully cover energy costs resulting from its operation, rather than passing them on to residents or businesses.
Cloud hosting and web hosting workloads running at these sites should benefit from predictable operating costs, while AI tools can run at scale without disrupting on-premises infrastructure.
Reports suggest that AI-powered data centers could nearly triple US electricity demand by 2035, putting pressure on regional power grids and pushing utility bills higher for consumers.
US lawmakers have criticized technology companies for relying on public utilities while residential customers and small businesses absorb the costs of grid upgrades.
Volatile demand from AI workloads, such as running large language models or other cloud-based AI services, further complicates energy planning.
Without proactive investment, electricity costs can rise sharply in regions that host multiple data centers.
OpenAI’s community plan also reflects the growing challenge of energy access for AI development.
Large-scale AI tools use far more power than typical cloud services or web hosting workloads, making infrastructure planning important.
By directly funding energy improvements and coordinating with local utilities, OpenAI aims to reduce risks to both the power grid and nearby communities.
Via Bloomberg