- AI queries from around the world put pressure directly on the US electricity grid
- Utilities pass grid-upgrade costs on to households when data center projects stall
- Federal projections show data centers' share of US electricity consumption tripling by 2028
The accelerating demand for computing power has pushed artificial intelligence into the center of the US energy debate.
Data centers that support cloud services, streaming platforms, and online storage already consume large amounts of electricity, but the rise of AI tools has sharply expanded those needs.
According to federal projections, data centers' share of national electricity use could grow from 4% in 2023 to 12% by 2028.
AI's energy appetite intensifies demand
Since running an AI writing tool or hosting an LLM is far more energy-intensive than typical web activity, the growth curve is steep.
This expansion not only changes the relationship between technology companies and utilities; it also transforms how electricity costs are distributed across society.
Electricity prices in the United States have already risen more than 30% since 2020, and a Carnegie Mellon–North Carolina State study warns of an additional 8% nationwide increase by 2030.
In states like Virginia, the increase could reach 25%. Utilities argue that grid upgrades are necessary, but the question is who pays for them.
This is only the beginning: when a French user asks ChatGPT when the next strike is planned, Americans pay more for electricity.
How? When someone anywhere in the world asks ChatGPT an everyday question, the extra energy consumed by that query is absorbed into US net demand.
That is because ChatGPT runs on servers hosted in US data centers and powered by the US electricity grid.
If technology companies secure allocations of large capacity and then delay projects, households and small businesses can end up paying for unused infrastructure.
A case in Virginia, where a delayed facility left nearby customers covering millions in upgrade costs, underscores this risk.
To address such problems, American Electric Power in Ohio proposed a rate plan requiring data centers to pay for 85% of their requested capacity regardless of actual use.
State regulators approved the measure despite resistance from cloud service providers, who had offered a 75% minimum instead.
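To see how such a minimum-take rule shifts risk onto the customer, here is a minimal sketch. Only the 85% floor (approved) and the 75% floor (the providers' counteroffer) come from the plan above; the megawatt figures and the function itself are hypothetical illustrations.

```python
# Minimal sketch of a take-or-pay capacity charge. The 85% and 75% floors
# are from the AEP Ohio proposal; the megawatt figures are hypothetical.

def billed_capacity_mw(requested_mw: float, actual_mw: float, floor: float = 0.85) -> float:
    """Capacity the customer pays for: actual use, but never less than
    the contracted floor fraction of the requested capacity."""
    return max(floor * requested_mw, actual_mw)

# A facility that reserves 500 MW but draws only 200 MW still pays for:
print(billed_capacity_mw(500, 200))        # 425.0 MW under the 85% floor
print(billed_capacity_mw(500, 200, 0.75))  # 375.0 MW under the 75% counteroffer
```

Under this structure, the gap between the two floors (here 50 MW) is exactly the unused capacity whose cost would otherwise fall on other ratepayers.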
Some companies have tried to bypass traditional utilities by generating their own power.
Amazon, Microsoft, Google and Meta already operate solar installations, gas turbines and diesel backup generators, and some are planning nuclear facilities.
These companies not only produce electricity for their own operations, but also sell excess energy to wholesale markets, creating competition with traditional suppliers.
In recent years, such sales have generated billions, giving major cloud providers influence over both supply and price in certain regions.
The volatile consumption patterns of AI training, which can swing sharply between peaks and troughs, are another challenge.
Even a 10% shift in demand can destabilize grids and force operators to intervene with dummy workloads.
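To make the dummy-workload intervention concrete, here is a minimal sketch of load padding, under the assumption that a site must keep its draw above an agreed floor; the floor value and the load profile are hypothetical.

```python
# Minimal sketch of smoothing a training site's power draw with dummy work,
# assuming the grid tolerates only a bounded dip below peak load.
# The 90 MW floor and the load values are hypothetical.

def padded_draw_mw(training_mw: float, floor_mw: float) -> float:
    """Total site draw after adding just enough dummy load (e.g. synthetic
    GPU work) to keep the draw from falling below the agreed floor."""
    return training_mw + max(0.0, floor_mw - training_mw)

# A run that idles between training phases would swing from 100 MW to 20 MW,
# an 80% drop; padding to a 90 MW floor caps the visible swing at 10%.
for real_load_mw in (100.0, 20.0, 100.0):
    print(padded_draw_mw(real_load_mw, floor_mw=90.0))  # 100.0, 90.0, 100.0
```

The dummy load wastes energy by design; its only purpose is to make the site's draw predictable enough for the grid to handle.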
With households in some states already paying more every month, the concern is that consumers will end up covering the cost of keeping LLM hosting and AI writing systems online.
Via Tom's Hardware