U.S. Utilities Are Baffled by Phantom Data Centers
- US electric utilities are struggling to accurately forecast future power demand due to numerous speculative data center interconnection requests that may not materialize.
- The practice of AI-focused tech groups filing power requests with multiple utilities for a single potential data center project creates "phantom" demand, making accurate capacity planning difficult.
- Overestimating demand could lead to utilities overbuilding new capacity, potentially at the expense of American ratepayers who are already experiencing rising electricity prices.
America’s electric utilities are preparing for the surge in electricity demand that will come with the data centers powering AI. After two decades of flat U.S. electricity consumption, utilities have increased investments as they anticipate unprecedented demand growth in the coming years.
But they are grappling with heightened uncertainty, because not all the interconnection requests they receive will materialize into actual data centers needing electricity supply.
Phantom Data Centers
Hyperscalers and AI-focused tech groups are sounding out utilities in the areas they are considering for future data centers, often filing interconnection requests for a single data center with several utilities in several areas.
The huge number of requests does not paint an accurate or complete picture of the technology giants’ power needs, because companies tend to inquire about data center power supply with at least three utilities in different areas.
Of these three requests for new power capacity, typically only one will become a project for which agreements are signed. Analysts and utilities cannot reliably say how much new capacity is needed when a single data center project pitches electricity supply requests to different utilities in different states.
After one site is picked, the other proposed locations, and their interconnections, will never be built. These are the “phantom” data centers: projects that will never see the light of day but are currently haunting the projections and plans of U.S. utilities.
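To see how this duplication distorts the picture, consider a minimal back-of-the-envelope sketch in Python (the 100 GW queue total is hypothetical; the three filings per project and one-in-three materialization rate are the pattern described above):

```python
# Back-of-the-envelope sketch of "phantom" demand inflation.
# Assumptions: a hypothetical 100 GW of aggregate queued requests,
# and the article's pattern of roughly three filings per project,
# of which only one is ultimately built.

queued_gw = 100.0        # hypothetical combined interconnection queue, GW
filings_per_project = 3  # ~3 utilities sounded out per data center

# If every project duplicates its request across three utilities,
# the load that actually materializes is roughly a third of
# what the combined queues suggest.
likely_gw = queued_gw / filings_per_project
phantom_gw = queued_gw - likely_gw

print(f"Queued:  {queued_gw:.0f} GW")
print(f"Likely:  {likely_gw:.0f} GW")   # ~33 GW
print(f"Phantom: {phantom_gw:.0f} GW")  # ~67 GW
```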
As a result, electric utilities face a high degree of uncertainty over future revenues as the AI data center boom generates widely varying forecasts of peak demand in many areas across the country.
If utilities overestimate future demand, they risk overbuilding new capacity that will not be matched by actual consumption. Any overbuild would come at the expense of American ratepayers, who have already seen electricity prices rise faster than U.S. inflation over the past three years.
Puzzled Utilities
These phantom data centers and speculative projects are making demand projections difficult for utilities.
For example, Sempra’s Texas-based utility Oncor said its active large commercial and industrial (LC&I) interconnection queue as of June 30, 2025, was about 38% higher than at the same time last year. The queue held 552 requests, including approximately 186 gigawatts (GW) of load from data centers and over 19 GW from diverse industrial sectors.
American Electric Power Company, which serves over 5 million customers in 11 states, said it now has 24 GW of firm customer commitments for incremental load by the end of the decade, up from 21 GW previously, thanks to data center growth, reshoring, and manufacturing.
“Beyond the 24 gigawatts, customers are also actively seeking to connect approximately 190 gigawatts of additional load to our system. This is five times our current system size of 37 gigawatts,” AEP president and CEO William J. Fehrman said on the Q2 earnings call.
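The scale of that gap is easy to sanity-check from the figures AEP disclosed (a minimal Python sketch using only the numbers quoted above):

```python
# Sanity check on the AEP figures quoted above.
current_system_gw = 37.0  # AEP's current system size
firm_gw = 24.0            # firm incremental commitments by end of decade
requested_gw = 190.0      # additional load customers are seeking to connect

print(f"Requests vs. system size: {requested_gw / current_system_gw:.1f}x")       # ~5.1x
print(f"Firm commitments vs. additional requests: {firm_gw / requested_gw:.0%}")  # ~13%
```

Even if only a fraction of the 190 GW materializes, it dwarfs the firm commitments, which is precisely the planning problem utilities describe.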
U.S. power utilities are investing a record amount of money into transmission and grid connection. But current forecasts of AI-driven power demand vary so much that there is a massive margin of error, analysts and utility officials told Reuters Events in June.
The U.S. market faces “a moment of peak uncertainty,” according to Rebecca Carroll, Senior Director of Market Analytics at energy advisor Trio.
The latest report from the U.S. Department of Energy (DOE) puts data center consumption at anywhere between 6.7% and 12% of total U.S. electricity by 2028.
“The report estimates that data center load growth has tripled over the past decade and is projected to double or triple by 2028,” DOE said.
However, there is a huge difference between doubling and tripling data center load.
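A rough sizing makes that spread concrete (a sketch; the ~4,000 TWh figure for total annual U.S. electricity consumption is an assumption, not from the article):

```python
# Rough sizing of the DOE range quoted above.
total_us_twh = 4000.0  # assumed total U.S. consumption, TWh/year

low_twh = 0.067 * total_us_twh   # 6.7% share by 2028
high_twh = 0.120 * total_us_twh  # 12% share by 2028
spread_twh = high_twh - low_twh

# Convert the spread to average continuous load (8,760 hours/year).
spread_gw = spread_twh * 1000 / 8760

print(f"Low end:  {low_twh:.0f} TWh/yr")   # ~268 TWh
print(f"High end: {high_twh:.0f} TWh/yr")  # ~480 TWh
print(f"Spread:   {spread_twh:.0f} TWh/yr (~{spread_gw:.0f} GW of average load)")
```

On these assumptions, the gap between the low and high ends of the forecast is roughly 200 TWh a year, on the order of two dozen large power plants running around the clock.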
This uncertainty has prompted utilities to require firm load estimates from data centers before agreeing to future connections and power purchase agreements (PPAs), to reduce the risk of getting demand and/or prices wrong.
AI Drives U.S. Power Demand Growth
“We know not all of that is going to come online, but even a fraction of that is significant,” AEP’s chief financial officer, Trevor Mihalik, said on the earnings call.
U.S. power utilities have announced billions of dollars in capital plans for the next few years and are receiving a flood of requests from commercial users, most notably Big Tech, for new power capacity near planned data centers.
Onshoring of manufacturing activity and AI-related data centers are driving an increase in U.S. electricity consumption, Goldman Sachs said in a report earlier this year.
U.S. electric power demand is expected to rise by 2.4% per year through 2030, with AI-related demand accounting for about two-thirds of the country’s incremental power demand, the investment bank said.
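Compounded over the forecast window, that rate implies a sizable cumulative increase (a sketch; the five-year horizon from a 2025 baseline is an assumption):

```python
# Compounding the 2.4% annual growth figure cited above.
annual_growth = 0.024
years = 5  # assumed horizon: 2025 baseline through 2030

cumulative = (1 + annual_growth) ** years - 1
ai_share = 2 / 3  # AI's share of incremental demand, per the report

print(f"Cumulative demand growth by 2030: {cumulative:.1%}")            # ~12.6%
print(f"AI-related portion:               {cumulative * ai_share:.1%}") # ~8.4%
```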
The world’s biggest economy will need all energy sources to ensure power demand is met. Natural gas is the biggest near-term winner of AI advancements, but renewables will also play a key role in powering the data centers of next-generation computing, analysts say.
By Tsvetana Paraskova for Oilprice.com
AI Energy Demand Is Soaring but Not Because of Consumer Queries
- Nearly half of U.S. electricity demand growth by 2030 will come from AI-driven data centers, with consumers absorbing higher costs.
- AI companies provide little to no transparency on energy use or emissions, leaving regulators and consumers in the dark.
- While AI could eventually offset emissions through innovation, today it is fueling both rising utility bills and climate concerns.
Artificial intelligence (AI) is eating up more and more energy as large language models become increasingly complex and pervasive. In the United States, nearly half of all growth in electricity demand between now and 2030 will come from data centers, driven by the AI boom. But the problem isn’t your daily queries to ChatGPT – it’s indiscriminate AI integration in technologies and services that are far outside end users’ control. Yet it’s consumers who are footing the bill for soaring energy demand.
We don’t know exactly how much energy large language models are consuming, because AI companies aren’t required to disclose the information. As a result, the vast majority do not, and the sector is characterized by opacity when it comes to environmental impact. As of May, 84 percent of all large language model traffic ran on AI models with zero environmental disclosure. Many researchers are trying to calculate AI’s energy footprint, but it’s a difficult task, especially because models are changing all the time, becoming both more complex and more efficient.
“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she went on to say.
But while we don’t know exactly how much energy AI models use, we do know that it’s a lot. “AI’s integration into almost everything from customer service calls to algorithmic ‘bosses’ to warfare is fueling enormous demand,” reports the Washington Post. “Despite dramatic efficiency improvements, pouring those gains back into bigger, hungrier models powered by fossil fuels will create the energy monster we imagine.”
That being said, there are many things we as consumers do every day that contribute far more to global greenhouse gas emissions. A handful of AI queries per day is negligible compared to other common and under-scrutinized practices. If your lifestyle is anything close to the average American’s, watching TV and streaming videos likely consumes far more energy, and your work commute surely generates far more greenhouse gas emissions.
Put simply, the spike in energy demand from AI models is not consumers’ fault – but it is their problem. While tech companies are consuming more and more energy each year to power their AI ambitions, common consumers are footing the bill. Not only are consumers paying the literal price for AI expansion, but they will also have to bear the burden of the sector’s environmental impacts. Silicon Valley's backtracking on climate pledges, for example, will directly impact global communities, whether or not they ever benefit from AI.
"We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure," Maryland People's Counsel David Lapp recently told Business Insider. "Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”
On the other hand, AI is gaining efficiency all the time and will be instrumental to reshaping global industries, including the energy sector, to be greener. Large language models can help advance technological breakthroughs for significant emissions gains, with noted potential for innovations in batteries and solar power. The International Energy Agency reports that increased emissions from data centers could even eventually be offset if AI is used to lower emissions from other sectors.
We’re currently in the messy exploration stages of a global transformation, and the up-front costs will be – and already are – high. Training large language models is incredibly energy- and resource-intensive. But as AI advances, we will get much better at optimizing it, and it could be a net benefit – even in terms of emissions – further down the road. Until then, consumers will be paying the price.
By Haley Zaremba for Oilprice.com