AI adoption in the US adds ~900,000 tonnes of CO₂ annually, equal to 0.02% of national emissions
A new study published in Environmental Research Letters finds that continued growth in artificial intelligence (AI) use across the United States could add approximately 900,000 tonnes of CO₂ annually. That is substantial in absolute terms, but it amounts to a relatively minor increase when viewed against nationwide emissions.
While AI adoption is expected to boost productivity and economic output, researchers note that its environmental footprint can be seen as relatively modest compared to other industrial activities. The study examined potential AI integration across various sectors, estimating the associated rise in energy use and carbon emissions.
Key findings include:
- AI adoption across the U.S. economy may result in an additional 896,000 tonnes of CO₂ emissions per year, which represents just 0.02% of total U.S. emissions.
- Energy use in individual industries could increase by up to 12 petajoules annually, comparable to the electricity consumption of around 300,000 U.S. homes.
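The homes comparison in the second finding can be sanity-checked with quick arithmetic. A minimal sketch, assuming an average U.S. household uses roughly 10,500 kWh of electricity per year (an assumed figure, in line with typical EIA estimates):

```python
# Back-of-envelope check of the "12 petajoules ≈ 300,000 homes" comparison.
# The 10,500 kWh/year household figure is an assumption, not from the study.
PJ_TO_KWH = 1e15 / 3.6e6           # joules per PJ divided by joules per kWh
annual_energy_kwh = 12 * PJ_TO_KWH # 12 PJ expressed in kWh
homes = annual_energy_kwh / 10_500 # households at the assumed usage
print(round(homes))                # ≈ 317,000 homes
```

Twelve petajoules works out to about 3.3 billion kWh, or roughly 317,000 homes at that assumed usage, consistent with the article's figure of around 300,000.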
Co-author Anthony R. Harding explains: “While the projected emissions from AI adoption are modest compared to other sectors, they still represent a meaningful increase. This underscores the importance of integrating energy efficiency and sustainability into AI development and deployment, especially as adoption accelerates across industries.”
As AI technologies become more integrated into daily operations, researchers encourage industry leaders to incorporate energy efficiency and sustainability into their AI strategies to ensure responsible growth as adoption scales.
ENDS
About IOPP
IOP Publishing is a society-owned scientific publisher, delivering impact, recognition and value to the scientific community. Its purpose is to expand the world of physics, offering a portfolio of journals, ebooks, conference proceedings and science news resources globally. IOPP is a member of Purpose-Led Publishing, a coalition of society publishers who pledge to put purpose above profit.
As a wholly owned subsidiary of the Institute of Physics, a not-for-profit society, IOP Publishing supports the Institute’s work to inspire people to develop their knowledge, understanding and enjoyment of physics. Visit ioppublishing.org to learn more.
Journal
Environmental Research Letters
Method of Research
Observational study
Subject of Research
Not applicable
Article Title
Watts and Bots: The Energy Implications of AI Adoption
Article Publication Date
11-Nov-2025
Goldman Sachs Says AI Boom Will Supercharge Energy Demand
- Goldman Sachs and its clients do not believe the current AI boom constitutes a speculative bubble comparable to the dot-com crash, citing real investment and valuation metrics that differ from 2000.
- The proliferation of AI data centers is projected to significantly increase U.S. electricity consumption, potentially consuming up to 9% of total U.S. generation by 2030, leading to sharply rising residential power prices.
- The article identifies several companies positioned to benefit from the AI-driven power infrastructure build-out, including Vertiv Holdings, Cameco, GE Vernova, Quanta Services, and Eaton Corporation.
Lately, big tech cheerleaders, including Meta Platforms’ (NASDAQ:META) Mark Zuckerberg, Amazon’s (NASDAQ:AMZN) Jeff Bezos and OpenAI’s CEO Sam Altman, have warned we are in an AI bubble, with stock market valuations, heavy speculation and market exuberance beginning to mirror those of the dot-com era that eventually led to the famous stock market crash 25 years ago. Meta has announced plans to spend over $100 billion on AI and infrastructure in the coming three years; Apple plans to invest more than $500 billion in the U.S. over the next four years, including AI and data-center development, while Amazon is devoting roughly $70 billion to data centers and related infrastructure this year. As the New York Times recently noted, a handful of AI companies have accounted for most of the U.S. market’s 2025 gains. Some Wall Street traders are betting that AI valuations have overshot reality: Michael Burry previously revealed a $1.1-billion short position against Nvidia (NASDAQ:NVDA) and Palantir (NASDAQ:PLTR), reflecting his view that AI stock valuations were becoming stretched.
On the other hand, another section of Wall Street remains as bullish as ever on AI. To wit, Goldman Sachs and its wealthy clients consider AI one of the biggest investment opportunities of the current century. “We did have a conversation about markets and whether or not we think we’re in a bubble,” Brittany Boals Moeller, region head of Goldman Sachs’ San Francisco PWM division, told Fortune. “We do not think we’re in a bubble, and we pay very close attention to that. Will there be some winners and losers from AI? Absolutely. There will definitely be some places where valuations are overblown, and time will tell where those spaces are. So it’s smart for clients to be diligent about how they’re investing in AI.”
To be fair, there’s real money backing the AI boom as opposed to the Dot-com boom. Nvidia is currently valued at about 33 times forward earnings, a fraction of Cisco’s (NASDAQ:CSCO) PE ratio of 200 prior to the crash. Interestingly, CSCO still trades roughly 10% below its 2000 valuation peak. A recent McKinsey study found that AI use in the corporate world could drive $4.4 trillion in productivity growth.
Goldman Sachs and its clients are closely watching AI’s implications for energy use, healthcare, and personal productivity. Previously, we reported that the proliferation of AI data centers is a major factor behind rising U.S. energy bills. After years of relatively stable prices, U.S. electricity costs have climbed sharply, amplifying financial stress for millions of consumers already coping with inflation. U.S. electricity prices have surged 36% since 2021, averaging annual increases of about 7%, triple the cumulative 12% rise seen over the entire 2009 to 2020 period. Residential power prices are expected to continue climbing, with the EIA projecting an average of 17.7 cents per kilowatt-hour by 2026, up from 16 cents in 2024.
According to the Electric Power Research Institute, data centers could consume up to 9% of total U.S. electricity generation by 2030, up from around 1.5% today, driven by the rapid adoption of power-intensive technologies such as generative AI.
America’s reliance on gas-fired generation has made it more exposed to rising fuel costs. Natural gas provides about 40% of total U.S. electricity output. Gas prices have been climbing, with Henry Hub futures up nearly 60% over the past year to $4.33 per million British thermal units (MMBtu). The EIA projects that Henry Hub spot prices will average $4.90/MMBtu in 2026, up from $4.00/MMBtu in 2025, as robust LNG export demand combines with modest domestic production growth. U.S. LNG export capacity is expected to grow roughly 75% by 2030, from about 17 billion cubic feet per day (Bcf/d) now to 30 Bcf/d from sanctioned projects alone.
Previously, we highlighted several companies positioned to benefit from the AI-driven power build-out, including Vertiv Holdings Co. (NASDAQ:VRT), Cameco Corporation (NYSE:CCJ), GE Vernova (NYSE:GEV), Quanta Services Inc. (NYSE:PWR), and Eaton Corporation (NYSE:ETN), as noted here. VRT is up 64.7% year-to-date, CCJ has gained 83.5%, GEV has returned 79.1%, PWR is up 41.1%, while ETN has added 13.4%. Meanwhile, the AI boom continues to lift the broader market, with the S&P 500 up 15.6% year-to-date.
By Alex Kimani for Oilprice.com
Who Will End Up Paying for the AI Spending Spree?
- Despite denials from Washington and AI leaders, industry executives are already discussing government “backstops” and indirect support.
- OpenAI faces massive spending commitments far beyond its revenues, raising doubts about long-term financial viability.
- Subsidized data centers and rising energy costs reveal how public resources are already propping up the AI boom - and hint at a broader bailout to come.
There's an old adage in Washington: Don't believe anything until it is officially denied. Now that the Trump administration's so-called artificial intelligence (AI) czar, David Sacks, has gone on record stating that "[t]here will be no federal bailout for AI," we can begin speculating about what form that bailout might take.
It turns out that the chief financial officer of AI behemoth OpenAI has already put forth an idea regarding the form of such a bailout. Sarah Friar told The Wall Street Journal in a recorded interview that the industry would need federal guarantees in order to make the necessary investments to ensure American leadership in AI development and deployment. Friar later "clarified" her comments in a LinkedIn post after the pushback from Sacks, saying that she had "muddied" her point by using the word "backstop" and that she really meant that AI leadership will require "government playing their part." That sounds like the government should still do more or less what she said in the Wall Street Journal interview.
Now, maybe you are wondering why the hottest industry on the planet, flush with hundreds of billions of dollars from investors, needs a federal bailout. It's revealing that AI expert and commentator Gary Marcus predicted 10 months ago that the AI industry would go seeking a government bailout to make up for overspending, bad business decisions, and huge future commitments that the industry is unlikely to be able to meet. For example, in a recent podcast hosted by an outside investor in OpenAI, the company's CEO, Sam Altman, got tetchy when asked how a company with only $13 billion in annual revenues that is running losses will somehow fulfill $1.4 trillion in spending commitments over the next few years. Altman did NOT actually answer the question.
So what possible justification could the AI industry dream up for government subsidies, loan guarantees or other handouts? For years, one of the best ways to get Washington's attention has been to say the equivalent of "China bad. Must beat China." So that's what Altman is telling reporters. But that doesn't explain why OpenAI, rather than other companies, should be the target of federal largesse. In what appears to be damage control, Altman wrote on his X account that OpenAI is not asking for direct federal assistance, then later outlined how the government could give it indirect assistance by building a lot of data centers of its own (which could then presumably be leased to the AI industry so the industry doesn't have to make the investment itself).
Maybe I'm wrong, and what we are seeing is NOT the preliminary jockeying by the AI industry and the U.S. government regarding what sort of subsidy or bailout will be provided to the industry. Lest you think that the industry has so far moved forward without government handouts, the AP noted that subsidies are offered by more than 30 state governments to attract data centers. Not everyone is happy with having data centers in their communities. And, those data centers have also sent electricity rates skyward as consumers and data centers compete for electricity and utilities seek additional funds to build the capacity necessary to power those data centers. Effectively, current electricity customers are subsidizing the AI data center build-out by paying for new generating capacity and lines to feed energy to those data centers.
The larger problem with AI is that it appears to have several limitations in its current form that will prevent it from taking over much of the work already done by humans and preclude it from being incorporated into critical systems (because it makes too many mistakes). All the grandiose claims made by AI boosters are dispatched with actual facts in this very long piece by AI critic Ed Zitron.
I am increasingly thinking of AI as a boondoggle. A boondoggle, according to Dictionary.com, is "a wasteful and worthless project undertaken for political, corporate, or personal gain." So far, the AI industry mostly fits this definition. But there is a more expansive definition which I borrow from Dmitri Orlov, author of Reinventing Collapse: A contemporary boondoggle must not only be wasteful, it should, if possible, also create additional problems that can only be addressed by yet more boondoggles—such as the need for vast new electric generation capacity that will be unnecessary if AI turns out to be far less useful than advertised. AI boosters say that AI is going to have a big impact on society. I couldn't agree more, except not quite in the way these boosters think.
By Kurt Cobb via Resource Insights
The AI revolution has a power problem
By AFP
November 9, 2025
Benjamin LEGENDRE
In the race for AI dominance, American tech giants have the money and the chips, but their ambitions have hit a new obstacle: electric power.
“The biggest issue we are now having is not a compute glut, but it’s the power and…the ability to get the builds done fast enough close to power,” Microsoft CEO Satya Nadella acknowledged on a recent podcast with OpenAI chief Sam Altman.
“So if you can’t do that, you may actually have a bunch of chips sitting in inventory that I can’t plug in,” Nadella added.
Echoing the 1990s dotcom frenzy to build internet infrastructure, today’s tech giants are spending unprecedented sums to construct the silicon backbone of the revolution in artificial intelligence.
Google, Microsoft, AWS (Amazon), and Meta (Facebook) are drawing on their massive cash reserves to spend roughly $400 billion in 2025 and even more in 2026 — backed for now by enthusiastic investors.

Image caption: Easy access to electricity is posing a big challenge to the race for AI dominance, says Microsoft Chairman and CEO Satya Nadella (Copyright AFP/File Jason Redmond)
All this cash has helped alleviate one initial bottleneck: acquiring the millions of chips needed for the computing power race, and the tech giants are accelerating their in-house processor production as they seek to catch up with global leader Nvidia.
These will go into the racks that fill the massive data centers — which also consume enormous amounts of water for cooling.
Building the massive information warehouses takes an average of two years in the United States; bringing new high-voltage power lines into service takes five to 10 years.
– Energy wall –
The “hyperscalers,” as major tech companies are called in Silicon Valley, saw the energy wall coming.
A year ago, Virginia’s main utility provider, Dominion Energy, already had a data-center order book of 40 gigawatts — equivalent to the output of 40 nuclear reactors.
The capacity it must deploy in Virginia, the world’s largest cloud computing hub, has since risen to 47 gigawatts, the company announced recently.
Already blamed for inflating household electricity bills, data centers in the United States could account for 7 percent to 12 percent of national consumption by 2030, up from 4 percent today, according to various studies.
But some experts say the projections could be overblown.
“Both the utilities and the tech companies have an incentive to embrace the rapid growth forecast for electricity use,” Jonathan Koomey, a renowned expert from UC Berkeley, warned in September.
As with the late 1990s internet bubble, “many data centers that are talked about and proposed and in some cases even announced will never get built.”
– Emergency coal –
If the projected growth does materialize, it could create a 45-gigawatt shortage by 2028 — equivalent to the consumption of 33 million American households, according to Morgan Stanley.
Several US utilities have already delayed the closure of coal plants, despite coal being the most climate-polluting energy source.
And natural gas, which powers 40 percent of data centers worldwide, according to the International Energy Agency, is experiencing renewed favor because it can be deployed quickly.
In the US state of Georgia, where data centers are multiplying, one utility has requested authorization to install 10 gigawatts of gas-powered generators.
Some providers, as well as Elon Musk’s startup xAI, have rushed to purchase used turbines from abroad to add capacity quickly. Even recycling aircraft turbines, an old niche solution, is gaining traction.
“The real existential threat right now is not a degree of climate change. It’s the fact that we could lose the AI arms race if we don’t have enough power,” Interior Secretary Doug Burgum argued in October.
– Nuclear, solar, and space? –
Tech giants are quietly downplaying their climate commitments. Google, for example, promised net-zero carbon emissions by 2030 but removed that pledge from its website in June.
Instead, companies are promoting long-term projects.
Amazon is championing a nuclear revival through Small Modular Reactors (SMRs), an as-yet experimental technology that would be easier to build than conventional reactors.
Google plans to restart a reactor in Iowa in 2029. And the Trump administration announced in late October an $80 billion investment to begin construction on ten conventional reactors by 2030.
Hyperscalers are also investing heavily in solar power and battery storage, particularly in California and Texas.
The Texas grid operator plans to add approximately 100 gigawatts of capacity by 2030 from these technologies alone.
Finally, both Elon Musk, through his Starlink program, and Google have proposed putting chips in orbit in space, powered by solar energy. Google plans to conduct tests in 2027.
Lehigh University–Siemens partnership advances microgrid research for AI data centers
PhD student Saskia Putri explores real-time monitoring and control tools to help data centers manage growing power demands from artificial intelligence
Lehigh University
Image: Lehigh University PhD student Saskia Putri (civil and environmental engineering), Farrah Moazeni, assistant professor of civil and environmental engineering, and Javad Khazaei, assistant professor of electrical and computer engineering, P.C. Rossin College of Engineering and Applied Science. The team is collaborating with Siemens to advance microgrid technology for data centers. (Credit: Lehigh University)
Hyperscale data centers are behemoths of computing power. Spanning over a million square feet in a single facility, they house thousands of servers that process, store, and analyze the massive datasets that drive artificial intelligence.
Not surprisingly, such infrastructure also requires an enormous amount of power.
“Data centers have distinctive load profiles,” says Saskia Putri, a fourth-year Lehigh University PhD student advised by Farrah Moazeni, an assistant professor of civil and environmental engineering in the P.C. Rossin College of Engineering and Applied Science. “When they’re training AI models, their power requirements can jump suddenly and significantly. That kind of rapid change can create instability in the electrical grid.”
To tackle this problem, Siemens sponsored a one-year research project led by Farrah Moazeni and co-led by Javad Khazaei, an assistant professor of electrical and computer engineering. As part of this agreement, Putri was awarded a year-long fellowship to work with researchers at Siemens on advancing microgrid technology for more resilient and reliable energy systems. The agreement stems from a new research collaboration between Siemens—a global leader in automation and energy systems—and Lehigh’s Center for Advancing Community Electrification Solutions (ACES).
Under the Siemens-Lehigh research agreement, one of Putri’s tasks is to develop a data center microgrid—a localized power network capable of operating independently of the main grid. Her work will focus on modeling, simulating, and embedding real-time control systems to enhance the microgrid’s stability and reliability in powering energy-intensive operations such as AI training.
“In addition to real-time control, we also aim to build a real-time monitoring tool that lets users know when a disturbance or maintenance issue is likely to occur,” she says. “The idea is to design a system that is reliable and efficient and can predict and respond to changes instantly, ensuring rapid voltage stabilization under dynamic conditions.”
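As a loose illustration of the kind of real-time control Putri describes (not the actual Lehigh/Siemens model), the sketch below simulates a single microgrid bus whose voltage sags when an AI training job suddenly raises the load, and a PI controller that ramps up local power injection to restore it. The quasi-static droop model, the gains, and every parameter here are illustrative assumptions:

```python
# Toy sketch of PI voltage regulation on one microgrid bus. All values are
# in per unit (p.u.) and chosen for illustration only; this is not the
# Lehigh/Siemens model described in the article.

def simulate(steps=2000, dt=0.001):
    v_ref = 1.0          # reference bus voltage (p.u.)
    load = 0.5           # load power (p.u.)
    inj = 0.5            # controlled power injection (p.u.)
    kp, ki = 50.0, 400.0 # PI gains (assumed)
    droop = 0.1          # assumed voltage sag per p.u. of power deficit
    integ = 0.0          # integral of the voltage error
    history = []
    for k in range(steps):
        if k == steps // 2:
            load += 0.3  # AI training job starts: sudden load jump
        # quasi-static model: voltage sags with the power deficit
        v = v_ref - droop * (load - inj)
        err = v_ref - v
        integ += err * dt
        inj += (kp * err + ki * integ) * dt  # PI update of the injection
        history.append(v)
    return history

hist = simulate()
```

In this toy model the voltage dips to about 0.97 p.u. at the load step and is regulated back toward 1.0 p.u. within roughly a second; an actual study would use detailed dynamic models, hardware constraints, and validated operational data, as the article describes.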
Throughout the fellowship, Putri will make several visits to Siemens’ research and development department to refine her designs. She will also use resources from Moazeni’s Interconnected Critical Infrastructure Systems Engineering (CONCISE) Laboratory and the Integrated, Resilient, and Intelligent Energy Systems (INTEGRITY) Laboratory, led by Javad Khazaei. Lehigh’s own data center will serve as a testing ground for Putri’s models, allowing her to validate simulations using real operational data.
The collaboration underscores the value of mutually beneficial university-industry partnerships, a theme highlighted during a recent national workshop on industry-academia collaboration hosted by Lehigh in Washington, D.C.
For Siemens, which aims to incorporate new control and optimization algorithms into its software and hardware platforms, partnering with Lehigh bridges academic innovation and industrial application, and offers access to emerging technologies.
“They focused on continuously improving their monitoring and automation tools across diverse industrial applications,” says Putri. “By collaborating with universities like Lehigh, they can bring in new ideas and research that haven’t yet been implemented in the real world.”
Such advances could translate to significant energy savings. According to the International Energy Agency, data centers already account for about 1.5 percent of global electricity consumption—a figure expected to climb as AI demand grows.
“That's why we want to use a microgrid instead of relying solely on the main grid,” says Putri. “We plan to incorporate solar panels, wind turbines, and energy storage systems. By combining renewable generation with storage, we can save energy and reduce emissions.”
For Putri, the partnership offers an opportunity to address the unique challenges of modeling hyperscale data center microgrids at a time when the world is increasingly reliant on artificial intelligence.
“The design of these microgrids is not typical,” she says. “The load profile is different, the scale is massive. How can I automate that? How can I combine AI with microgrid design? These things have never been done before in the industry. I’m fascinated by how algorithms can control and optimize infrastructure to make it more resilient and sustainable—and that’s what makes this partnership so exciting.”
The experience will also give her a glimpse into her potential future. Still weighing a career in academia versus industry, Putri sees the fellowship as a chance to explore what it’s like to pursue her passion within a global company.
“When I think about going into industry, I think about companies like Siemens that prioritize research and development,” she says. “My passion is research—and if we can use what we’re learning here to make data centers greener and grids more stable, that would be amazing.”
Related Links:
- Lehigh University: Center for Advancing Community Electrification Solutions (ACES)
- Rossin College Faculty Profile: Farrah Moazeni
- Interconnected Critical Infrastructure Systems Engineering (CONCISE) Laboratory
- Rossin College Faculty Profile: Javad Khazaei
- Integrated, Resilient, and Intelligent Energy Systems (INTEGRITY) Laboratory
- Lehigh News: "Lehigh Hosts National Workshop on Rethinking PhD Training in Washington, D.C."
- Lehigh University: Institute for Cyber Physical Infrastructure and Energy
- I-CPIE News: "Doctoral Student Saskia Putri Optimizes Interconnected Water and Power Systems"