How Much Energy Does ChatGPT’s Newest Model Consume?
- The energy consumption of the newest version of ChatGPT is significantly higher than previous models, with estimates suggesting it could be up to 20 times more energy-intensive than the first version.
- There is a severe lack of transparency regarding the energy use and environmental impact of AI models, as there are no mandates forcing AI companies to disclose this information.
- The increasing energy demands of AI are contributing to rising electricity costs for consumers and raising concerns about the broader environmental impact of the tech industry.
How much energy does the newest version of ChatGPT consume? No one knows for sure, but one thing is certain – it’s a whole lot. OpenAI, the company behind ChatGPT, hasn’t released any official figures for the large language model’s energy footprint, but academics are working to quantify its energy use per query – and their estimates are considerably higher than for previous models.
There are no mandates forcing AI companies to disclose their energy use or environmental impact, so most do not offer up those kinds of statistics publicly. As of May of this year, 84 percent of all large language model traffic was conducted on AI models with zero environmental disclosures.
“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she continued.
Sam Altman, the Chief Executive Officer of OpenAI, has thrown some figures into the public sphere – saying that ChatGPT consumes 0.34 watt-hours of energy and 0.000085 gallons of water per query – but he has left out key details, such as which model these numbers refer to, and has offered no data to back up or corroborate his statements.
Experts from outside the OpenAI fold have estimated that ChatGPT-5 may use as much as 20 times more energy than the first version of ChatGPT, and at the very least uses several times more. “A more complex model like GPT-5 consumes more power both during training and during inference. It’s also targeted at long thinking … I can safely say that it’s going to consume a lot more power than GPT-4,” Rakesh Kumar, a professor at the University of Illinois whose current work focuses on AI’s energy consumption, recently told The Guardian.
While a query to ChatGPT in 2023 would have consumed about 2 watt-hours, researchers at the University of Rhode Island’s AI lab found that ChatGPT-5 can use up to 40 watt-hours of electricity to generate a medium-length response (around 1,000 tokens). On average, they estimate that the model uses slightly over 18 watt-hours for such a response. That puts ChatGPT-5’s energy consumption above every other AI model they track save two: OpenAI’s o3 reasoning model and DeepSeek’s R1.
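These per-query figures are easiest to grasp when scaled up. The back-of-envelope arithmetic below is a minimal sketch: the one-billion-queries-per-day volume and the ~30 kWh/day household figure are illustrative assumptions, not numbers reported by OpenAI or the University of Rhode Island lab; only the watt-hours-per-query values come from the estimates cited above.

```python
# Back-of-envelope comparison of the per-query energy estimates cited above.
# The daily query volume and household figure are assumptions for illustration only.

QUERIES_PER_DAY = 1_000_000_000  # assumed volume, not an OpenAI-reported figure
HOUSEHOLD_KWH_PER_DAY = 30       # rough average daily use of a U.S. home

estimates_wh_per_query = {
    "Altman's unattributed figure": 0.34,
    "ChatGPT circa 2023": 2.0,
    "ChatGPT-5 average (URI estimate)": 18.0,
    "ChatGPT-5 upper bound (URI estimate)": 40.0,
}

for label, wh in estimates_wh_per_query.items():
    mwh_per_day = wh * QUERIES_PER_DAY / 1_000_000          # watt-hours -> megawatt-hours
    households = mwh_per_day * 1_000 / HOUSEHOLD_KWH_PER_DAY
    print(f"{label}: {mwh_per_day:,.0f} MWh/day, "
          f"roughly the daily electricity use of {households:,.0f} U.S. homes")
```

Change the assumed query volume and the absolute numbers shift, but the more than 50-fold gap between Altman’s figure and the URI lab’s average estimate stays the same – which is exactly why researchers are pressing for disclosure.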
Calculating these estimated energy consumption rates was no easy feat, considering the severe lack of transparency in the sector, in spite of increasing scrutiny. “It’s more critical than ever to address AI’s true environmental cost,” University of Rhode Island professor Marwan Abdelatti told The Guardian. “We call on OpenAI and other developers to use this moment to commit to full transparency by publicly disclosing GPT-5’s environmental impact.”
While tech companies consume more and more energy each year to power their AI ambitions, ordinary consumers are suffering the consequences – and footing the bill for skyrocketing energy usage. The New York Times warns that “electricity rates for individuals and small businesses could rise sharply as Amazon, Google, Microsoft and other technology companies build data centers and expand into the energy business.” Moreover, Silicon Valley’s backtracking on climate pledges will directly impact global communities, whether or not they ever use AI.
"We are witnessing a massive transfer of wealth from residential utility customers to large corporations—data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure," Maryland People's Counsel David Lapp recently told Business Insider.
"Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”
By Haley Zaremba for Oilprice.com
bioEnergy Launches Modular Waste-to-Energy Microgrids for AI & Crypto Power Needs
bioEnergy Development Inc. announced the commercial launch of a containerized waste-to-energy microgrid solution capable of delivering approximately 1 GWh of dispatchable electricity. The system, tested on wood residues and cattle manure, targets energy-intensive AI, high-performance computing (HPC), and crypto mining operations facing grid constraints.
Context
The company’s “waste-to-watts” model positions its bioReactor technology as a rapid-deployment alternative to emerging micro nuclear power. Each factory-built 40-foot unit is designed to be mobile, stackable, and commissionable in months rather than years. By turning biowaste into syngas, biochar, and high-grade carbon products, the system provides both clean power and durable carbon sequestration—appealing to tech giants with public net-zero commitments.
The bioreactors generate multiple revenue streams:
- Syngas for behind-the-meter or grid export power
- Biochar for soil performance and carbon removal credits
- Biocarbon for industrial carbon replacement products
Excess biochar, blended with bio-stimulants, is aimed at the high-value carbon credit market, a sector attracting corporate buyers seeking verifiable, durable offsets.
Strategic Significance
For AI and HPC operators, the proposition is speed, cost efficiency, and resilience. The company claims operating costs are just 5–10% of micro-nuclear power generation, with the added advantage of on-site fuel security and islandable microgrid capability. The technology also eliminates waste streams from power production and supports decarbonization via methane avoidance and diesel displacement.
Market Outlook
bioEnergy is actively seeking partnerships with hyperscale data centers, colocation providers, and industrial operators for co-location pilots and long-term offtake agreements under a “power-as-a-service” or joint-venture model. Typical configurations range from 1.5 MWh to 1 GWh.
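To put the quoted energy figures in context, the sketch below converts the 1.5 MWh and 1 GWh configuration sizes into runtime for a constant data-center load. The 10 MW facility size is a hypothetical assumption for illustration; it is not a figure from the announcement.

```python
# Rough conversion of the quoted configuration sizes into runtime for a constant load.
# The 10 MW facility size is a hypothetical assumption, not from the announcement.

def runtime_hours(energy_mwh: float, load_mw: float) -> float:
    """Hours a given energy reserve can carry a constant load."""
    return energy_mwh / load_mw

ASSUMED_LOAD_MW = 10.0  # hypothetical mid-sized AI/HPC facility

for config_mwh in (1.5, 1_000.0):  # the 1.5 MWh and 1 GWh configurations quoted above
    hours = runtime_hours(config_mwh, ASSUMED_LOAD_MW)
    print(f"{config_mwh:g} MWh at a constant {ASSUMED_LOAD_MW:g} MW load: "
          f"about {hours:g} hours of runtime")
```

Under that assumption, the 1 GWh configuration works out to roughly four days of continuous supply for such a facility, while the smallest units look closer to short-duration bridging power – consistent with the islandable-microgrid framing above.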