By Dr. Tim Sandle
February 26, 2025

Brussels spent most of last year racing to greenlight the draft text after chatbot ChatGPT exploded onto the scene - Copyright AFP/File OLIVIER MORIN
The environmental impact of training and maintaining large language models like ChatGPT is significant.
A new report from Business Energy UK highlights the environmental impact of AI, with a focus on ChatGPT. The analysis indicates that the large language model uses 19.58 million gallons of water and 19.99 million kWh of electricity every day, enough to charge 4 million phones and flush 12 million toilets daily.
This equates to:
Electricity: 19.99 million kWh per day
– Enough to charge 4 million phones.
– Enough to run the Empire State Building for 270 days.
Water: 19.58 million gallons (71.14 million liters) per day
– Could fill 489,000 baths, enough for every resident of Colorado Springs, CO.
– Equivalent to 12 million toilet flushes, or more than the entire population of Belgium flushing their toilets at once.
Large language models are the algorithmic basis for chatbots like OpenAI’s ChatGPT and Google’s Bard. These models rely on billions, even trillions, of parameters, which can make them both inaccurate and too general for vertical industry use, and they demand the computational power of specialized hardware accelerators.

OpenAI’s ChatGPT fronted generative AI’s emergence into public consciousness – Copyright AFP Kirill KUDRYAVTSEV
Using data from the University of California and the Washington Post, which found that a single ChatGPT response consumes 519 milliliters of water (just over two cups) and 0.14 kWh of electricity, the researchers calculated the platform’s daily, weekly, monthly, and yearly resource consumption across its 28 million daily users.
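As a rough, illustrative check on how those per-response figures scale up to the quoted daily totals, the short Python sketch below back-calculates how many responses per user per day the report’s numbers imply. The variable names and the “responses per user” quantity are assumptions introduced here for illustration; neither appears in the Business Energy UK report or the underlying study.

# Illustrative sketch: back-calculate the scaling implied by the quoted figures.
# Per-response constants come from the University of California / Washington Post
# figures cited above; the daily totals and user count come from the report.
WATER_PER_RESPONSE_L = 0.519        # litres of water per ChatGPT response
ENERGY_PER_RESPONSE_KWH = 0.14      # kWh of electricity per ChatGPT response
DAILY_USERS = 28_000_000            # daily users cited in the report

DAILY_WATER_L = 71_140_000          # 19.58 million gallons per day, as quoted
DAILY_ENERGY_KWH = 19_990_000       # 19.99 million kWh per day, as quoted

# Responses per user per day that would reproduce the quoted daily totals
# (an inference made here, not a figure stated in the source).
implied_by_water = DAILY_WATER_L / (DAILY_USERS * WATER_PER_RESPONSE_L)
implied_by_energy = DAILY_ENERGY_KWH / (DAILY_USERS * ENERGY_PER_RESPONSE_KWH)

print(f"Implied responses per user per day (water):       {implied_by_water:.1f}")   # ~4.9
print(f"Implied responses per user per day (electricity): {implied_by_energy:.1f}")  # ~5.1

Both routes land at roughly five responses per user per day, which appears to be the scaling behind the daily totals, although the report does not state this explicitly.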
Furthermore, generative AI training clusters of this kind consume seven to eight times more energy than a typical computing workload. As an example, the water Microsoft used to cool its US-based data centers while training GPT-4 was enough to produce “370 BMW cars or 320 Tesla electric vehicles.” Temperature control of Microsoft’s servers relies heavily on water cooling, drawing from local rivers to keep the machines running at optimal temperatures.
Extrapolating these findings to annual consumption (a calculation sketched after the list below) reveals the following about the environmental impact of OpenAI’s ChatGPT:
Electricity: 7.23 billion kWh per year
– More electricity than 112 countries consume in a year.
– Enough to power every house in Wyoming (271,887 homes) for two and a half years.
Water: 7.14 billion gallons per year
– Enough to fill the Central Park Reservoir (roughly 1 billion gallons) seven times.
– Enough to run the Fountains of Bellagio shows for 595 years.
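The annual figures follow from a straightforward multiplication of the daily totals; the minimal Python sketch below reproduces that step using the article’s numbers. Note that a plain 365-day multiplier gives about 7.30 billion kWh, slightly above the 7.23 billion kWh quoted, so the original report presumably applied some rounding or a marginally different methodology.

# Illustrative extrapolation of the quoted daily consumption to a full year.
DAILY_ENERGY_KWH = 19_990_000    # 19.99 million kWh per day
DAILY_WATER_GAL = 19_580_000     # 19.58 million gallons per day
DAYS_PER_YEAR = 365

annual_energy_kwh = DAILY_ENERGY_KWH * DAYS_PER_YEAR   # ~7.30 billion kWh (report: 7.23 billion)
annual_water_gal = DAILY_WATER_GAL * DAYS_PER_YEAR     # ~7.15 billion gallons (report: 7.14 billion)

print(f"Annual electricity: {annual_energy_kwh / 1e9:.2f} billion kWh")
print(f"Annual water:       {annual_water_gal / 1e9:.2f} billion gallons")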
Such findings add to prevailing concerns about the environmental impact of AI, its massive computational power requirements, and the increasing energy demands of data centers.