August 26, 2024
Source: Originally published by Z. Feel free to share widely.
From activities as simple as searching the Internet or sending an email to artificial intelligence (AI), digital life depends on data centers. These are sizable factory-like – or better, warehouse-like – physical structures that house computers and other hardware.
Often camouflaged as ordinary-looking buildings, they contain the vital computing infrastructure that IT systems require. They process each and every email and every click on a website, at the very least.
These repositories house servers, data storage drives, and network equipment, and they store the digital data of companies and private individuals alike.
These corporate data centers, often regarded as the ‘brain of the internet’, hold your bank data, your Amazon shopping list, and much more. Like the human brain, they need a lot of energy. And when spiced up with artificial intelligence, they need even more.
The electricity that these roughly 11,000 global data centers use ultimately generates heat, which must be removed.
To do this, data centers must use energy – a lot of energy – for their cooling equipment. Put simply, both running and cooling a data center consumes a lot of energy. Internet leader USA has about 5,400 data centers.
The USA is followed – at a considerable distance – by Germany with 522, with Frankfurt outshining the rest of the country.
Next are the UK with 517, China with 449, Canada with 336, France with 315, Australia with 306, Netherlands with 300, Russia with 255, Japan with 219, Italy with 168, Mexico with 166, Brazil with 163, India with 152, Poland with 143, etc.
In the debate over whether data centers are energy guzzlers or rescuers of the climate, they currently seem to fall on the side of creating energy demand.
Yet nowadays nothing works without a supercomputer – not even something as simple as reading an online article like this one. Beyond that, artificial intelligence is making things worse.
To put AI into perspective, a single ChatGPT query requires about 2.9 watt-hours of electricity, while a Google search uses about 0.3 watt-hours. And it is not just AI: the Internet’s overall electricity demand is estimated at between 84 and 143 gigawatts.
By comparison, Switzerland uses less than half of that. In other words, the Internet could power two Switzerlands. Globally, the Internet uses between 3.6% and 6.2% of all electricity. AI will crank up these numbers.
In other words, artificial intelligence is causing huge CO2 emissions. Therefore, scientists advocate a so-called ‘life cycle assessment’ of AI’s energy consumption.
The boom in AI – especially since the introduction of ChatGPT at the end of 2022 – requires more and more data centers to be able to handle these processes. And in turn, they use more and more electricity. But that is just one side of the story.
On the other hand, AI can also be used to reduce the environmental and climate impact in many areas – including the energy industry.
The goal of such a technology assessment is to ensure that the positive aspects, such as energy savings, prevail and that negative developments, like the stratospheric energy use of data centers and AI, are slowed down.
Intelligent software applications have been used for many years – from sorting out spam emails to route planning in navigation systems, product recommendations based on previous online purchases, and even compiling favorite playlists based on listening habits.
However, AI has accelerated this sharply in recent years. Meanwhile, the power requirement of an AI query is between three and ten times as high as that of a simple Google search.
One of the reasons is that machine-learning models use very large amounts of data to train themselves. Worse, the overarching AI maxim is: the more data used, the better the outcomes.
As a consequence, AI also requires parallel computing processes – bigger machines with more powerful microprocessors that demand more energy. Virtually all of this means an increase in the power consumption of Internet servers located in data centers.
Unsurprisingly, a ChatGPT query consumes a multiple of the electricity of a normal Google search – perhaps as much as three times.
The International Energy Agency (IEA) in Paris even calculated a factor of ten. If AI functions also make their way into normal office applications such as text and image editing programs, their power requirements will increase significantly – during both the training and the operation of the AI systems.
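Taking the per-query figures cited above at face value, the arithmetic behind the larger factor is straightforward:

\[ \frac{2.9\ \text{Wh (ChatGPT query)}}{0.3\ \text{Wh (Google search)}} \approx 9.7 \approx 10 \]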
The training of ChatGPT (version 3) alone is estimated to have emitted around 500 tons of the greenhouse gas CO2. By 2026, AI may well be one of the main drivers of electricity consumption.
At the end of 2023, there were more than 8,000 data centers worldwide – most of them in the USA, followed by Germany, the United Kingdom and China as mentioned above.
Currently, their consumption is estimated at 2% to 3% of global electricity production. This could increase from 460 terawatt hours to up to 1,050 terawatt hours by 2026.
That increase alone is higher than the total electricity consumption of an energy-hungry country like Germany (516 terawatt hours in 2023).
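Spelled out with the figures just cited, the projected growth alone would exceed Germany’s entire annual consumption:

\[ 1050\ \text{TWh} - 460\ \text{TWh} = 590\ \text{TWh} > 516\ \text{TWh} \quad (\text{Germany, 2023}) \]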
On AI’s extraordinary electricity consumption, the energy market experts at the US investment bank Goldman Sachs, for example, expect that the boom in data processing will increase the demand for electricity more than the switch to electric mobility in transportation.
The IEA calls the recent boom in data centers – that has been triggered by AI – a challenge for the power system.
Energy historian Daniel Yergin called it a hungry caterpillar that generates an insatiable demand for energy. In fact, the expansion of the electricity networks is hardly keeping pace with the skyrocketing demand for server capacity in many places.
In the US state of Virginia, for example – one of the global hotspots of the server industry, with around 500 of these giant facilities – a temporary freeze on new data-center developments has therefore already been imposed.
Meanwhile, growth could also reach its limits in Germany’s number one region for data centers: the greater Frankfurt area, home to a major European Internet exchange and around 80 data centers.
Data centers there already account for around 20% of electricity consumption today, and the demand for additional capacity remains high.
German industry representatives complain that waiting times for a grid connection in Germany are too long, with some stretching to five years.
German network operators have a lot to do, because Germany’s transition to green energy is also increasing the demands on the grid. An ever-stronger power grid is badly needed as more heat pumps and charging stations for electric cars come online.
Quite apart from AI’s massive energy consumption, there are also positive potentials associated with the transition towards AI. For example, it can enable a more efficient use of renewable energy – and there are further benefits that come with using AI.
AI algorithms can be used, for example, to manage fluctuating wind and solar power more efficiently.
They also make it possible to optimize technical processes – for example, the production, maintenance, use, and recycling of products.
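As a minimal illustration of what such optimization can look like in practice – the forecast values and the simple selection rule below are illustrative assumptions, not a description of any particular system – flexible workloads can be shifted to the hours with the highest forecast share of renewable electricity:

```python
# Illustrative sketch: shift a flexible workload (e.g. an overnight batch job)
# to the hours with the highest forecast share of renewable electricity.
# The forecast values below are made up for demonstration purposes.

forecast_renewable_share = {  # hour of day -> forecast share of wind/solar in the mix
    0: 0.35, 1: 0.33, 2: 0.31, 3: 0.30,
    12: 0.62, 13: 0.68, 14: 0.71, 15: 0.66,
    20: 0.40, 21: 0.38, 22: 0.36, 23: 0.35,
}

def greenest_hours(forecast: dict[int, float], hours_needed: int) -> list[int]:
    """Pick the hours with the highest forecast renewable share."""
    return sorted(forecast, key=forecast.get, reverse=True)[:hours_needed]

# Schedule a three-hour batch job into the greenest slots.
print(greenest_hours(forecast_renewable_share, hours_needed=3))  # -> [14, 13, 15]
```

Real systems replace the toy forecast with AI-based predictions of wind and solar output and schedule far larger loads, but the underlying idea is the same.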
AI could also contribute to saving energy and resources and promote the circular economy. Yet, it is still open for debate whether the positive effects compensate for AI’s disadvantages.
But there are also new approaches to measuring the climate footprint of AI. For this, three levels can be distinguished. First, there are the “direct effects” that can be attributed directly to digital technology – for example, the production and use of end devices, data lines, and data centers.
Second, there are the “indirect effects” related to the use of digital applications or AI. In the case of online shopping, for example, this concerns packaging and delivery; in the optimization of production processes, it concerns reduced energy requirements.
Third, there are the “systemic effects” that affect society as a whole, such as AI-supported changes in mobility behavior through car-sharing, and the changes in the world of work that come with them.
One might demand a more holistic life cycle assessment of AI applications. Society might not want to simply let technical development run its course; under free market capitalism, this can – very quickly and very easily – go in the wrong direction.
A technology assessment and regulation based on such considerations are indispensable. Undesirable AI developments should be detected at an early stage before they become uncontrollable.
In other words, a life cycle assessment of AI applications is much needed. With a classic life cycle assessment, researchers examine the entire life cycle of a product – from raw material extraction and production, to transport and use, and eventually to disposal.
This methodology can also be applied to digital applications, such as software and AI.
In this way, a reduction in energy consumption can then be organized in a second step. More specifically, scientists advocate that environmental product information should be provided with every digital service, for example in the form of a small data package with information on energy and resource consumption as well as emissions.
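As a sketch of what such a data package might contain – the field names, units, and figures below are illustrative assumptions rather than an existing standard, with only the roughly 500 tons of CO2 for training GPT-3 taken from the estimate cited above – it could look something like this:

```python
from dataclasses import dataclass, field

@dataclass
class LifeCycleStage:
    """One stage of a digital service's life cycle (illustrative fields)."""
    name: str                  # e.g. "model training", "inference", "hardware production"
    energy_kwh: float          # electricity used in this stage
    co2_kg: float              # associated greenhouse gas emissions
    water_litres: float = 0.0  # resource use, e.g. cooling water

@dataclass
class EnvironmentalInfo:
    """A small 'environmental product information' package shipped with a digital service."""
    service: str
    stages: list[LifeCycleStage] = field(default_factory=list)

    def total_co2_kg(self) -> float:
        # Sum emissions over all reported life cycle stages.
        return sum(stage.co2_kg for stage in self.stages)

# Hypothetical example: the training emissions mirror the ~500 t CO2 estimate for GPT-3
# cited in the text; all other numbers are purely illustrative.
report = EnvironmentalInfo(
    service="example-ai-assistant",
    stages=[
        LifeCycleStage("model training", energy_kwh=1_300_000, co2_kg=500_000),
        LifeCycleStage("inference, one year", energy_kwh=200_000, co2_kg=80_000),
    ],
)
print(f"{report.service}: {report.total_co2_kg():,.0f} kg CO2 over its life cycle")
```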
The advantage: whoever uses the service – especially companies and corporations – can then see its CO2 footprint and other environmental impacts.
Armed with that, the AI industry can take appropriate measures to improve these life cycle assessments and perhaps make AI environmentally sustainable.
Thomas Klikauer
Thomas Klikauer has over 800 publications (including 12 books) and writes regularly for BraveNewEurope (Western Europe), the Barricades (Eastern Europe), Buzzflash (USA), Counterpunch (USA), Countercurrents (India), Tikkun (USA), and ZNet (USA). One of his books is on Managerialism (2013).