

GUEST ESSAY

Is AI going to use too much energy? It’s a cost vs benefit question


The energy projections for the seemingly insatiable needs of AI seem to change as fast as they are published. (Image: Ideogram.ai)

By Steven Boykey Sidley

17 Sep 2024 

We need to ask: Is AI’s excessive energy use worth the value we receive after discounting for the risk?

Once upon a time, if an inventor or a company came up with some fabulous technology that promised to make the world a better place, there were a few things they had to think about, like where they were going to get the money to turn the invention into a product that people wanted to buy, how it could be made affordable, how they were going to tell everyone about it and how they were going to get it to the buyers.

One thing they never really had to worry too much about was where the energy was going to come from to make the product or keep it operating. I mean, yeah, they had to factor in energy costs and all, but energy was in abundant supply for new products and technologies – there was coal for trains, oil for vehicles and ships and aeroplanes, batteries for small appliances and a wide variety of energy delivery systems for national electrification. It was enough of a challenge for inventors and innovators to make the products and get them to market. At least they didn’t have to fret about the politics, the environmental impact or the scarcity of energy supplies.

Obviously, things have changed. Fossil fuels are yesterday’s solution for all the well-known reasons. Nuclear energy is expensive and (mostly) location-bound. Batteries are (in the words of that Tesla guy) “sucky”, and renewables are still toddlers.

OK, so let’s consider what has happened to the sparky business of invention during the past three or four decades when the information age became, well, everything and energy use became an important consideration in the public mind.


Projections about technology and energy use have long suffered from both poor methodology and alarmism. An article published in June 1999 by Forbes (“Dig more coal – the PCs are coming”) reported that it took a “pound of coal to move 2Mb” and went on to sound energy alarms at klaxon decibels. A report from the Boston Consulting Group as late as 2004 proclaimed grimly that the internet used 10% of the world’s electricity and was responsible for 2% of its carbon emissions, and then warned of worse to come.

There were (and still are) many of these projections in circulation, all equally dire, all suffering from the same intrinsic flaw. They generally conclude that “at the current rate of growth, the following catastrophes will befall us…” This approach has always relied on a linear projection of too few variables applied to an ecosystem that is wildly multivariate. The “current rate of growth” is an assumption resting on thin ice.
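To make the flaw concrete, here is a toy sketch with entirely made-up numbers (the growth and efficiency figures are illustrative assumptions, not data from any of the studies mentioned): extrapolating demand growth while freezing efficiency produces a runaway curve, while letting energy per computation keep falling, as it historically has, produces almost no growth at all.

```python
# Toy illustration (hypothetical numbers): a naive "current rate of growth"
# extrapolation vs. the same demand growth with steadily improving efficiency.

naive = 1.0       # relative energy use if efficiency is frozen at year-zero levels
adjusted = 1.0    # relative energy use if energy per computation keeps falling

demand_growth = 0.30     # assume computation demand grows 30% a year
efficiency_gain = 0.25   # assume energy per computation falls 25% a year

for year in range(10):
    naive *= 1 + demand_growth
    adjusted *= (1 + demand_growth) * (1 - efficiency_gain)

print(f"After 10 years: naive projection {naive:.1f}x, efficiency-adjusted {adjusted:.1f}x")
# With these made-up numbers the naive projection shows a ~14x blow-up,
# while the efficiency-adjusted curve ends up slightly below where it started.
```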


In any event, beyond the internet, poorly thought-out energy consumption extrapolations continue to find their way into the literature and, worse, into the popular press. Cloud computing was the next technology tipped to threaten energy stability, except that we decided, as a digitally connected global society, that we liked our always-on apps more than we worried about their energy consumption.

Then came Bitcoin and its crypto-ilk, which were also slammed by a very vocal cohort of energy-worriers. The alarmism about crypto energy use has, quite predictably, gone quiet over the past year or two. Not only have the mining hardware rigs become more energy efficient, but the Bitcoin mining industry is going green, not out of concern for the environment but because it has found renewable energy sources to be less expensive and more convenient than older energy generation technologies.

Which, of course, brings us to AI, at least in its most recent incarnation since the arrival of ChatGPT in 2022.

Generative AI, where much of the industry’s focus now lies, requires models to be trained on vast amounts of data using increasingly complex statistical tools, which drives up the cost of computation. OpenAI, Google, Meta, xAI, Anthropic and others need to continuously scale up their compute capacity to meet the demands of larger datasets and hungrier training regimes. Those who can’t commit the requisite computational resources will drop out of the race.

How scary are the energy projections for the seemingly insatiable needs of AI?

The numbers and projections seem to change as fast as they are published – the speed of innovation and the urgency of the race mean that nobody actually has a real handle on the numbers – but let’s look at two well-respected studies.

According to a Goldman Sachs report of 24 April 2024 (“AI, data centers and the coming US power demand surge”), data centres’ demand for energy in the US will increase from 2.4% of the country’s total energy production to 9% by 2030, most of it driven by the demand for AI computation.
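To get a feel for how steep that is, a quick back-of-the-envelope calculation (my own arithmetic, not a figure from the report, and assuming for simplicity that total US energy production stays roughly flat) converts that change in share into an implied annual growth rate:

```python
# Back-of-the-envelope: the annual growth implied by data centres going from
# 2.4% to 9% of US energy production by 2030, assuming (simplistically) that
# total production stays roughly flat over the period.
start_share, end_share = 0.024, 0.09
years = 2030 - 2024  # roughly six years from the report's publication

implied_annual_growth = (end_share / start_share) ** (1 / years) - 1
print(f"Implied growth in data centre energy demand: ~{implied_annual_growth:.0%} per year")
# -> roughly 25% per year, compounding
```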

Other projections are even more extreme. A recent Prudential Investment Management study projected AI-led energy use to exceed 20% of the world’s energy production by 2030.

Both of these are astonishingly steep increases, made more remarkable by the fact that data centre energy demand had been essentially flat for the previous 10 years.

All of the furious arguments about whether AI’s (seemingly insatiable) energy demands are worth the price that must be paid – environmental, geopolitical, risk, ROI, job losses – boil down to a cost vs benefit analysis. To put it more simply, do we believe that the excessive energy use of AI is worth the value we receive after discounting for the risk?

So, here is a pop quiz that will help you answer the question of whether AI will use too much energy:

Is AI going to cure cancer?

Is AI going to kill us all?

Is AI going to discover new mathematics and physics?

Is AI going to turbocharge misinformation?

Is AI going to increase food production?

Is AI going to help humans kill other humans at scale?

Is AI going to help educate our children?

Is AI going to destroy democracy?

Is AI going to make great art?

And… will AI invent transformative new energy technologies?

My answer to all of these questions is: Yeah, probably.

I hope that clears things up. DM

Steven Boykey Sidley is a professor of practice at JBS, University of Johannesburg. His new book It’s Mine: How the Crypto Industry is Redefining Ownership is published by Maverick451 in South Africa and Legend Times Group in the UK/EU, available now.
