Friday, May 08, 2026

AI data centres are fuelling climate destruction



The second effect, which is closely related to what is happening around the world, is that data centres might have their own on-site energy generation, usually diesel or gas generators. There is one such facility proposed for New South Wales. Cloud Carrier is planning a 700 megawatt gas-fired power station — the same size as the gas-fired power station that was just built in NSW, which is really wild. Of course, there is no guarantee that even half of that capacity gets built, but it does show you the lack of any real environmental concern from operators and developers.

On-site gas power is a trend we have seen in the United States and other parts of the world. You can also build on-site solar and batteries, but it is not something we are seeing a lot of, and when they do get built they tend to supply only a small proportion of a facility’s consumption. NEXTDC, for instance, has a solar energy system on their roof and it provides 0.17% of their total energy consumption. This is not to say that solar is bad, just that these facilities consume far more energy than a rooftop installation can provide.

The third thing, which I think is probably the most interesting, is when a data centre says: “We want to use our purchasing power, as a big customer for energy, to do a deal with a renewable energy facility and facilitate its construction.” There are a bunch of different problems with this.

First, a lot of operators will do this by purchasing certificates known as renewable energy credits. Basically, when a solar farm produces energy, it can sell a certificate representing that output. An energy user then buys that certificate and says they bought clean power. About a decade ago this helped: renewable energy companies used the certificates as an incentive for investors to provide money to set up wind or solar farms. But renewable energy has become significantly cheaper, so those additional funds are no longer required. Today, all you are really doing is entering into a power deal with a power station, which has no effect on emissions reduction in the grid.
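To make the accounting trick concrete, here is a minimal sketch, with made-up illustrative numbers, of the difference between what a grid physically emits and what a certificate buyer gets to report. The figures and function names are mine, not from any company’s reporting.

```python
# Illustrative sketch: physical vs certificate-based emissions accounting.
# All numbers are hypothetical.

GRID_INTENSITY = 0.7  # tonnes of CO2 per MWh for the grid mix (assumed)

def location_based(consumption_mwh: float) -> float:
    """Emissions the grid physically produced to serve this consumption."""
    return consumption_mwh * GRID_INTENSITY

def market_based(consumption_mwh: float, certificates_mwh: float) -> float:
    """Paper accounting: each certificate 'cancels' one MWh of consumption."""
    uncovered = max(consumption_mwh - certificates_mwh, 0.0)
    return uncovered * GRID_INTENSITY

consumption = 100_000   # MWh used by a hypothetical data centre in a year
certificates = 100_000  # MWh of renewable energy certificates purchased

print(location_based(consumption))              # 70000.0 tonnes, physically caused
print(market_based(consumption, certificates))  # 0.0 tonnes, on paper
# The grid burned the same coal and gas either way; only the report changed.
```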

When a company says they will reduce emissions from grid consumption to zero, you have to ask: what did they actually do here? It is a very weak claim and matches a lot of the debate we have seen around carbon offsetting, where there is an overly inflated sense of progress, with companies saying they are carbon neutral or have zero emissions, when in reality they are just purchasing greenwashing products. Renewable energy credits and power purchasing deals are not as bad as carbon offsetting, but they still have a lot of problems.

For example, a few days ago Amazon announced eight new projects involving power purchasing deals. A Guardian journalist asked how many of these are new facilities, to which Amazon responded all but one. I looked at these projects, and there is one very large wind farm that accounts for about 60% of the total energy, and that wind farm is not new. It is the Golden Plains Stage Two wind farm in Victoria, developed by Tag Energy, which has been in development since the 2010s. Amazon cannot claim that the deal they are doing in 2026 is the thing that got this project over the line in 2022. But Amazon will take the wind farm’s output, set it against their grid consumption — which, as we know, makes coal and gas plants generate more electricity and cause more emissions — and say that is zero emissions. This, even though the wind farm would have gone ahead had Amazon not done this deal.

That has a huge effect across power systems, and across society and the discourse about climate. It ends up hugely overstating companies’ success. When you look at the emissions numbers for the power sector in Australia, they consistently come in higher than previous projections and much higher than targets. The power sector is meant to be doing well, but in fact it is getting worse because demand is increasing so significantly.

The idea that data centres will become a huge engine of procurement for renewables is partially true, but it is important to look at the system effect, not just at individual projects. If you are a fan of renewables you could be happy that new projects are being built. But you have to stop and think about the purpose of these projects. If the goal of renewable energy is to reduce emissions and get rid of fossil fuels from the power grid, then you cannot be satisfied with the scraps from the table. It is in our best interests to take a systemic view and move beyond short-term thinking.

It is great if Amazon wants to fund renewable energy — no one is stopping them. The problem comes when they claim they have no emissions impact. That is not true, and can be disproven using their own data. If you want a clean, safe power system, it does not help to double renewable energy capacity while increasing demand five-fold. That is a net worsening of the system.
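The arithmetic behind that claim is worth spelling out. A minimal sketch, with illustrative numbers of my own choosing, treating fossil generation as whatever demand the renewables do not cover:

```python
# Hypothetical grid: fossil plants supply whatever demand renewables don't cover.
demand = 100.0      # TWh per year, illustrative
renewables = 30.0   # TWh per year, illustrative

fossil_before = demand - renewables         # 70 TWh from coal and gas

# Double renewable output, but let demand grow five-fold:
fossil_after = 5 * demand - 2 * renewables  # 500 - 60 = 440 TWh

print(fossil_before, fossil_after)  # 70.0 440.0: fossil generation rises ~6x
# More renewable projects get built, yet the system burns far more fuel overall.
```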

What about impacts on water supply?

I am not an expert in water consumption, but there are a few things I have to note because there is much overlap with the energy and climate issues I work with.

First is what I would describe as “tactically vague promises”. A data centre needs some way of cooling: graphics processing units and other computing hardware run hot and wear out very quickly, so you need some way to evacuate heat from the centres. The International Energy Agency reported that one server rack evacuates the heat of 30–50 gas boilers. A server rack is the size of a fridge, can weigh as much as a pickup truck, and consumes the same amount of energy as 60 households.
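As a rough cross-check of that household comparison, here is a back-of-envelope sketch. The average household draw is my assumption, not an IEA figure, and real values vary widely by country:

```python
# Back-of-envelope check on the "60 households" comparison.
household_avg_kw = 0.5            # assumed average draw (~4,400 kWh/year)
rack_kw = 60 * household_avg_kw   # implied rack power: 30 kW

hours_per_year = 8760
rack_mwh_per_year = rack_kw * hours_per_year / 1000

print(rack_kw, rack_mwh_per_year)  # 30.0 kW, ~263 MWh per year for one rack
# Modern AI server racks are commonly quoted at tens of kilowatts and up,
# so the comparison lands in a plausible range.
```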

There are two ways you can cool a server. You can cool the air, with a lot of air conditioning, or you can have water flowing through, ensuring the heat is transferred into that water. To do the latter, you have to suck up water from a river, temporarily seize it in pipes in the centre where it heats up, and then release it back into the river. So, it comes in cold and goes out hot, and generally with a bunch of contaminants from the pipes.
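To get a feel for the volumes involved, a minimal heat-balance sketch. The rack heat load carries over from the estimate above, and the temperature rise is an assumption; both are for illustration only:

```python
# How much river water does one rack's heat require?
# Q = m_dot * c * dT  =>  m_dot = Q / (c * dT)
rack_heat_w = 30_000        # ~30 kW of heat, from the estimate above
c_water = 4186              # specific heat of water, J per kg per K
temp_rise_k = 10            # water leaves 10 C hotter than it arrived (assumed)

kg_per_second = rack_heat_w / (c_water * temp_rise_k)
litres_per_day = kg_per_second * 86_400   # 1 kg of water is about 1 litre

print(round(kg_per_second, 2), round(litres_per_day))  # ~0.72 kg/s, ~62,000 L/day
# That is a single rack; a large facility houses thousands of them.
```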

It is important to be clear about what water consumption means. For example, when you take a lump of coal and turn it into energy, the energy is not destroyed, but that lump of coal is not recoverable in the way that water coming out of a pipe is recoverable. However, it does mean the water is temporarily seized in a facility. If a lake requires water for the local community or ecosystem, the amount in that lake is reduced. It also comes out as a different substance, hotter and with contaminants. It is less like resource consumption and more like resource change. People tend to downplay the impact of water consumption, but it is no better than consuming vast amounts of electrical energy.

There has been a meme going around that water issues are “fake”. OpenAI CEO Sam Altman said at a conference in India that he had been reading a blogger who tries to present water consumption issues as fabricated. We know they are not fabricated because of the companies’ actions. They go to great lengths to hide water consumption information: they do not report it by facility, they do not report it in a way that can be truly scrutinised, and they will fight in court to avoid having water information disclosed. Google, quite famously, fought like mad to prevent disclosure of water consumption information.

Second is what has been described as “bluewashing”, which means presenting water consumption in a way that downplays its impacts, using verifiably absurd techniques. One example is that they will consume water in one spot, where it comes out hotter and contaminated and has some material impact on an ecosystem or community. Then, the company will invest in a water treatment facility in a different town. It is very literally water offsetting: they are applying the exact same broken logic behind carbon and energy offsetting. We also see this with biodiversity offsets, of which Australia is one of the worst perpetrators.

There was a story posted by Oilfield Witness about xAI, Elon Musk’s Grok chatbot facility. He built a couple of new fossil fuel power stations to run his data centre, which is polluting a marginalised community in Memphis. My calculations show that data centre’s emissions almost undo the total emissions savings from Tesla’s deployment of electric cars. This gives you a sense of the scale of this data centre. Musk promised a new water treatment facility, but it just has not happened. Corporations know the water issue is not fake. They know it is in their interest to cover up any information they are forced to disclose with false environmental fixes.

In Australia, there is this idea of a closed loop system, in which water only needs to be taken in once and then circulates through the system in perpetuity. Every time I see this, I ask: “What proportion of planned data centre projects in Australia have this system?” I never get a response to that question. I know I will never get a response; I will simply get blocked or ignored. You would think that if you were putting that forward as a solution, then you would know the answer to that question. If the answer is one out of 100, that is bad, because you are overstating it as a solution. If it is 100 out of 100, then that is good. But I have a pretty good idea of why I never get a response: they want to present it as a potential solution.

A report I did recently went through greenwashing tactics, and this fits into the “perpetual promise” category, along with carbon capture and storage, small modular nuclear reactors, geoengineering, and data centres and solar panels in space. They are not a real technological shift; they are just ideas that will never be carried out.

The water issue is really important on its own merits, and the way the industry is covering up and masking its own issues shows they know this is not a made-up or manufactured concern.

These data centres are being constructed to facilitate the rapid growth in AI. Is a ‘green’ or sustainable AI even possible?

I see it in the same category as carbon capture and storage, in that you could imagine — if you sit down and strain your brain for a little while — that it is a possibility. But I think that you should not toy with that until you have examined the systemic reasons why it is not happening.

The International Energy Agency (IEA) hinted in its latest report that there are fundamental financial pressures and incentives stopping this industry from becoming anything close to efficient in terms of energy and climate. It has been developed so that you overuse it and apply it to things it does not make sense for, things that it is constantly failing at.

For example, chatbot systems, in whatever version is being deployed at the moment, have been designed from the ground up to be excessively inefficient. When you put a prompt in, you often get a three-paragraph response to what could have been a “yes” or a “no”, which consumes significantly more energy. The companies are forcing the product into every single corner of our digital existence. There is even a push for coders to be given targets for the amount of energy they consume. There is no other industry where maximising energy waste has been a target for performance — it is astonishing.
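The verbosity point can be made concrete with a toy model in which generation energy scales with the number of output tokens. The per-token figure below is purely an assumption for illustration; real values vary by model and hardware:

```python
# Toy model: generation energy scales roughly with output length.
JOULES_PER_OUTPUT_TOKEN = 3.0   # assumed figure, for illustration only

def response_energy(output_tokens: int) -> float:
    """Estimated energy in joules for generating a response."""
    return output_tokens * JOULES_PER_OUTPUT_TOKEN

terse = response_energy(1)      # a bare "yes"
verbose = response_energy(300)  # a three-paragraph reply

print(terse, verbose, verbose / terse)  # 3.0 900.0 300.0
# The same question answered either way: a 300x difference in generation
# work, coming purely from how the product is designed to respond.
```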

The latest two big things in terms of energy overconsumption are video generation, which is very, very energy intensive — many times more so than text — and agents, which is a ridiculous word to describe a chatbot that you tell to continue chatbotting when you are not looking at it. The same goes for coding, which is text generation on a large scale and likewise uses a large amount of energy.

All this is a marker of a systemic and potentially incurable problem with the entire concept of this software system. It is much more like Bitcoin than streaming: the inefficiency is baked in by design, and if it were not inefficient it would not be the thing it is today. If it were not inefficient, you might have a side tool buried in a menu in Microsoft Word that allows you to generate text. You would not have a conversation interface or have it crammed into every piece of software.

Most importantly, you would not have the ideology of technology leaders that the more energy you consume the smarter it gets, so the only way to reach a hypothetical intelligence is to use more energy. That is all bullshit; it is never going to happen, even if you were to increase energy consumption a thousand-fold.

So, yes, I can imagine a way in which AI could have been a lean and effective tool built with constraints, but it has been developed as the exact opposite. To imagine a green AI, or a green chatbot system, we would have to imagine something fundamentally different to what we recognise as AI today. If AI were actually to help with the climate emergency, you would not recognise it as the same product.

Is AI a bubble that, at some point, will pop?

I have two hot takes on this.

There is a bit of truth to that, particularly with the megaprojects we are seeing in the US. There is one called the Donald J Trump Advanced Intelligence Campus, which has an 8 gigawatt fossil gas power plant, along with some fancy nuclear technology. The CEO stepped down a few weeks ago, the share price plummeted, and the whole thing is falling to pieces, as you might expect. But I described this in a recent piece as throwing spaghetti at a wall to see what sticks. And something does stick.

Even if just 10% of those projects go ahead — which is what the IEA estimates — that is 10% of a very large number of gigawatts of new gas capacity and its emissions. You can imagine looking at that pipeline, seeing 90% of it vanish, and thinking “wow, the bubble popped”. But I look at it as several extra gigawatts of fossil gas power that will now burn, even if the data centres connecting to them are shut down. So you have a gas boom either way; the only question is how big the gas boom will be.

I am worried that people will see the “bubble popping” and move on, despite there being a lot to deal with, including a lot of new fossil fuel infrastructure, gas pipelines, gas-fired power stations, plus the addition of new demand to the power grid.

My second point is that the level of demand will not reduce that much. AI is being baked into society, the same way that SUVs or single-use plastics are. Once you bully and force people enough into using something, they kind of just use it. Even though they do not like it, the effort of avoiding it becomes greater than just going along with it. This is totally understandable, particularly in this period where everyone is exhausted all the time. This is a way for tech companies to ensure continued demand growth for something that nobody ever asked for.

I suspect there will be a shift to more intensive energy uses, such as agents and video production, with about 10% of users consuming 90% of the energy. Everything else will just be AI slop pummelled through digital spaces, and that will be how the industry avoids a big bubble pop. They want to become an industry that stays, in the same way that the petrochemical and fossil fuel industry correctly recognised that having a wasteful and inefficient food packaging and transport system was a great way to make money for the next five decades. This industry knows that doing the same with information will be profitable, even if it comes with rising fascism, war and climate denial. They want to stand between us and human-created information, even if it is wrong and energy-intensive to do so.

People compare it to other tech fads such as NFTs or Bitcoin, but Bitcoin power consumption was higher in 2025 than it has ever been, probably more than AI energy consumption globally, and probably entirely from fossil fuels. We do not talk about this because the Bitcoin “bubble popped”, but it did not pop; it just became more and more inefficient to get the same amount of money. Even though it is not a payment system used worldwide, and it is not in the news anymore, it is still doing vast amounts of climate damage.

I do not want generative AI to become the same thing. I want as much pressure on these industries as possible now, while it is in the media. Once it is not in the media, it will become much harder to get effort and attention paid to scrutinising these companies.

Is Chinese AI the Remedy to Inequality?


May 8, 2026

Photo by Bo Peng

The world is leaning on China a lot these days as a counterweight to the lunacy of Donald Trump. No one has illusions that China and its leader, Xi Jinping, are champions of democracy, but at a time when the US president is gleefully bombing boats and countries, and debating which regime to change next, China is an island of sanity.

I’ve made the analogy to Stalin in World War II, which continues to be appropriate. Roosevelt and Churchill had no illusions about Stalin’s USSR as a beacon of democracy, but they understood the essential role it played in defeating Hitler. China can play a similar role in protecting the world from the craziness emanating from the White House.

AI is one area where its role may prove to be extremely important. There have been many hugely overblown stories about how AI is going to take all the jobs and leave the rest of us unemployed and destitute.

This is an old theme about technology. Those of us who lived through the tech boom in the 1990s recall similar stories back then. There was even a boom in stories of technology-driven mass unemployment in the 1950s and 1960s; a famous novel of the time envisioned such a world in the not-too-distant future. The fear that a new technology, in this case AI, will take all the jobs is not a new one.

Even if the prospect of mass unemployment is unlikely, there is a real concern that it will lead to even greater levels of inequality. Just to be clear, it is not the technology that creates inequality; it is the laws that govern its use. It’s unlikely that people would be making big fortunes on AI if the government didn’t grant patent and copyright monopolies to its developers.

But let’s leave that issue aside for a moment. The story of mass inequality is one where the AI makers are selling a product of enormous value that displaces millions of workers, including relatively highly paid workers. As a result, they can command huge profits from their AI.

Clearly, there is some validity to this story in that AI can displace labor in many areas, some of it highly paid. For example, AI can do much of the work in preparing legal briefs that is now done by lawyers. It’s not clear that AI will, on net, reduce the demand for lawyers, but it can substantially increase the productivity of lawyers.

But the fact that AI can lead to large gains in productivity doesn’t necessarily make the AI companies rich. That depends on the extent to which competition brings the price down.

To take an earlier technology, Dell is the largest manufacturer of computers in the United States. It is a successful and profitable company. Its market capitalization is less than $140 billion. That’s a good chunk of money, but less than 1/30th of Nvidia’s $4.8 trillion market capitalization.

The fact that the PC is an incredibly useful product that has hugely increased productivity has not meant that PC makers would get immensely rich and dominate the economy. The reason is that competition, even with weak antitrust enforcement, has forced down the price, so that the benefits have largely gone to consumers.

This is where the Chinese AI makers come in. While the leading US makers may still be somewhat ahead by many measures, the Chinese companies can make AI products that likely meet most users’ needs available at a fifth, a tenth, or even less of the price charged by the leading US companies.

For this reason, Chinese AI is beating out US AI in adoption through much of the world. Apparently, Chinese AI is even gaining many customers in Silicon Valley, both because of its lower price and because it is open source, which means companies can alter it to fit their needs. This also means that a company can run the Chinese AI on its own systems and doesn’t have to turn over control of sensitive company data.

This Chinese competition is a huge deal not only for bringing AI prices down, but also for preventing fascist clowns like Elon Musk from getting endless money. While Musk may always be insanely rich, if investors ever learn arithmetic and value his companies based on their profits, he will have far less money. (Tesla has a price-to-earnings ratio of 360. If it had a more normal, but still high PE of 20, Musk’s stake would be worth a bit more than 1/20th its current value.)
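The parenthetical arithmetic, spelled out as a sketch (the PE figures come from the text; indexing the stake to 100 is just for illustration):

```python
# Revaluing a stake when the price-to-earnings (PE) ratio compresses.
# At fixed earnings, price = earnings * PE, so value scales with PE.
current_pe = 360
normal_pe = 20

fraction_remaining = normal_pe / current_pe
print(fraction_remaining)   # 0.0555... = 1/18, i.e. "a bit more than 1/20th"

stake_now = 100.0           # index today's stake to 100 for illustration
print(stake_now * fraction_remaining)   # ~5.6, about 1/18th of current value
```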

We should have that conversation about intellectual property rules that make the Musks of the world ridiculously rich. We should also be changing rules on things like bankruptcy that allow private equity barons to get rich by buying companies and putting them into bankruptcy.

Unfortunately, we have not yet advanced to the point where we can have a serious discussion on the ways we structure capitalism to generate inequality. Perhaps one day we will, but until then, we should be thankful for Chinese competition.

This first appeared on Dean Baker’s Beat the Press blog.

Dean Baker is the senior economist at the Center for Economic and Policy Research in Washington, DC. 

Plutonium Pit Bomb Production: The Beginning of the End


May 8, 2026

Plutonium facility at Los Alamos Labs. Photo: Department of Energy.

One portion of a gargantuan plan to modernize the entire U.S. nuclear arsenal, costing $1.5 trillion over the next twenty years, has been opened for public scrutiny and comment beginning this week.

Thanks to years-long legal challenges by environmental and community groups in California, New Mexico and South Carolina, the National Nuclear Security Administration (NNSA) was ordered by a federal district court to reveal plans for the manufacture of plutonium “pits” at the Savannah River Site (SRS) in South Carolina and Los Alamos National Laboratory in New Mexico. Citing the National Environmental Policy Act of 1969 (NEPA), U.S. District Judge Mary Geiger Lewis of South Carolina found that NNSA had ignored NEPA’s statutes, and required the Department of Energy and its semi-autonomous nuclear weapons bureau, the NNSA, to produce a Programmatic Environmental Impact Statement (PEIS) that details the manufacture, transport and waste disposition associated with plutonium pit production in Aiken, S.C. and Los Alamos, N.M.

Plutonium pits are the core of a thermonuclear weapon (hydrogen bomb). Tens of thousands of pits were manufactured during the Cold War, with production concentrated almost entirely at Rocky Flats, Colorado, near Denver. The FBI raided Rocky Flats in 1989, after numerous fires, accidental plutonium releases, and whistleblower reports of dangerous working conditions at the plant. Rockwell International, the general contractor at Rocky Flats, settled criminal charges of environmental violations for $18.5 million (less than the bonuses it received from the government) and closed the plant in 1991. Rocky Flats was declared a Superfund site and, after costly remediation, was converted into a national wildlife sanctuary. Some of the most polluted sections of Rocky Flats remain radioactive and will be sequestered forever. Communities near Rocky Flats received $375 million in compensation for increased incidence of cancer. The U.S. has manufactured very few plutonium pits since Rocky Flats closed.

Congress mandated renewed production of plutonium pits in 2015 with funding from the Defense Authorization Act. Lawmakers required the manufacture of 30 pits per year by this year (2026) and 80 pits per year by 2030, an entirely fanciful schedule. During the Cold War, Savannah River Site produced plutonium but never pits, and Los Alamos National Laboratory (LANL) produces up to ten pits per year for research purposes but has never approached the Congressionally mandated 30 pits per year. Due to frequent accidents and safety violations, LANL has in some years produced zero pits.

NNSA’s Programmatic Environmental Impact Statement describes the intricate sequence for producing new pits for new nuclear weapons. Around 12,000 existing plutonium pits are stored at the Pantex facility in Amarillo, TX, and will be driven in specialized semi-trucks across the country on public highways to LANL and SRS. Once secured at these facilities, any oxidized impurities from aging will be removed using hot sulfuric acid and other agents. The pits are then melted, molded into spheres and machined to extremely precise dimensions. Large volumes of transuranic waste are produced in the pit production process, and tons of it will be transported over public highways to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, N.M. Radioactive waste from SRS will pass through Atlanta and follow I-20 and I-10 to the WIPP facility.

WIPP is the only facility designed to accept and store transuranic waste from nuclear weapons production. However, the New Mexico Environment Department (NMED) has only permitted WIPP to accept “legacy” transuranic waste from LANL, originating from the Manhattan Project, 1942-45, and has not yet agreed to permit increased volumes of waste at WIPP. Plutonium waste could be stored on site at Los Alamos and Savannah River, though this would generate an entirely new set of environmental problems.

SRS, slated to fabricate 50 pits per year, has never made a plutonium pit. The new pit production buildings in South Carolina “repurpose” a defunct mixed oxide (MOX) fuel plant, which was designed to downblend plutonium from nuclear weapons decommissioned under the agreement between the U.S. and Russia to reduce their nearly 100 tons of surplus weapons-grade plutonium. While the Russians constructed and operated their MOX plant, the MOX plant at Savannah River experienced massive cost overruns and decades of delays. Putin suspended the agreement in 2016, blaming non-compliance on the part of the U.S.

The abandoned MOX plant at Savannah River, 32 years behind schedule and $10 billion over budget, is 70% complete. Its conversion to the Savannah River Plutonium Pit Facility is already years behind schedule and billions of dollars over budget. Scheduled to open this year, it is now slated to make its first pit in 2035. Savannah River Site remains one of the most polluted places in the U.S. and sits near the top of the EPA’s list of hazardous sites.

Robert Oppenheimer selected Los Alamos for the design and construction of the first fission atomic bombs, dropped on Hiroshima and Nagasaki. In the decades since, LANL’s research and development of plutonium pits has created massive transuranic waste dumps on site. Plutonium has leaked into groundwater and crossed canyons, contaminating native communities like the adjacent San Ildefonso and more distant pueblos. Plutonium is one of the most carcinogenic materials on Earth, and plutonium-239 has a half-life of about 24,000 years.

LANL has never produced 30 pits per year, as mandated by Congress. Between 2007 and 2011, LANL produced 31 pits in total. Selected for its isolation and inaccessibility, LANL has chronic difficulties recruiting and retaining workers. LANL has experienced serious fires and accidents, and has been fined $16 million by the New Mexico Environment Department for neglecting the “legacy” wastes stored on site.

Whether the plutonium pit production, costing tens of billions of dollars, is even necessary, though required by Congressional statute, is contentious. NNSA’s own studies indicate that the thousands of pits stored at Pantex are viable for at least another 100 years. One study by Lawrence Livermore National Laboratory found the pits in the strategic security stockpile would be reliable for 150 years. Other classified studies about the dependability of existing plutonium pits could demonstrate the same result, and should be released.

The new plutonium pits proposed in NNSA’s Environmental Impact Statement are designed for entirely new thermonuclear weapons. The W87-1 warhead will arm the new Sentinel missile system, replacing the aging fleet of Minuteman III intercontinental missiles. The Sentinel program is years behind schedule and hundreds of billions of dollars over budget. Cost estimates for the 50 years of Sentinel deployment are over $300 billion.

Ironically, while the NEPA plutonium pit program is being presented to the public this week, the Eleventh Review of the Non-Proliferation Treaty (NPT) is ongoing at the United Nations in New York. The NPT entered into force in 1970 and has been ratified by 191 countries, including the U.S. The NPT is the only remaining international nuclear treaty. It affirms the right of countries to develop nuclear power reactors for peaceful purposes, and stipulates that nuclear-armed states are obligated to reduce and eventually eliminate their nuclear weapons arsenals.

NNSA’s Draft PEIS describes new plutonium pit production as “consistent with the NPT while maintaining nuclear weapons competencies and capabilities at the weapons laboratories” (p. 1-6). That determination is glaringly refuted by the International Court of Justice’s 1996 finding that signatories to the NPT must adhere to:

“The legal import of [the NPT Article VI] obligation… goes beyond that of a mere obligation of conduct; the obligation involved here is an obligation to achieve a precise result – nuclear disarmament in all its aspects – by adopting a particular course of conduct, namely, the pursuit of negotiations on the matter in good faith.” [Para. 99]

NNSA violated NEPA’s requirement to address the environmental damage of federally funded projects, and the public now has an opportunity to submit comments to the NNSA until July. In particular, per the ICJ’s decision, plutonium pit fabrication for new nuclear weapons contravenes the Non-Proliferation Treaty, despite what the draft PEIS asserts.

Submit comments by email to NEPA-SRS@srs.gov