Friday, August 18, 2023

 

Q&A: Less plastic entering the ocean than previously believed, but it's not all good news

Credit: Unsplash/CC0 Public Domain

It seems that only about a tenth as much plastic enters the oceans as previously thought. This is the result of a recent analysis by Mikael Kaandorp, who joined Forschungszentrum Jülich in November last year from Utrecht University in the Netherlands. The bad news is that the plastic lingers far longer and that there is currently about 10 times more plastic waste floating in the sea than earlier estimates suggested. The study, published in the journal Nature Geoscience, helps to solve the mystery of the "missing plastic" in the oceans and has received worldwide attention.

For years, researchers were faced with the question of where all the plastic that, according to earlier estimates, should have been ending up in the oceans had gone. In the following interview, data scientist Mikael Kaandorp from the Jülich Institute of Bio- and Geosciences (IBG-3) explains how to interpret the new figures and describes the research questions he is currently pursuing.

Dr. Mikael Kaandorp, what was the motivation for this study?

In 2014 and 2015, there were widely cited studies concluding that about 8 million metric tons of plastics enter the ocean every year. But researchers could only find about 250 thousand tons at the ocean surface. There is a huge mismatch. As an analogy, you can think of a bank account. If you earn 8 million every year but only have 250 thousand in your account, you might wonder where the rest of the money is ending up. So, the question was: Where is all the plastic? This question was also the focus of my Ph.D. project at Utrecht University.

For my recent paper, we created a large, complex computer model using data assimilation techniques. This means we included as many measurements in the model as possible. We integrated, for example, measurements of plastic concentrations at the ocean surface, from deeper layers of the ocean, and from beach cleanups. There are a lot of measurements: we found 20,000 of them in the scientific literature.
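To illustrate the general principle of data assimilation mentioned here (combining a model's prior estimate with noisy measurements, weighted by their uncertainties), here is a minimal, hypothetical Python sketch. It is not the model used in the study, and all numbers are invented placeholders.

```python
# Minimal sketch of the general idea behind data assimilation: combine a
# model's prior guess of an unknown quantity with noisy observations,
# weighting each by its uncertainty. This is NOT the authors' model --
# their study assimilates ~20,000 measurements into a full ocean
# transport model; the numbers below are placeholders for illustration.
import numpy as np

def assimilate(prior, prior_var, observations, obs_var):
    """Precision-weighted (least-squares) combination of a prior estimate
    with independent observations of the same quantity."""
    precisions = np.concatenate(([1.0 / prior_var], 1.0 / np.asarray(obs_var)))
    values = np.concatenate(([prior], np.asarray(observations)))
    posterior = np.sum(precisions * values) / np.sum(precisions)
    posterior_var = 1.0 / np.sum(precisions)
    return posterior, posterior_var

# Hypothetical example: a prior estimate of floating plastic mass (in Mt)
# updated with three noisy "measurements".
post, post_var = assimilate(prior=1.0, prior_var=0.5**2,
                            observations=[1.8, 2.2, 2.1],
                            obs_var=[0.3**2, 0.4**2, 0.5**2])
print(f"posterior estimate: {post:.2f} Mt (std {post_var**0.5:.2f})")
```

The real model works on spatially resolved fields rather than a single number, but the weighting-by-uncertainty logic is the same basic idea.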

With the help of these measurements, we tried to estimate how much plastic is going into the ocean and where it ends up. We came up with figures that are very different from those in the 2014 and 2015 studies. According to our model, only about half a million tons of plastic enters the oceans every year instead of 8 million. And we think that there is more plastic at the surface: 2 million tons instead of the 250 thousand tons estimated in 2015.

Are the new numbers good or bad news?

It's kind of both. On the one hand, it is good that the waste seems to be concentrated in larger pieces. These are easier to pick up, especially when they wash up on the beach. And, of course, it is good news that there seems to be much less plastic entering the oceans every year. But on the downside, we see that the plastic has a very long lifespan in the ocean.

We know this because our model gives quite a consistent picture of the mass balance. If you know how much plastic is entering the ocean every year and how much is currently floating in it, you can estimate how long it will stay there.

They stay there for decades. This means that the total amount is increasing, by about 4% per year according to our estimates. In theory, the amount of plastic could double in 20 years. So, this is bad news.
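As a quick sanity check on the doubling claim, assuming a steady growth rate of roughly 4% per year (as implied by the doubling estimate), the arithmetic looks like this:

```python
# Back-of-the-envelope check: if the floating plastic stock grows at a
# steady ~4% per year, how long until it doubles? (Illustrative only;
# the paper's residence-time estimates come from its full mass budget.)
import math

growth_rate = 0.04                              # assumed annual growth
doubling_time = math.log(2) / math.log(1 + growth_rate)
print(f"Doubling time at {growth_rate:.0%}/yr: ~{doubling_time:.0f} years")
# -> roughly 18 years, consistent with "could double in 20 years"
```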

What are the differences from previous studies?

Previous studies focused mainly on microplastics, because there are a thousand times more of them in the ocean than larger objects. This means that in measurements you mainly find these microplastics and miss the larger ones. I also had the opportunity to see for myself how these measurements are made. For my Ph.D., I went on a scientific cruise to the Azores in the Atlantic. The aim of the trip was to get real-life experience of how these measurements are taken and to get a better sense of their limitations. It was really interesting to see how it works.

A crucial point of our study was the finding that larger items exceeding 2.5 cm in size make up the bulk of the mass, about 95 percent, while small microplastics are by far the most numerous in the ocean. This has likely been overlooked in previous studies.
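A toy calculation shows how both statements can hold at once. With a hypothetical size distribution (numbers invented for illustration, not taken from the study), the smallest particles dominate the count while the few large items dominate the mass, because mass grows roughly with the cube of particle size.

```python
# Toy illustration (hypothetical numbers, not from the study) of how small
# microplastics can dominate the particle COUNT while a few large items
# dominate the MASS: mass scales roughly with size cubed.
sizes_cm = [0.1, 1.0, 10.0]          # representative particle sizes
counts   = [1_000_000, 1_000, 10]    # many tiny pieces, few large ones
masses   = [n * (s ** 3) for n, s in zip(counts, sizes_cm)]  # relative mass ~ size^3

total_count, total_mass = sum(counts), sum(masses)
for s, n, m in zip(sizes_cm, counts, masses):
    print(f"{s:>5.1f} cm: {n / total_count:6.1%} of count, {m / total_mass:6.1%} of mass")
```

In this made-up example the 10 cm items are only a few parts per million of the count but carry most of the mass, which is the qualitative pattern the study describes.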

But even now, there are many open questions. For example, it is not clear how fast the plastics sink in the ocean. This is a super complex problem, because a layer of algae can grow on the plastics, making them heavier than seawater and causing the pieces to sink. Different regions of the ocean contain different species of algae, so this process might vary from place to place. Fragmentation is also still quite uncertain. It is not clear how quickly larger pieces break down into smaller pieces. There are only some basic estimates, and this rate might vary for different types of plastics, which all have different properties.

What problems are you currently working on?

In Jülich I am now dealing with completely different issues. I am working on land surface and land subsurface modelling, which is part of the collaborative research project DETECT. I am involved in the data assimilation, where we combine numerical models with observational data. The methodology is quite similar to that of my previous project. The aim is to estimate how humans—through decades of land-use change and intensified agriculture and water management—have caused lasting changes in the coupled water and energy cycles of land and atmosphere, and to what extent this human activity is influencing the changing climate.

More information: Estimates of global marine plastic mass demystify the missing plastic paradox, Nature Geoscience (2023). DOI: 10.1038/s41561-023-01220-4

Mikael L. A. Kaandorp et al, Global mass of buoyant marine plastics dominated by large long-lived debris, Nature Geoscience (2023). DOI: 10.1038/s41561-023-01216-0

Harmful environmental impact of microplastics adsorbing Zinc Oxide from sunscreens and microbeads from cleansers indicated


Results confirm that mixtures of Zn-aggregates/micro-polymers were naturally leached/released from the commercial products, revealing worrying environmental implications for fish and other aquatic organisms in the food chain.


Peer-Reviewed Publication

DIAMOND LIGHT SOURCE

Fig. 1: Scanning electron microscopy (SEM) analysis of the ZnO nanomaterials stabilised in tap water for 7 days and further incubated with microplastics for 24 hours. The left SEM image is a low-magnification view, with the middle image zooming in on the region subsequently quantified. The Zn signal from energy dispersive X-ray spectroscopy (EDX) is displayed at the right after applying a different colour map and adjusting the histogram accordingly.

Credit: all images are adapted from the published paper at https://onlinelibrary.wiley.com/doi/full/10.1002/gch2.202300036




A new study by a research team from Diamond Light Source looks at how microplastic waste may interact with zinc oxide (ZnO) nanomaterials in freshwater and seawater scenarios. It also evaluated a ZnO-based sunscreen and an exfoliating cleanser containing microbeads under the same conditions. The results confirm that mixtures of Zn-aggregates/micro-polymers were naturally leached/released from the commercial products, revealing worrying environmental implications for fish and other aquatic organisms in the food chain, which could swallow these microplastics and ingest zinc particles at the same time.

Called “Toward understanding the environmental risks of combined microplastics/nanomaterials exposures: Unveiling ZnO transformations after adsorption onto polystyrene microplastics in environmental solutions”, the work was published on 5 July in Global Challenges (https://doi.org/10.1002/gch2.202300036). The team from the UK’s national synchrotron included a student, Tatiana Da-Silva Ferreira, who was at Edinburgh University and joined Diamond through its 12-week 'Summer Placement' scheme. The scheme allows undergraduate students studying for a degree in Science, Engineering, Computing or Mathematics (who expect to gain a first or upper-second class honours degree) to gain experience working in a number of different teams at Diamond. Lead author Miguel Gomez Gonzalez, a Diamond Beamline Scientist, praised Tatiana, now studying for a PhD in Switzerland, for her key contribution to the start of this environmental project.

Explaining the impetus for the research, Miguel said that in recent decades there has been a dramatic increase in the manufacture of engineered nanomaterials (tiny particles about 1,000 times thinner than a human hair), which has inevitably led to their environmental release. Zinc oxide (ZnO) is among the most abundant nanomaterials fabricated, owing to its use in electronics, semiconductors and antibacterial applications. At the same time, plastic waste has become ubiquitous and can break down into smaller pieces called microplastics. These are also tiny, but roughly 100 times bigger than the nanomaterials. Because both materials are being disposed of in growing quantities, the team decided to study their fate when they combine in freshwater and the ocean, and to help make environmental risk assessments more accurate.

To make their study more relevant to the real world, the team tested a sunscreen containing zinc oxide, which is commonly used to block UV radiation. They let the sunscreen incubate in the different environmental solutions for a week and then added the microplastics for a day. The objective was to check whether the zinc oxide could come out of the sunscreen and stick to these microplastics. They followed the same procedure with a facial scrub containing tiny plastic beads. The results clearly showed that the zinc oxide (either pure or leached from the sunscreen) did stick to the microplastics in both cases (Fig. 2), revealing that this could potentially happen in our rivers and oceans too.

 

Miguel comments: “The ability of zinc oxide, both pure nanomaterials and those released from a sunscreen, to stick to very small pieces of plastic has big implications. These plastics can even come from everyday items like exfoliating facial cleansers. In this study, we found that the microplastics can carry even smaller particles of zinc from place to place. As a consequence, fish or other aquatic organisms could swallow these microplastics, ingesting zinc particles at the same time.

“We need to understand how this engineered zinc oxide changes when it gets into freshwaters and how much of it can stick to small plastic wastes. This is important for making everyone aware, from people who make these products to those who regulate them, about the potential harm they could do to our environment. Better rules for managing waste are needed, especially related to tiny particles like these. As we continue to produce more and more of these micro- and nanoparticles, their effect on our environment is going to keep growing. Because they are so long-lasting, they can pose a risk to different organisms, and ultimately even make their way into our food. This is something we simply cannot afford to ignore.”

Talking about the contribution of 2021 Summer Placement student Tatiana, Miguel highlighted the huge opportunities afforded to students by Diamond's study programmes. “Tatiana did a great job in optimising the conditions for the 7-day stabilisation of nanomaterials, followed by the 24-hour incubation of microplastics and nanomaterials. In addition, she improved the filtering protocol and the isolation of the microplastics after the incubation period. Likewise, she performed the preliminary scanning electron microscopy analysis, which revealed nanomaterial adsorption onto the plastic surfaces. Her contribution was therefore key to the overall success of this environmentally relevant project,” he added. She thanked Miguel and Diamond, saying: “This experience really deepened my interest in environmental chemistry and academic research. It also gave me enough background and confidence to pursue my Masters and now my PhD. I’m really happy I got to work on such an interesting project, and even happier you chose to look deeper into it.”

The team took pure zinc oxide particles (ranging from 80 to 200 nm in size) and incubated them in different kinds of environmental solutions for a week, allowing their natural stabilisation. They then mixed them with small polystyrene microspheres (~900 µm in diameter, about the size of a grain of sand) and stirred them together for a day. After washing and rinsing the microplastics, they found that the zinc oxide had adsorbed onto the plastic surfaces. This was seen by scanning electron microscopy, using a very powerful microscope (Fig. 1). It confirmed that microplastics and zinc oxide can interact in our water bodies, which might affect how they impact the environment.

The team then examined these zinc oxide-covered microplastics using X-rays generated at Diamond Light Source, the UK's national synchrotron facility. Diamond’s I14 beamline can focus the X-rays down to a nanometre-scale spot, making it one of the best instruments in the world for this kind of detailed work. Fast scanning of the samples through the nanometric X-ray beam enabled the fluorescence detector to capture detailed maps of each element contained in the samples. Alongside this work, another X-ray technique, X-ray absorption near-edge structure spectroscopy (XANES), was applied to check what kind of chemical changes had happened to the zinc oxide when adsorbing to the microplastics and after a week’s incubation in freshwaters.

Miguel adds: “We found that the zinc oxide had transformed into different types of zinc-related particles. Some of these new particles (Zn-sulfide) formed quickly, while others formed more slowly but were more stable (Zn-phosphate) (Fig. 3). This reveals valuable information about how zinc oxide behaves in the environment.”
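For readers unfamiliar with the linear combination fitting referred to in Fig. 3, the analysis works roughly like the following sketch: a measured XANES spectrum is modelled as a non-negative weighted sum of reference spectra from known Zn compounds. The spectra below are synthetic placeholders, not data from this study.

```python
# Minimal sketch of linear combination fitting (LCF) as used in XANES
# analysis: express a measured spectrum as a weighted sum of reference
# ("standard") spectra and solve for non-negative weights. The spectra
# below are synthetic placeholders, not data from the Diamond study.
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(9650, 9700, 200)           # eV, around the Zn K-edge

def fake_standard(center, width):
    """Synthetic stand-in for a normalised reference XANES spectrum."""
    return 1.0 / (1.0 + np.exp(-(energy - center) / width))

zn_sulfide   = fake_standard(9662, 1.5)
zn_phosphate = fake_standard(9666, 2.0)
standards = np.column_stack([zn_sulfide, zn_phosphate])

# "Measured" spectrum: a 60/40 mixture plus noise (for demonstration only).
measured = 0.6 * zn_sulfide + 0.4 * zn_phosphate + np.random.normal(0, 0.01, energy.size)

weights, residual = nnls(standards, measured)   # non-negative least squares
weights /= weights.sum()                        # normalise to fractions
print(f"Fitted fractions: Zn-sulfide {weights[0]:.2f}, Zn-phosphate {weights[1]:.2f}")
```

In practice the standards are measured reference compounds and the fit quality is judged alongside the principal component and cluster analyses mentioned in the caption.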


Fig. 2: X-ray fluorescence (XRF, left) maps (100 nm pixel size) for the Zn signal and differential phase contrast (DPC, right) images for morphological inspection of the adsorbed structures, measured at the hard X-ray nanoprobe (I14 beamline). Zn-particles from the sunscreen were deposited on the pristine microplastics after incubation in seawater (top row), while ZnO nanomaterials were deposited on the microplastics leached from the exfoliating cleanser, likewise after incubation in seawater (bottom row).

Fig. 3: Principal component analysis (left) and cluster analysis (middle) performed on the ZnO nanomaterials adsorbed onto the microplastic surfaces after incubation in seawater. The averaged XANES spectra from the violet cluster (1) and the red cluster (2) were subsequently analysed by linear combination fitting (red-dashed line) against well-known standards to interrogate the Zn speciation, revealing a mixture of Zn-sulfide and Zn-phosphate species.

Credit: all images are adapted from the published paper at https://onlinelibrary.wiley.com/doi/full/10.1002/gch2.202300036

Our plastic waste can be used as raw material for detergents, thanks to an improved catalytic method

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - SANTA BARBARA



(Santa Barbara, Calif.) — We’ve managed to accumulate so much plastic trash that it’s daunting to think about what could be done with the tons upon tons of nonbiodegradable waste. And as much as we are trying to scale back our dependence on single-use plastics, we continue to add to the global plastic trash hoard. Events like the COVID-19 pandemic only served to expand their use for personal protective equipment and disposable and take-away packaging.

But, for researchers at UC Santa Barbara, one person’s single-use packaging is another person’s useful raw material. In a paper published in the journal Chem, they have reimagined the value of single-use plastics, with improvements to an innovative process that can turn polyolefins, the most common type of polymer in single-use packaging, into valuable alkylaromatics — molecules that underlie surfactants, the main components of detergents and other useful chemicals.

“If we make these surfactants from fossil fuels now and you could make them from waste plastics, then you are not using fossil fuels to make surfactants anymore, and you’re getting another use out of the carbon that went into the plastics,” said chemical engineering professor Susannah Scott, who holds UCSB’s Mellichamp Chair in Sustainable Catalytic Processing. Instead of  burning them or burying them in landfills — practices that represent the major ways we currently deal with plastic waste — plastics are repurposed in a method that shortcuts conventional “dirty” processes for making surfactants while giving single-use plastics one more shot at usefulness.

The researchers built on previous work in which they debuted a catalytic method to break the strong carbon-carbon bonds that make plastic the difficult-to-degrade material it is, then rearrange the molecular chains into alkylaromatic rings. While effective, Scott said, the original process, based on a platinum-on-alumina catalyst, was slow, and its yield of alkylaromatic molecules was low. “What we’ve done in this paper is show how to do it much better,” she said.

Key to their method is increasing the acidity of the original alumina catalyst, via the addition of chlorine or fluorine. With the added acid sites, the team was able to boost the speed and selectivity of their process.

“It just screams along,” Scott said. “It makes the alkylaromatics faster, and we can tune it to make the right-size molecules.” In the new paper, they focused on finding the optimal ratio of acid sites to metal sites in their catalyst, she explained. “It turns out they work together. They have different roles, but you need both of them to be there and in the right ratio so the catalytic cycle doesn’t get stuck at any point.”

In addition, their one-pot process operates at moderate temperatures, requiring a low energy input. While the method originally took 24 hours to turn plastic into alkylaromatic molecules, the improved process can complete the task within a couple of hours, increasing the amount of plastic that can be converted in a reasonably sized reactor.

With further improvements, this method could be on its way to becoming a viable commercial process, according to Scott. The ultimate goal is to bring it into wide use, which would enable and incentivize the recovery of single-use plastics. Using waste plastics as a highly abundant raw material, chemical companies could take the alkylaromatic molecules resulting from this process and transform them into the surfactants that are formulated into soaps, washing liquids, cleansers and other detergents.

“Ideally you want to reuse waste plastic for a purpose with a large enough production volume, for which there is significant demand, in order to make a dent in the plastic problem,” Scott explained. To determine if this method is truly sustainable, she added, it would have to undergo a lifecycle assessment, in which the energy spent and greenhouse gasses emitted are calculated at each step. Using waste material ensures that no additional greenhouse gas emissions are produced to create the feedstock, but the energy required to run the catalytic process and separate the desired molecules would have to be factored in before scaling up, Scott said. If it passes muster, the method could displace the more fossil fuel-intensive processes that go into creating surfactants from scratch.

“We will need multiple targets to deal with the waste plastic problem, but this is a fairly big one,” Scott said. “This is worth doing.”

“Oil-eating” microbes reshape oil droplets to accelerate hydrocarbon biodegradation


Peer-Reviewed Publication

AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS)




“Oil-eating” Alcanivorax borkumensis bacteria form “dendritic” biofilms that reshape oil droplets to speed up the rate of consumption, researchers report. The findings reveal how this particular bacterium optimizes oil biodegradation and consumption. Obligately hydrocarbonoclastic bacteria (OHCB) survive by consuming hydrocarbons as a sole carbon and energy source. These marine bacteria are known to play an integral role in the bioremediation of spilled petroleum worldwide. A. borkumensis (or Alca), an aerobic, rod-shaped OHCB that feasts on organic acids and alkanes, blooms during oil spills to exploit the hydrocarbons contained within crude oil. During the consumption of alkanes, Alca forms biofilms around oil droplets. However, the role biofilm formation plays in the degradation and consumption of hydrocarbons by this bacterium has remained unclear.

To address these questions, Manoj Prasad and colleagues developed a microfluidic device that allows the trapping and real-time imaging of bacteria-covered oil droplets, enabling the authors to observe the full dynamics of biofilm development, from the initial colonization of a surface to the complete consumption of oil droplets. Prasad et al. discovered a shift in biofilm morphology that depended on adaptations to oil consumption. In cultures new to oil exposure, Alca forms a thick spherical biofilm that grows outward from the oil, and the oil droplet remains mostly spherical as it is consumed. However, cultures with longer exposure to oil formed a thin biofilm with finger-like protrusions the authors call “dendritic” biofilms. According to the findings, dendritic biofilms alter the oil-water interfacial tension and buckle and reshape the oil droplets as bacterial cells proliferate. This increases the surface area of a droplet, optimizing and accelerating consumption by the ever-growing population of bacteria. Thus, Alca's oil consumption efficiency is not achieved through an increase in the individual cells' ability to metabolize oil but rather by expanding the droplet's interface, allowing more Alca cells to feed simultaneously.

“Alca alone cannot degrade the thousands of hydrocarbons in crude oil. This requires a diverse community of microbes, interacting with each other and sometimes competing,” write Terry McGenity and Pierre Laissue in a related Perspective. “Translating from single-species micro-scale interactions with surfaces to macro-scaled multi-species processes will improve our understanding of the mechanisms that drive biodegradation and transport of oil spilled into the oceans.”

To improve solar and other clean energy tech, look beyond hardware


A new study finds system deployment processes have been slow to improve over time — but must be addressed to lower clean energy costs in the future.


Peer-Reviewed Publication

MASSACHUSETTS INSTITUTE OF TECHNOLOGY




To continue reducing the costs of solar energy and other clean energy technologies, scientists and engineers will likely need to focus, at least in part, on improving technology features that are not based on hardware, according to MIT researchers. They describe this finding and the mechanisms behind it in Nature Energy.

While the cost of installing a solar energy system has dropped by more than 99 percent since 1980, this new analysis shows that “soft technology” features, such as the codified permitting practices, supply chain management techniques, and system design processes that go into deploying a solar energy plant, contributed only 10 to 15 percent of total cost declines. Improvements to hardware features were responsible for the lion’s share. 

But because soft technology is increasingly dominating the total costs of installing solar energy systems, this trend threatens to slow future cost savings and hamper the global transition to clean energy, says the study’s senior author, Jessika Trancik, a professor in MIT’s Institute for Data, Systems, and Society (IDSS).

Trancik’s co-authors include lead author Magdalena M. Klemun, a former IDSS graduate student and postdoc who is now an assistant professor at the Hong Kong University of Science and Technology; Goksin Kavlak, a former IDSS graduate student and postdoc who is now an associate at the Brattle Group; and James McNerney, a former IDSS postdoc and now senior research fellow at the Harvard Kennedy School. 

The team created a quantitative model to analyze the cost evolution of solar energy systems, which captures the contributions of both hardware technology features and soft technology features. 

The framework shows that soft technology hasn’t improved much over time — and that soft technology features contributed even less to overall cost declines than previously estimated. 

Their findings indicate that to reverse this trend and accelerate cost declines, engineers could look at making solar energy systems less reliant on soft technology to begin with, or they could tackle the problem directly by improving inefficient deployment processes.  

“Really understanding where the efficiencies and inefficiencies are, and how to address those inefficiencies, is critical in supporting the clean energy transition. We are making huge investments of public dollars into this, and soft technology is going to be absolutely essential to making those funds count,” says Trancik.

“However,” Klemun adds, “we haven’t been thinking about soft technology design as systematically as we have for hardware. That needs to change.”

The hard truth about soft costs

Researchers have observed that the so-called “soft costs” of building a solar power plant — the costs of designing and installing the plant — are becoming a much larger share of total costs. In fact, the share of soft costs now typically ranges from 35 to 64 percent.

“We wanted to take a closer look at where these soft costs were coming from and why they weren’t coming down over time as quickly as the hardware costs,” Trancik says.

In the past, scientists have modeled the change in solar energy costs by dividing total costs into additive components — hardware components and nonhardware components — and then tracking how these components changed over time. 

“But if you really want to understand where those rates of change are coming from, you need to go one level deeper to look at the technology features. Then things split out differently,” Trancik says.

The researchers developed a quantitative approach that models the change in solar energy costs over time by assigning contributions to the individual technology features, including both hardware features and soft technology features.

For instance, their framework would capture how much of the decline in system installation costs — a soft cost — is due to standardized practices of certified installers — a soft technology feature. It would also capture how that same soft cost is affected by increased photovoltaic module efficiency — a hardware technology feature. 
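The following stylised sketch, with invented numbers and a deliberately simplified cost model rather than the authors' actual framework, shows what attributing a cost change to individual technology features can look like in practice:

```python
# Stylised sketch of feature-level cost attribution (made-up numbers and a
# deliberately simplified cost model -- NOT the MIT team's framework).
# Installation cost per watt is modelled as labour time divided by module
# efficiency: better hardware (efficiency) and better soft technology
# (installation practices) both reduce this soft cost component.
def install_cost(labour_hours_per_m2, wage_per_hour, efficiency_w_per_m2):
    return labour_hours_per_m2 * wage_per_hour / efficiency_w_per_m2

old = dict(labour_hours_per_m2=2.0, wage_per_hour=30.0, efficiency_w_per_m2=100.0)
new = dict(labour_hours_per_m2=1.6, wage_per_hour=30.0, efficiency_w_per_m2=200.0)

total_change = install_cost(**new) - install_cost(**old)

# Attribute the change to each feature by varying one feature at a time from
# its old to its new value (a simple one-at-a-time decomposition; because of
# interactions, the individual contributions need not sum exactly to the total).
for feature in old:
    varied = dict(old, **{feature: new[feature]})
    contribution = install_cost(**varied) - install_cost(**old)
    print(f"{feature:>22}: {contribution:+.3f} $/W")
print(f"{'total change':>22}: {total_change:+.3f} $/W")
```

In this toy example the hardware feature (module efficiency) accounts for most of the drop in the "soft" installation cost, which is the kind of cross-dependence the researchers' framework is designed to expose.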

With this approach, the researchers saw that improvements in hardware had the greatest impacts on driving down soft costs in solar energy systems. For example, the efficiency of photovoltaic modules doubled between 1980 and 2017, reducing overall system costs by 17 percent. But about 40 percent of that overall decline could be attributed to reductions in soft costs tied to improved module efficiency.

The framework shows that, while hardware technology features tend to improve many cost components, soft technology features affect only a few. 

“You can see this structural difference even before you collect data on how the technologies have changed over time. That’s why mapping out a technology’s network of cost dependencies is a useful first step to identify levers of change, for solar PV and for other technologies as well,” Klemun notes.  

Static soft technology

The researchers used their model to study several countries, since soft costs can vary widely around the world. For instance, solar energy soft costs in Germany are about 50 percent less than those in the U.S.

The fact that hardware technology improvements are often shared globally led to dramatic declines in costs over the past few decades across locations, the analysis showed. Soft technology innovations typically aren’t shared across borders. Moreover, the team found that countries with better soft technology performance 20 years ago still have better performance today, while those with worse performance didn’t see much improvement.

This country-by-country difference could be driven by regulation and permitting processes, cultural factors, or by market dynamics such as how firms interact with each other, Trancik says.

“But not all soft technology variables are ones that you would want to change in a cost-reducing direction, like lower wages. So, there are other considerations, beyond just bringing the cost of the technology down, that we need to think about when interpreting these results,” she says.

Their analysis points to two strategies for reducing soft costs. For one, scientists could focus on developing hardware improvements that make soft costs more dependent on hardware technology variables and less on soft technology variables, such as by creating simpler, more standardized equipment that could reduce on-site installation time.

Or researchers could directly target soft technology features without changing hardware, perhaps by creating more efficient workflows for system installation or automated permitting platforms.

“In practice, engineers will often pursue both approaches, but separating the two in a formal model makes it easier to target innovation efforts by leveraging specific relationships between technology characteristics and costs,” Klemun says.

“Often, when we think about information processing, we are leaving out processes that still happen in a very low-tech way through people communicating with one another. But it is just as important to think about that as a technology as it is to design fancy software,” Trancik notes.

In the future, she and her collaborators want to apply their quantitative model to study the soft costs related to other technologies, such as electric vehicle charging and nuclear fission. They are also interested in better understanding the limits of soft technology improvement, and how one could design better soft technology from the outset.

This research is funded by the U.S. Department of Energy Solar Energy Technologies Office.

###

Written by Adam Zewe, MIT News

 

Is data justice the key to climate justice?


Biased artificial intelligence needs human help to avoid harmful climate action, Cambridge researchers said

Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE




Bias in the collection of data on which Artificial Intelligence (AI) computer programmes depend can limit the usefulness of this rapidly growing tool for climate scientists predicting future scenarios and guiding global action, according to a new paper by researchers at the University of Cambridge published in Nature’s npj Climate Action series.

AI computer programmes used for climate science are trained to trawl through complex datasets looking for patterns and insightful information. However, missing information from certain locations on the planet, time periods or societal dynamics creates “holes” in the data that can lead to unreliable climate predictions and misleading conclusions.

Primary author and Cambridge Zero Fellow Dr Ramit Debnath said that individuals with access to technology, such as scientists, teachers, professionals and businesses in the Global North, are more likely to see their climate priorities and perceptions reflected in the digital information widely available for AI use.

By contrast, those without the same access to technology, such as indigenous communities in the Global South, are more likely to find their experiences, perceptions and priorities missing from those same digital sources.

Debnath said: “When the information on climate change is over-represented by the work of well-educated individuals at high-ranking institutions within the Global North, AI will only see climate change and climate solutions through their eyes.”

“Biased” AI has the potential to misrepresent climate information.

For example, it could generate ineffective weather predictions or underestimate carbon emissions from certain industries, which could then misguide governments trying to create policy and regulations aimed at mitigating or adapting to climate change. 

AI-supported climate solutions which spring from biased data are in danger of harming under-represented communities, particularly those in the Global South with scant resources. These are often the same communities who also find themselves most vulnerable to the extreme weather events caused by climate change such as floods, fires, heatwaves and drought.

That is a combination which could lead to “societal tipping events”, the paper warns. 

However, these “data holes” can be filled with human knowledge. The authors advocate a human-in-the-loop design that gives AI climate change programmes a sense check on which data are used and the context in which they are used, in an effort to improve the accuracy of predictions and the usefulness of any conclusions.

The authors mention popular AI chatbot model ChatGPT, which has recently taken the world by storm for its ability to communicate conversationally with human users. On ChatGPT, the AI can ask its human users follow-up questions, admit mistakes, challenge incorrect premises and reject inappropriate requests.

This ‘human-in-the-loop’ style AI allows bias to be noticed and corrected, the authors said. Users can input critical social information, such as existing infrastructure and market systems, to allow the AI to better anticipate any unintended socio-political and economic consequences of climate action. 

Co-author, Cambridge Zero Director and climate scientist Professor Emily Shuckburgh said: “No data is clean or without prejudice, and this is particularly problematic for AI which relies entirely on digital information.”

In highlighting the importance of globally inclusive datasets, the paper also promotes broadband internet access as a public necessity, rather than a private commodity, to engage as many users as possible in the design of AI for contemporary conversations about climate action.

The paper concludes that human-guided technology remains instrumental in the development of socially responsible AI.

Less-biased AI will be critical to our understanding of how the climate is changing, and consequently in guiding realistic solutions to mitigate and adapt to the on-going climate crisis, the authors said.

Professor Shuckburgh, who also leads the UK national research funding body’s (UKRI) Centre for Doctoral Training on the Application of AI to the study of Environmental Risks (AI4ER), said: “Only with an active awareness of this data injustice can we begin to tackle it, and consequently, to build better and more trustworthy AI-led climate solutions.”

Cost to protect globally important forests falls disproportionately on those living closest


Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE

Image: A wood-frame house. People living near the Eastern Arc Mountains use poles collected from the forests to build their homes. Credit: Marije Schaafsma




Local communities are not incentivised to protect tropical forests that are hugely valuable for global climate regulation, a new study has found. International funding could help smallholder farmers to boost yields on their existing land to incentivise long-term forest protection.

Biodiversity conservation delivers enormous global economic benefits, but net benefits vary widely for different groups of people - with international stakeholders gaining most, and local rural communities bearing substantial costs.

These are the findings of a decade-long study into the costs and benefits of conserving the forests and woodlands of Tanzania’s Eastern Arc Mountains - a global biodiversity hotspot estimated to provide nature-based benefits to the world equivalent to US $8.2 billion.

Climate regulation is the primary global benefit from protecting large areas of tropical forest. And although the people living near these forests feel some of this benefit, they also bear substantial conservation costs – particularly through the loss of potential income gained by clearing trees to expand farming.

“International gains from the conservation of this biodiversity hotspot far outweigh the gains to local communities directly involved in their conservation,” said Andrew Balmford, Professor of Conservation Science at the University of Cambridge and senior author of the paper.

He added: “Local rural communities are not incentivised to protect globally important natural habitats. Understandably, their need to make a basic living – which often involves clearing forests and woodlands for agriculture, timber and charcoal – has to come first.”

Intact tropical forests act as ‘carbon sinks’, removing carbon dioxide from the atmosphere and helping to regulate the global climate. Conversion of these natural habitats to agricultural land results in vast carbon emissions.

The researchers estimate that international funding of US $2 billion is needed to support people living near forests and woodlands of global importance in Tanzania’s Eastern Arc Mountains. Investments could support smallholder farmers to enhance the productivity of their existing farmland, reducing the need to clear more trees to meet the growing local demand for food.

“Investments that help farmers boost yields on their land would potentially provide a long-term solution to the pressure on natural habitats, without compromising local food production or livelihoods,” said Pantaleo Munishi, Professor of Ecosystems, Biodiversity and Climate Change Management at Sokoine University of Agriculture, Tanzania, a co-author of the study.

He added: “Farmers degrade natural resources in search of livelihood options, due to declining productivity on land they own.”

The study also found that the greatest overall global economic gains come from the most biologically important sites – but these are also most costly for locals to conserve. This means that without financial support, the incentives to clear natural habitats are highest in the most biologically important places.

“If we're serious about retaining these globally important places for society as a whole, the international community ought to be contributing more and not relying on local people to soak up these costs,” said Dr Phil Platts at the University of York and BeZero Carbon, first author of the report.

He added: “Although vastly more money is required for the conservation of key regions than the international community is spending at the moment, in the wider scheme of things it’s entirely doable.”

“The huge demand for cropland in Tanzania, not just from smallholders, leads to clearing forests and woodlands - many of which aren’t formally protected.  Smallholders are pushed further and further into the mountains because they need to make a living,” said Dr Marije Schaafsma at Vrije Universiteit Amsterdam, one of the study’s lead authors.

This is the largest and most detailed study of its type ever undertaken in the tropics. The results are published today in the journal Environmental and Resource Economics.

Tanzania’s Eastern Arc Mountains are considered a global priority for conservation, and following extensive forest clearance, local and national governments have established a network of protected areas to help to conserve the remaining biodiversity and sustain ecosystem services.

The Eastern Arc is home to almost 500 species of plants not known to exist anywhere else in the world and many unusual animals including a tree-dwelling crab, and a monkey previously unknown to western science called the kipunji.

As well as supporting biodiversity, the forests draw carbon dioxide from the atmosphere, are home to widely used medicinal plants, provide drinking water to major cities, and support national and international tourism. But conserving them also leads to loss of potential income from agriculture and timber sales, and incurs significant management costs.

“Tropical forests should be viewed as essential global infrastructure, and richer countries have a responsibility to step up and enable their conservation in an equitable way,” said Professor Brendan Fisher at the University of Vermont, a co-author of the study.