Friday, February 17, 2023

How a record-breaking copper catalyst converts CO2 into liquid fuels

Researchers at Berkeley Lab have made real-time movies of copper nanoparticles as they evolve to convert carbon dioxide and water into renewable fuels and chemicals. Their new insights could help advance the next generation of solar fuels.

Peer-Reviewed Publication

DOE/LAWRENCE BERKELEY NATIONAL LABORATORY

IMAGE: ARTIST’S RENDERING OF A COPPER NANOPARTICLE AS IT EVOLVES DURING CO2 ELECTROLYSIS: COPPER NANOPARTICLES (LEFT) COMBINE INTO LARGER METALLIC COPPER “NANOGRAINS” (RIGHT) WITHIN SECONDS OF THE ELECTROCHEMICAL REACTION, REDUCING CO2 INTO NEW MULTICARBON PRODUCTS.

CREDIT: YAO YANG/BERKELEY LAB

Since the 1970s, scientists have known that copper has a special ability to transform carbon dioxide into valuable chemicals and fuels. But for many years, scientists have struggled to understand how this common metal works as an electrocatalyst, a material that uses the energy of electrons to chemically transform molecules into different products.

Now, a research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) has gained new insight by capturing real-time movies of copper nanoparticles (copper particles engineered at the scale of a billionth of a meter) as they convert CO2 and water into renewable fuels and chemicals: ethylene, ethanol, and propanol, among others. The work was reported in the journal Nature last week. 

“This is very exciting. After decades of work, we’re finally able to show – with undeniable proof – how copper electrocatalysts excel in CO2 reduction,” said Peidong Yang, a senior faculty scientist in Berkeley Lab’s Materials Sciences and Chemical Sciences Divisions who led the study. Yang is also a professor of chemistry and materials science and engineering at UC Berkeley. “Knowing how copper is such an excellent electrocatalyst brings us steps closer to turning CO2 into new, renewable solar fuels through artificial photosynthesis.”

The work was made possible by combining a new imaging technique called operando 4D electrochemical liquid-cell STEM (scanning transmission electron microscopy) with a soft X-ray probe to investigate the same sample environment: copper nanoparticles in liquid. First author Yao Yang, a UC Berkeley Miller postdoctoral fellow, conceived the groundbreaking approach under the guidance of Peidong Yang while working toward his Ph.D. in chemistry at Cornell University. 

Scientists who study artificial photosynthesis materials and reactions have wanted to combine the power of an electron probe with X-rays, but the two techniques typically can’t be performed by the same instrument. 

Electron microscopes (such as STEM or TEM) use beams of electrons and excel at characterizing the atomic structure in parts of a material. In recent years, 4D STEM (or “2D raster of 2D diffraction patterns using scanning transmission electron microscopy”) instruments, such as those at Berkeley Lab’s Molecular Foundry, have pushed the boundaries of electron microscopy even further, enabling scientists to map out atomic or molecular regions in a variety of materials, from hard metallic glass to soft, flexible films. 
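To picture what such a dataset looks like, here is a minimal illustrative sketch in Python (our construction, not the team’s analysis code): a 4D-STEM measurement is a 2D grid of scan positions, each holding a 2D diffraction pattern, and a “virtual” image can be computed by integrating part of each pattern. The array sizes and data below are invented.

```python
import numpy as np

# Illustrative sketch only (not the team's analysis code): a 4D-STEM dataset
# is a 2D raster of 2D diffraction patterns, i.e. a 4D array indexed as
# (scan_y, scan_x, k_y, k_x). The data here are random, purely to show shapes.
scan_y, scan_x = 64, 64            # real-space scan positions
k_y, k_x = 128, 128                # diffraction-space detector pixels
data = np.random.poisson(1.0, size=(scan_y, scan_x, k_y, k_x))

# A "virtual bright-field" image sums the intensity inside a small disk
# around the central beam at every scan position.
ky, kx = np.mgrid[0:k_y, 0:k_x]
mask = (ky - k_y // 2) ** 2 + (kx - k_x // 2) ** 2 < 10 ** 2

virtual_image = data[:, :, mask].sum(axis=-1)  # shape (scan_y, scan_x)
print(virtual_image.shape)                     # -> (64, 64)
```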

On the other hand, soft (or lower-energy) X-rays are useful for identifying and tracking chemical reactions in real time in an operando, or real-world, environment. 

But now, scientists can have the best of both worlds. At the heart of the new technique is an electrochemical “liquid cell” sample holder with remarkable versatility. A thousand times thinner than a human hair, the device is compatible with both STEM and X-ray instruments. 

The electrochemical liquid cell’s ultrathin design allows reliable imaging of delicate samples while protecting them from electron beam damage. A special electrode custom-designed by co-author Cheng Wang, a staff scientist at Berkeley Lab’s Advanced Light Source, enabled the team to conduct X-ray experiments with the electrochemical liquid cell. Combining the two allows researchers to comprehensively characterize electrochemical reactions in real time and at the nanoscale. 

Getting granular

During 4D-STEM experiments, Yao Yang and team used the new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. 

The experiments revealed a surprise: copper nanoparticles combined into larger metallic copper “nanograins” within seconds of the electrochemical reaction. 

To learn more, the team turned to Wang, who pioneered a technique known as “resonant soft X-ray scattering (RSoXS) for soft materials” at the Advanced Light Source more than 10 years ago. 

With help from Wang, the research team used the same electrochemical liquid cell, but this time during RSoXS experiments, to determine whether copper nanograins facilitate CO2 reduction. Soft X-rays are ideal for studying how copper electrocatalysts evolve during CO2 reduction, Wang explained. By using RSoXS, researchers can monitor multiple reactions between thousands of nanoparticles in real time, and accurately identify chemical reactants and products. 

The RSoXS experiments at the Advanced Light Source – along with additional evidence gathered at Cornell High Energy Synchrotron Source (CHESS) – proved that metallic copper nanograins serve as active sites for CO2 reduction. (Metallic copper, also known as copper(0), is a form of the element copper.) 

During CO2 electrolysis, the copper nanoparticles change their structure during a process called “electrochemical scrambling.” The copper nanoparticles’ surface layer of oxide degrades, creating open sites on the copper surface for CO2 molecules to attach, explained Peidong Yang. And as CO2 “docks” or binds to the copper nanograin surface, electrons are then transferred to CO2, causing a reaction that simultaneously produces ethylene, ethanol, and propanol along with other multicarbon products. 

“The copper nanograins essentially turn into little chemical manufacturing factories,” Yao Yang said.

Further experiments at the Molecular Foundry, the Advanced Light Source, and CHESS revealed that size matters. All of the 7-nanometer copper nanoparticles participated in CO2 reduction, whereas the larger nanoparticles did not. In addition, the team learned that only metallic copper can efficiently reduce CO2 into multicarbon products. The findings have implications for “rationally designing efficient CO2 electrocatalysts,” Peidong Yang said.

The new study also validated Peidong Yang’s findings from 2017: that the 7-nanometer copper nanoparticles require low inputs of energy to start CO2 reduction. As an electrocatalyst, the 7-nanometer copper nanoparticles required a record-low driving force that is about 300 millivolts less than that of typical bulk copper electrocatalysts. The best-performing catalysts that produce multicarbon products from CO2 typically operate at a high driving force of 1 volt.
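As a back-of-the-envelope illustration of why a 300-millivolt saving matters, one can compare energetic efficiency, the fraction of input electrical energy stored in the products, at the two driving forces. The thermodynamic cell voltage in the sketch below is an assumed ballpark value for CO2-to-multicarbon electrolysis, not a figure from the study.

```python
# Back-of-the-envelope sketch (not from the paper): energetic efficiency of an
# electrolyzer as thermodynamic voltage over total applied voltage. E_THERMO is
# an assumed, ballpark thermodynamic cell voltage for CO2-to-multicarbon
# electrolysis; the two overpotentials compare ~1 V with ~0.7 V (300 mV lower).
E_THERMO = 1.15  # volts (assumed illustrative value)

def energetic_efficiency(overpotential):
    """Fraction of input electrical energy stored in the chemical products."""
    return E_THERMO / (E_THERMO + overpotential)

for eta in (1.0, 0.7):
    print(f"driving force {eta:.1f} V -> efficiency {energetic_efficiency(eta):.0%}")
```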

The copper nanograins could potentially boost the energy efficiency and productivity of some catalysts designed for artificial photosynthesis, a field of research that aims to produce solar fuels from sunlight, water, and CO2. Currently, researchers within the Department of Energy-funded Liquid Sunlight Alliance (LiSA) plan to use the copper nanograin catalysts in the design of future solar fuel devices. 

“The technique’s ability to record real-time movies of a chemical process opens up exciting opportunities to study many other electrochemical energy conversion processes. It’s a huge breakthrough, and it would not have been possible without Yao and his pioneering work,” Peidong Yang said. 

Researchers from Berkeley Lab, UC Berkeley, and Cornell University contributed to the work. Other authors on the paper include co-first authors Sheena Louisia and Sunmoon Yu, former UC Berkeley Ph.D. students in Peidong Yang’s group, along with Jianbo Jin, Inwhan Roh, Chubai Chen, Maria V. Fonseca Guzman, Julian Feijóo, Peng-Cheng Chen, Hongsen Wang, Christopher Pollock, Xin Huang, Yu-Tsuan Shao, Cheng Wang, David A. Muller, and Héctor D. Abruña.

Parts of the experiments were performed by Yao Yang at Cornell under the supervision of Héctor Abruña, professor of chemistry and chemical biology, and David A. Muller, professor of engineering. 

This work was supported by the DOE Office of Science. 

The Molecular Foundry and Advanced Light Source are user facilities at Berkeley Lab. 

(From left to right): Julian Feijoo, Jianbo Jin, Cheng Wang, Peidong Yang, Yao Yang, Inwhan Roh, and Maria Fonseca Guzman at the Advanced Light Source.

Yao Yang (center) loads a sample into the soft X-ray scattering chamber as Cheng Wang (left) and Peidong Yang (right) observe at the RSoXS Beamline (Beamline 11.0.1.2) at the Advanced Light Source.

CREDIT

Thor Swift/Berkeley Lab

Movie of the evolution of Cu nanoparticles [VIDEO]

Video of a 4D-STEM experiment during which Yao Yang and team used the new electrochemical liquid cell to observe copper nanoparticles (ranging in size from 7 nanometers to 18 nanometers) evolve into active nanograins during CO2 electrolysis – a process that uses electricity to drive a reaction on the surface of an electrocatalyst. The new electrochemical liquid cell allows researchers to resolve images of objects smaller than 10 nanometers.

CREDIT

Yao Yang/Berkeley Lab. Courtesy of Nature.


Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

New superalloy could cut carbon emissions from power plants

Researchers repurpose 3D printing to discover high-performance material

Peer-Reviewed Publication

DOE/SANDIA NATIONAL LABORATORIES

IMAGE: SANDIA NATIONAL LABORATORIES TECHNOLOGIST LEVI VAN BASTIAN WORKS TO PRINT MATERIAL ON THE LASER ENGINEERED NET SHAPING MACHINE, WHICH ALLOWS SCIENTISTS TO 3D PRINT NEW SUPERALLOYS.

CREDIT: CRAIG FRITZ, SANDIA NATIONAL LABORATORIES

ALBUQUERQUE, N.M. — As the world looks for ways to cut greenhouse gas emissions, researchers from Sandia National Laboratories have shown that a new 3D-printed superalloy could help power plants generate more electricity while producing less carbon.

Sandia scientists, collaborating with researchers at Ames National Laboratory, Iowa State University and Bruker Corp., used a 3D printer to create a high-performance metal alloy, or superalloy, with an unusual composition that makes it stronger and lighter than state-of-the-art materials currently used in gas turbine machinery. The findings could have broad impacts across the energy sector as well as the aerospace and automotive industries, and hint at a new class of similar alloys waiting to be discovered.

“We’re showing that this material can access previously unobtainable combinations of high strength, low weight and high-temperature resiliency,” Sandia scientist Andrew Kustas said. “We think part of the reason we achieved this is because of the additive manufacturing approach.”

The team published their findings in the journal Applied Materials Today.

Material withstands high heat, essential for power plant turbines

About 80% of electricity in the U.S. comes from fossil fuel or nuclear power plants, according to the U.S. Energy Information Administration. Both types of facilities rely on heat to turn turbines that generate electricity. Power plant efficiency is limited by how hot metal turbine parts can get. If turbines can operate at higher temperatures, “then more energy can be converted to electricity while reducing the amount of waste heat released to the environment,” said Sal Rodriguez, a Sandia nuclear engineer who did not participate in the research.
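This reasoning can be made concrete with the Carnot limit, the textbook upper bound on how much heat any engine can convert to work; hotter turbine inlets raise that bound. The temperatures in the sketch below are illustrative examples, not plant data.

```python
# Illustrative sketch: the Carnot limit bounds the fraction of heat an ideal
# engine converts to work. Temperatures are example values, not plant data.
def carnot_limit(t_hot_celsius, t_cold_celsius=25.0):
    t_hot = t_hot_celsius + 273.15   # convert to kelvin
    t_cold = t_cold_celsius + 273.15
    return 1.0 - t_cold / t_hot

for t in (600, 800, 1000):           # hypothetical turbine inlet temperatures
    print(f"{t} C -> Carnot limit {carnot_limit(t):.0%}")
```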

Sandia’s experiments showed that the new superalloy — 42% aluminum, 25% titanium, 13% niobium, 8% zirconium, 8% molybdenum and 4% tantalum — was stronger at 800 degrees Celsius (1,472 degrees Fahrenheit) than many other high-performance alloys, including those currently used in turbine parts, and still stronger when it was brought back down to room temperature.

“This is therefore a win-win for more economical energy and for the environment,” Rodriguez said.
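Given the composition reported above, a rough rule-of-mixtures calculation hints at why the alloy is comparatively light next to nickel-based superalloys (roughly 8 to 9 g/cm3). The sketch below assumes the reported percentages are atomic fractions and that volumes mix ideally, both simplifications; molar masses and elemental densities are standard values.

```python
# Rough rule-of-mixtures density estimate for the printed superalloy.
# Assumptions: the reported percentages are atomic fractions, and the alloy's
# volume is the ideal mix of elemental volumes (real alloys deviate from this).
# Molar masses (g/mol) and elemental densities (g/cm^3) are standard values.
composition = {  # element: (atomic fraction, molar mass, density)
    "Al": (0.42, 26.98, 2.70),
    "Ti": (0.25, 47.87, 4.51),
    "Nb": (0.13, 92.91, 8.57),
    "Zr": (0.08, 91.22, 6.52),
    "Mo": (0.08, 95.95, 10.28),
    "Ta": (0.04, 180.95, 16.69),
}

mass = sum(x * m for x, m, _ in composition.values())            # g per mol of atoms
volume = sum(x * m / rho for x, m, rho in composition.values())  # cm^3 per mol
print(f"estimated density: {mass / volume:.1f} g/cm^3")          # ~5.5
```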

Energy is not the only industry that could benefit from the findings. Aerospace researchers seek out lightweight materials that stay strong in high heat. Additionally, Ames Lab scientist Nic Argibay said Ames and Sandia are partnering with industry to explore how alloys like this could be used in the automotive industry.

“Electronic structure theory led by Ames Lab was able to provide an understanding of the atomic origins of these useful properties, and we are now in the process of optimizing this new class of alloys to address manufacturing and scalability challenges,” Argibay said.

The Department of Energy and Sandia’s Laboratory Directed Research and Development program funded the research.

Discovery highlights changes in materials science

Additive manufacturing, also called 3D printing, is known as a versatile and energy-efficient manufacturing method. A common printing technique uses a high-power laser to flash-melt a material, usually a plastic or a metal. The printer then deposits that material in layers, building an object as the molten material rapidly cools and solidifies.

But this new research demonstrates how the technology can also be repurposed as a fast, efficient way to craft new materials. Sandia team members used a 3D printer to quickly melt together powdered metals and then immediately print a sample of the resulting alloy.

Sandia’s creation also represents a fundamental shift in alloy development because no single metal makes up more than half the material. By comparison, steel is about 98% iron combined with carbon, among other elements.

“Iron and a pinch of carbon changed the world,” Kustas said. “We have a lot of examples of where we have combined two or three elements to make a useful engineering alloy. Now, we’re starting to go into four or five or beyond within a single material. And that’s when it really starts to get interesting and challenging from materials science and metallurgical perspectives.”

Scalability, cost are challenges to overcome

Moving forward, the team is interested in exploring whether advanced computer modeling techniques could help researchers discover more members of what could be a new class of high-performance, additive manufacturing-forward superalloys.

“These are extremely complex mixtures,” said Sandia scientist Michael Chandross, an expert in atomic-scale computer modeling who was not directly involved in the study. “All these metals interact at the microscopic — even the atomic — level, and it’s those interactions that really determine how strong a metal is, how malleable it is, what its melting point will be and so forth. Our model takes a lot of the guesswork out of metallurgy because it can calculate all that and enable us to predict the performance of a new material before we fabricate it.”

Kustas said there are challenges ahead. For one, it could be difficult to produce the new superalloy in large volumes without microscopic cracks, which is a general challenge in additive manufacturing. He also said the materials that go into the alloy are expensive. So, the alloy might not be appropriate in consumer goods for which keeping cost down is a primary concern.

“With all those caveats, if this is scalable and we can make a bulk part out of this, it’s a game changer,” Kustas said.

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

Sandia National Laboratories technologist Levi Van Bastian fills a hopper with raw material to print on the Laser Engineered Net Shaping machine. What appears to be a liquid is powdered metal.

3D-printing technologies like Laser Engineered Net Shaping, shown here, are helping scientists at Sandia National Laboratories rapidly discover, prototype and test new materials.

CREDIT

Craig Fritz, Sandia National Laboratories

High-intensity fires do not reverse bush encroachment in an African savanna

Despite an initial short-term effect, high-intensity fires did not result in a meaningful reversal in bush encroachment in the long-term

Peer-Reviewed Publication

STELLENBOSCH UNIVERSITY

High-intensity fire in the Kruger National Park, South Africa 

IMAGE: A TYPICAL FIRE IN SOUTH AFRICA'S KRUGER NATIONAL PARK SHOWING HOW BOTH THE GRASSES AND SHRUBS BURN.

CREDIT: BRIAN VAN WILGEN

A decade-long experiment on the use of high-intensity fire to control bush encroachment in South Africa’s Kruger National Park (KNP) has revealed that, despite an initial short-term effect, these fires did not result in a meaningful reversal in bush encroachment in the long-term.

The results of this long-term and large-scale experiment were published in the Journal of Applied Ecology today (16 February 2023), in an article titled “High-intensity fires may have limited medium-term effectiveness for reversing woody plant encroachment in an African savanna”.

Prof. Brian van Wilgen, emeritus professor in invasion biology at Stellenbosch University (SU) and one of the co-authors, says most papers that advocate the use of high-intensity fires are based on observation over a short time: “I think this is the first study to assess this management practice’s effectiveness over a decade,” he adds.

Decade-long experiment

Bush or woody encroachment, a process in which the density of smaller trees and shrubs increases, is taking place across the world. This is worrying because increased woody cover results in less grass being available for animals that eat grass or use it as habitat. Although the causes are complex and not fully understood, a decrease in the use of fire, coupled with an increase in CO2, may be to blame in savannas.

In 2010 and again in 2013, adjacent sites in the southern parts of the KNP, covering thousands of hectares, were burned using low, medium and high intensity fires. The objective was to examine whether high-intensity fires can be used to reverse bush encroachment. At the time of the first burn application, KNP officials had to deal with substantial negative publicity after some animals were mortally injured.

A study reporting the effects on woody plant cover one year after the completion of the second set of experimental fires showed that woody plant cover did indeed decline significantly over this short period. The initial study, however, cautioned that the long-term efficacy of these high-intensity fires still needed to be confirmed.

In 2020, using a combination of ground surveys and remote sensing data, scientists from South African National Parks (SANParks), Harvard University and the Centre for Invasion Biology at Stellenbosch University compared the 2010 data with their 2020 observations. Despite the initial encouraging results, they found that the reduced woody plant cover was not maintained after ten years. Even though there were large differences between the fire treatment sites one year after the experimental fires, these differences had disappeared after 10 years, strongly suggesting that differences in fire intensity did not have a long-term effect on bush encroachment.

Even more concerning was the trend of tall tree loss in the KNP. Over the past decade, trees taller than ten metres declined by about 65% across all the experimental sites, regardless of the fire treatment.

According to SANParks scientist and first author on the article, Tercia Strydom, this was because of a combination of elephant and fire damage: “Tall trees are normally capable of withstanding frequent savanna fires due to their thick bark. But when elephants debark a tree (because they eat it), that thick protective bark is removed, exposing the inner wood. When this inner wood dries out, it burns and smoulders within the tree until the tree eventually succumbs,” she explains.

Lessons learned

According to Prof. van Wilgen, the paper shows the value of long-term monitoring to establish the real outcome, and not to jump to conclusions: “It also shows how complex things can be, and that there is no easy fix to bush encroachment”.

Dr Izak Smit, senior scientist at SANParks and another co-author, says the park continues to follow a strategic adaptive management approach in which local context is paramount: “Best available knowledge needs to be gathered from the literature, and based on that a careful evaluation must inform what management action you consider appropriate to implement in your area of interest. The outcomes of this action should be carefully monitored to see whether or not you actually obtain the anticipated outcomes”.

“This way, ecology reveals its complexity and nuance in different contexts and allows us to fine-tune the way we manage protected areas – learning as we continue to manage in the face of uncertainty,” he adds.

They warn that, on a global scale, inter-continental and even intra-continental generalisations regarding the most appropriate management actions for bush encroachment remain elusive. According to Strydom, this case study may not even be representative of other parts of southern Africa based on differences in elephant density, soil type and rainfall, amongst a plethora of other factors.

The way forward

KNP has a long history of scientific research, and looking forward, Dr Smit says they are now exploring the potential of early wet or early dry season fires to address bush encroachment as an alternative to high-intensity fires.

He explains: “When woody species are dormant during the dry season, they move a lot of their resources to their roots while the top parts are ‘resting’. If a fire consumes the above-ground parts during this period, the plant still has a lot of resources stored below-ground from which it can resprout when conditions become favourable. However, if the plant is in leaf and a fire burns the above-ground parts, it is expected that the damage is more costly to the plant, which is what we want when concerned about bush encroachment. The trick is therefore to try to burn when there is sufficient dry grass to carry fires of reasonable intensity, while at the same time the bush species are in a more vulnerable leaf-on phase. Spring sometimes provides this window of opportunity where grass is still dry enough to burn and the woody species are already in leaf and thus more vulnerable to fire damage”.

In the end, he says, one needs to understand both the local (e.g., fire, grazing) and global drivers of change (such as elevated atmospheric CO2), and their interactions with each other, in order to tease out if and how different management actions (or inactions) may improve the situation, or even make it worse.

Vegetation in the Kruger National Park showing separate layers – scattered tall trees that are declining in number due to the actions of fire and elephants, and an understory of low shrubs that is becoming more dense.

CREDIT

Brian van Wilgen


New study settles long-standing debate: Does agricultural erosion create a carbon sink or source?

In new research published today in the European Geosciences Union journal Biogeosciences, two scientists address the soil organic carbon erosion paradox

Peer-Reviewed Publication

EUROPEAN GEOSCIENCES UNION

Schematic representation of the effect of water erosion and deposition on soil OC stabilization and loss processes 

IMAGE: TRANSPORT IN RUNOFF: DETACHMENT AND TRANSPORT CAN SHIFT OC FROM A PROTECTED STATE IN AGGREGATES TO AN AVAILABLE STATE WHERE IT MINERALIZES MORE RAPIDLY. BURIAL: THE DEPOSITION OF ERODED OC MOVES OC INTO A LOW-MINERALIZATION CONTEXT AND CAN ALSO ENHANCE PROTECTION VIA AGGREGATION. SUBSOIL MIXING: AT SITES OF EROSION, NEW OC FORMATION FROM NEW VEGETATION INPUTS INTO SUBSOIL EXPOSED BY EROSION MAY REPLACE SOME OF THE ERODED OC. NET PRIMARY PRODUCTION (NPP) FEEDBACK: EROSION AND DEPOSITION MAY AFFECT THE NUTRIENT AND SOIL DEPTH STATUS (AND HENCE SOIL FERTILITY) AS WELL AS THE ENVIRONMENTAL FACTORS THAT CONTROL OC INPUT VERSUS OUTPUT.

CREDIT: VAN OOST, K. AND SIX, J.: RECONCILING THE PARADOX OF SOIL ORGANIC CARBON EROSION BY WATER, BIOGEOSCIENCES, 20, 635–646, HTTPS://DOI.ORG/10.5194/BG-20-635-2023, 2023.

Over the last decade, researchers have sounded the alarm on soil erosion being the biggest threat to global food security. As world governments moved to implement soil conservation practices, a new debate began: does agricultural soil erosion create a net organic carbon (OC) sink or source? The question is a crucial one, as carbon sinks absorb more carbon than they release, while carbon sources release more carbon than they absorb. Either way, the answer has implications for global land use, soil conservation practices and their link to climate change.

In a new study published today in the European Geosciences Union journal Biogeosciences, two researchers show that the apparent soil organic carbon erosion paradox, i.e., whether agricultural erosion results in an OC sink or source, can be reconciled when the geographical and historical context is considered. The study was the result of a collaboration between UCLouvain in Belgium and ETH Zurich.

The organic carbon cascade
Early studies assumed that a substantial fraction of soil organic carbon that is mobilized on agricultural land is lost to the atmosphere. They concluded that agricultural erosion represented a source of atmospheric CO2, which led to the notion of a win–win situation: soil conservation practices that reduce erosion result in healthier soils AND a large carbon sink.

However, more recent studies have challenged this assumption and suggest a different pathway for the eroded organic carbon. They propose the concept of the “geomorphic OC pump,” which transfers organic carbon from the atmosphere, via upland soils recovering from erosion, to burial sites where the organic carbon is protected from decomposition in low-mineralization contexts. Along this geomorphic conveyor belt, the organic carbon originally fixed by plants is continuously displaced laterally along the earth’s surface, where it can be stored in sedimentary environments. These studies argue that the combination of organic carbon recovery and sedimentation on land could capture vast quantities of atmospheric carbon, and so erosion may in fact represent an organic carbon sink.

“We demonstrate how these two competing views can exist at the same time and so this study offers an understanding of differences in perspective,” explains Kristof Van Oost from the Earth & Life Institute, UCLouvain.

Seeing the full picture for the first time
Johan Six from the Department of Environmental Systems Science at ETH Zurich, the Swiss Federal Institute of Technology, says these latest findings are a first account of how all the different carbon dynamic processes induced by erosion interact and counterbalance each other in determining the net carbon flux from terrestrial environments to the atmosphere.

Six and Van Oost conducted a comprehensive literature review spanning 74 studies. Six explains the reason for the conflicting assumptions from previous studies. “We noticed that the perceived paradox was mostly related to not having considered the full cascade of carbon fluxes associated with erosion. This led us to thinking that it would be good to explain the complexity of the full carbon cascade.”

At the very centre of this paradox – they realised – is the fact that water erosion-induced processes operate across temporal and spatial scales, which determine the relationship between water erosion and organic carbon loss versus stabilization processes. Together they conceptualized the effects of the contributing water erosional (sub)processes across time and space using decay functions.

Timescales reconcile the paradox
The researchers found that soil erosion induces a source of atmospheric CO2 only when small temporal and spatial scales are considered, while both sinks and sources appear when multi-scale approaches are used.

At very short timescales (seconds to days) erosion events shift a portion of the soil organic carbon from a protected state to an available state where it mineralizes to gaseous forms more rapidly. In contrast, studies considering erosion as a sink for atmospheric carbon typically consider longer timescales at which the geomorphic OC conveyor belt is operating.
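A toy model in the spirit of the decay-function conceptualization described above makes this scale dependence concrete: a fast-decaying mineralization pulse (source) superimposed on a slowly accumulating burial term (sink) yields a net flux whose sign depends on the timescale over which it is evaluated. All parameters below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Toy decay-function model (invented parameters, not the paper's): a fast-
# decaying mineralization pulse (source) versus a slowly saturating burial
# term (sink). The net flux changes sign with the timescale considered.
t = np.linspace(0, 100, 1001)            # years after an erosion event
source = 1.0 * np.exp(-t / 2.0)          # rapid mineralization of exposed OC
sink = 0.15 * (1.0 - np.exp(-t / 30.0))  # slow burial and OC recovery
net = source - sink                      # > 0: net source; < 0: net sink

crossover = t[np.argmax(net < 0)]
print(f"net flux flips from source to sink after ~{crossover:.0f} years")
```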

The researchers emphasize the need for erosion control for the many benefits it brings to the ecosystem but recommend cross-scale approaches to accurately represent erosion effects on the global carbon cycle.

Looking to the future, Van Oost concludes, “Our insights into the effects of soil erosion on carbon storage are mainly derived from studies conducted in temperate regions. We now need new research on erosion effects in marginal lands but also tropical regions.”

Framework to represent fraction gain and loss relative to mobilised soil OC for different components of the geomorphic cascade 

Climate change disrupts core habitats of marine species

A modelling study by an international team of researchers indicates the extent to which climate change threatens marine ecosystems and their biodiversity.

Peer-Reviewed Publication

UNIVERSITY OF OLDENBURG

If climate change continues at the current pace, a majority of marine species will likely lose considerable amounts of their currently suitable habitat ranges by the end of this century. This is the result of a modelling study published in the current issue of the scientific journal Global Change Biology.

The interdisciplinary team of researchers included scientists of the Helmholtz Institute for Functional Marine Biodiversity at the University of Oldenburg (HIFMB), the Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research (AWI) and the GEOMAR Helmholtz Centre for Ocean Research in Kiel.

“The ocean’s biodiversity is changing faster than that of terrestrial ecosystems. To be able to protect marine species, and with them all the marine resources that humans depend on, it is important to understand where and how marine species communities may change”, emphasizes Dr Irene Roca, a biologist and former researcher at the HIFMB, who led the study together with HIFMB marine ecologist Dr Dorothee Hodapp.

Data on more than 33,500 marine species

Scientists are already observing that many marine species have started shifting their distributional ranges with the changing environmental conditions as a consequence of global warming. However, understanding and projecting what marine biodiversity might look like in the future and how the extent of habitats might change is a difficult task due to many unknowns, Hodapp points out.

“Many species are only poorly studied, and we don’t know exactly what the environmental conditions will look like in a few decades”, she says. Moreover, previous projections often considered temperature as the sole environmental factor driving future biodiversity changes.

To overcome these problems to a certain extent, the researchers based their modelling efforts on occurrence data for more than 33,500 marine species and seven environmental factors such as water depth, water temperature, salinity, and oxygen concentration. Based on this information, and assuming three different CO2 emission scenarios, the team estimated whether and where the species are likely to occur in the future.

The results indicate that species’ so-called core habitat ranges – that is, the marine areas where the chance that a particular species occurs, given its preferred environmental conditions, is higher than 50 percent – may not only shift but may also be considerably reduced under the high CO2 emission scenario.
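To sketch how such a core-habitat definition is applied in practice, the toy species distribution model below (synthetic data, not the study’s actual model or predictors) fits occurrence probability against a temperature field with a thermal optimum, then counts the cells where that probability exceeds 50 percent before and after a uniform warming.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy species-distribution sketch (synthetic data, not the study's model):
# occurrence depends on distance from a thermal optimum; a logistic model with
# a quadratic temperature term recovers that optimum, and "core habitat" is
# wherever the predicted occurrence probability exceeds 0.5.
rng = np.random.default_rng(0)
temp = rng.normal(size=800)                        # synthetic temperature field
presence = (rng.random(800) < np.exp(-temp**2)).astype(int)

X = np.column_stack([temp, temp**2])               # quadratic term for the optimum
model = LogisticRegression().fit(X, presence)

def core_cells(temperature):
    Xq = np.column_stack([temperature, temperature**2])
    return int((model.predict_proba(Xq)[:, 1] > 0.5).sum())

print("core-habitat cells now:          ", core_cells(temp))
print("core-habitat cells after warming:", core_cells(temp + 1.0))
```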

Risk of a fundamental reorganization of marine life poses challenges to conservation management

In addition to habitat loss, the results give an idea of how the preferred habitat area of many species may be disrupted. “Especially along the equator, our model projections revealed areas which are ill-suited for most marine species, for instance because of high temperatures”, Roca explains. If such regions developed in the future, this would disrupt currently continuous equatorial habitat ranges.

Fragmented habitats lead to smaller population sizes, which can put species at higher risk of extinction. However, in the long run new species could also evolve. Another problem is that species can only keep pace with changing environmental conditions to varying degrees, Hodapp explains. This can lead to a restructuring of food webs and changes in the interactions between habitat-forming species, such as corals, and their inhabitants.

“Even though our model does not account for such interspecific interactions, the results provide valuable clues on how differently marine environments and communities are likely to change depending on the future CO2 emission scenarios”, the marine ecologist stresses.

Being aware of such a high risk of a fundamental reorganization of marine life will pose further challenges to conservation management, she adds. “We need to think ahead and work on effectively implementing the recent international agreements on biodiversity protection.”