Wednesday, October 15, 2025

 

Locking carbon in trees and soils could help ‘stabilize climate for centuries’ – but only if combined with underground storage


University of Cambridge



  • A new study outlines a ‘portfolio approach’ to carbon removal that enables firms to mix expensive tech-based solutions, which inject carbon deep underground, with lower-cost and more readily available nature-based options.
     
  • The research can identify which corporate portfolios could best stabilise global temperatures over centuries, and suggests that, with the right ‘buffer’, even projects at higher risk of carbon re-release – such as forests and biochar – could contribute to this long-term goal.
     
  • However, to be credible, portfolios must plan to shift entirely to geological storage by the mid-century net zero date, say researchers.


A team of researchers, led by the University of Cambridge, has formulated a method to assess whether carbon removal portfolios can help limit global warming over centuries. The approach also distinguishes between buying credits to offset risk and claiming net-negative emissions.

The study paves the way for nature-based carbon removal projects – such as planting new forests or restoring existing ones – to become effective climate change solutions when balanced with a portfolio of other removal techniques, according to researchers.

They say the findings, published in the journal Joule, show how nature-based and technology-based carbon storage solutions can work together through the transition to net zero, challenging the notion that only permanent tech-based “geological storage” can effectively tackle climate change.

The study’s authors point out that some carbon removal portfolios, such as California’s forest carbon offsets programme, may be severely underfunded for risks beyond the next few decades.

They call for a “buffer” of around two tonnes of stored carbon for every tonne offset in portfolios containing nature-based solutions, noting that this is “sufficient in most cases” to manage long-term risks.

However, researchers say the most high-risk portfolios that rely heavily on nature-based offsetting might need extreme buffers of nine tonnes of carbon removed for every tonne emitted. The authors caution against the use of such portfolios given the costs and uncertainties involved.
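The buffer arithmetic described above can be sketched in a few lines of Python. This is an illustration only: the 2:1 and 9:1 buffer ratios come from the article, but the portfolio shares and the weighted-sum structure are hypothetical, not the study's actual risk model.

```python
# Illustrative sketch of buffer arithmetic; not the study's model.
def portfolio_removal(tonnes_to_offset: float, mix: dict) -> float:
    """Total CO2 removal needed for a portfolio in which each storage
    type carries its own buffer ratio (tonnes removed per tonne offset).
    `mix` maps storage type -> (share_of_portfolio, buffer_ratio)."""
    return tonnes_to_offset * sum(share * ratio for share, ratio in mix.values())

# Permanent geological storage needs no buffer (ratio 1.0); adding a
# forestry component at the study's suggested ~2:1 buffer raises the total.
mixed = {"geological": (0.5, 1.0), "forestry": (0.5, 2.0)}
print(portfolio_removal(1_000_000, mixed))  # 1,500,000 tonnes to offset 1 Mt
```

Under the same sketch, a portfolio relying entirely on the highest-risk storage at a 9:1 buffer would need nine million tonnes removed to offset the same million tonnes, which is why the authors caution against such portfolios.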

“Tech giants like Microsoft and Meta are collectively spending billions on carbon removal portfolios to offset their growing carbon footprints,” said lead author Dr Conor Hickey, Assistant Professor in Energy and Climate at Cambridge University’s Department of Land Economy.

“While companies and countries agree that increased investment in carbon removal is essential to reach net zero targets, they also want to understand whether carbon removal schemes can help stabilise global temperatures over the long term.”

“Our risk management approach offers one of the first reliable measures for portfolio managers targeting long-term temperature stabilisation,” said Hickey. “It shows that nature-based carbon storage such as tree planting has a bigger role to play than critics assume when used as part of a diversified carbon removal portfolio.”

“Durable net zero means geological net zero,” said Professor Myles Allen, a co-author on the paper and Professor of Geosystem Science at the University of Oxford. “To stabilise climate in line with Paris Agreement goals, anyone still relying on offsets must plan to shift entirely to carbon dioxide removal with geological storage by the middle of the century.”

Current market incentives favour cheaper and more available ‘biological’ projects to pull carbon dioxide (CO₂) from the atmosphere and store it, such as forestry, which locks carbon in trees, or biochar, where plant materials are heated to create a charcoal-like substance that traps carbon when incorporated into soil.

However, these methods carry a higher risk of carbon re-release, such as when land use changes or wildfires increase. They are often considered only a temporary solution – the carbon is not locked away for long enough to stem rising global temperatures.

Alternative tech-based solutions like Direct Air Capture (DAC) are proving hard to scale up while costs remain high and the process remains energy-intensive. Yet the permanence of the carbon storage means this emerging technology is less vulnerable to reversal, such as through leakage. DAC can be combined with deep underground storage to lock the CO₂ away.

For the latest study, the research team have developed a new “risk management framework” to accurately calculate the additional CO₂ removal needed to keep temperatures stable over centuries for various storage portfolios.

Their analysis shows that in some cases, such as a high-risk portfolio dominated by forestry projects, the extra amount of CO₂ removal needed to make up for this risk doesn’t change much – whether the timescale is 300 or even 1,000 years.

“Removing more carbon now can effectively cover carbon storage risk for centuries, and this can be done with a mix of nature and tech, as long as the right buffers are built in,” said Hickey. 

“Portfolios can combine expensive permanent solutions like DAC with lower-cost nature-based options like planting trees – matching society's willingness to pay while still contributing to temperature stabilisation goals.”

“Our approach enables strategic carbon storage choices based on current availability, while targeting long-term temperature stabilisation. It provides buyer flexibility while valuing lower-risk storage options, something today's market lacks,” said Hickey.

By 2050, the UK aims to achieve net zero, with geological storage expected to play a major role in storing any ongoing CO₂ emissions. Incoming UK and EU guidance states that projects must be subject to a minimum 200-year permanence requirement. 

 

Australia’s rainforests first to switch from carbon sink to source



Australian National University

Image credit: Adrienne Nicotra/ANU




The trunks and branches of trees in Australia's tropical rainforests – also known as woody biomass – have become a net source of carbon dioxide to the atmosphere, according to a new international study.  

According to the team behind the Nature study, which includes experts from The Australian National University (ANU), Australia’s wet tropics are the first globally to show this response to climate change. The rising temperature, air dryness and droughts caused by human-driven climate change are likely the major culprits.

Usually, tropical forests absorb more carbon than they release – what's known as a carbon sink. Woody biomass plays a key role in this process, alongside forest canopies and soils.  

But lead author Dr Hannah Carle, from Western Sydney University, said the capacity of woody biomass to continue working as a carbon sink is at risk.  

"Tropical forests are among the most carbon-rich ecosystems on the planet. We rely on them more than most people realise," Dr Carle, who conducted this work as part of her PhD at ANU, said.   

"Forests help to curb the worst effects of climate change by absorbing some of the carbon dioxide released from burning fossil fuels. But our work shows this is under threat. 

"The change our study describes is largely due to increased tree mortality driven by climate change, including increasingly extreme temperatures, atmospheric dryness and drought. 

“Regrettably, the associated increase in carbon losses to the atmosphere has not been offset by increased tree growth. This is surprising because higher carbon dioxide levels should make it easier for plants to scavenge carbon dioxide from the air, leading to more tree growth and greater carbon sink capacity.” 

The findings have significant implications for emissions reduction targets, which are partly based on the estimated capacity of forests to continue to absorb emissions and help mitigate climate change. 

"Current models may overestimate the capacity of tropical forests to help offset fossil fuel emissions," Dr Carle said.  

"We also found that cyclones suppress the carbon sink capacity of woody biomass in these forests. This is cause for concern with cyclones projected to become increasingly severe under climate change, and to impact areas further south, affecting additional stretches of forest to a potentially greater extent.” 

Co-author Professor Adrienne Nicotra from ANU added: “The rainforest sites at the heart of this research provide unusually long-term and high-resolution data on forest health through time. We need to pay attention to that data."  


Rising seas and sinking cities signal a coastal crisis in China



A Rutgers study of geological records shows sea level increasing the fastest in 4,000 years, highlighting need for global and local action



Rutgers University

Image: Sea level distributions – The study's data shows that modern, global sea level rise is happening faster than at any time in the past 4,000 years. Credit: Yucheng Lin





A team of scientists led by Rutgers researchers has uncovered evidence that modern sea level rise is happening faster than at any time in the past 4,000 years, with China’s coastal cities especially at risk.

The scientists examined thousands of geological records from a number of sources, including ancient coral reefs and mangroves, which serve as natural archives of past sea levels. They reconstructed sea level changes going back nearly 12,000 years, to the beginning of the current geological epoch, the Holocene, which followed the last major ice age.

Reporting in Nature, their findings show that since 1900, global sea levels have risen at an average rate of 1.5 millimeters (or about one-sixteenth of an inch) a year, a pace that exceeds any century-long period in the past four millennia.
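As a back-of-envelope check of what the reported rate implies, the average rise accumulated since 1900 can be computed directly. This sketch assumes the 1.5 mm-per-year average held constant through 2025; as the researchers note, the actual rate has accelerated, so recent years contribute more than this average suggests.

```python
# Cumulative global mean rise implied by the reported average rate.
# Assumes a constant 1.5 mm/yr from 1900 to 2025 (a simplification).
rate_mm_per_year = 1.5
years = 2025 - 1900
total_mm = rate_mm_per_year * years
print(f"~{total_mm:.0f} mm (~{total_mm / 25.4:.1f} in) of global mean rise since 1900")
```

That works out to roughly 19 centimetres of average rise over the period.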

“The global mean sea level rise rate since 1900 is the fastest rate over at least the last four millennia,” said Yucheng Lin, who conducted the research as a postdoctoral associate at Rutgers and is a scientist at Australia’s national research agency, the Commonwealth Scientific and Industrial Research Organization in Hobart.

Lin studied with Robert Kopp, a Distinguished Professor in the Department of Earth and Planetary Sciences in the School of Arts and Sciences. “Dr Lin's work illustrates how geological data can help us better understand the hazards that coastal cities face today,” said Kopp, who co-authored the study.

Two major forces, thermal expansion and melting glaciers, are driving this acceleration, Lin said. As the planet warms because of climate change, oceans absorb heat and expand. At the same time, ice sheets in Greenland and Antarctica are melting, adding more water to the oceans.

“Getting warmer makes your ocean take up more volume,” Lin said. “And the glaciers respond faster because they are smaller than the ice sheets, which are often the size of continents. We are seeing more and more acceleration in Greenland now.”

While rising seas are a global issue, China faces a unique double threat, he said. Many of its largest and most economically important cities, including Shanghai, Shenzhen and Hong Kong, are in delta regions, which are naturally prone to sinking because they are built on thick, soft sediments.

But human activities are making things worse.

“We’ve been able to quantify the natural rate of sea level rise for this area,” Lin said. “But human intervention, mostly groundwater extraction, makes it happen much faster.”

Subsidence refers to the gradual sinking or settling of the Earth's surface. It can happen naturally because of geological processes, or it can be caused by human activities, such as groundwater extraction.

To determine how sea level rise will adversely affect China’s deltas, the team examined a combination of geological records, subsidence data and human activity impacts across coastal regions, especially in the Yangtze River Delta and Pearl River Delta. These areas are home to several megacities.

In Shanghai, parts of the city sank more than one meter (about three feet) during the 20th century because of excessive groundwater use, Lin said. That is orders of magnitude faster than the current global sea level rise rate.

Delta regions are flat, fertile and close to water, making them ideal for farming, transportation and urban development. But their geography also makes them extremely vulnerable to flooding.

“Centimeters of sea level rise will greatly increase the risk of flooding in deltas,” Lin said. “These areas are not only important domestically, they’re also international manufacturing hubs. If coastal risks happen there, the global supply chain will be vulnerable.”

Despite the findings, Lin’s research offers hope, he said. Cities such as Shanghai have already taken steps to reduce subsidence by regulating groundwater use and even reinjecting freshwater into underground aquifers.

“Shanghai now is not sinking that fast anymore,” Lin said. “They recognized the problem and started regulating their groundwater usage.”

The study also provides vulnerability maps to help governments and city planners identify subsidence hotspots and prepare for future sea level rise.

Although the researchers focused on China, lessons from the study apply globally, Lin said. Many major cities, such as New York, Jakarta and Manila, are built on low-lying coastal plains and face similar risks.

“Deltas are great places, good for farming, fishing, urban development and naturally draw civilizations to them,” Lin said. “But they are really flat yet prone to human-caused subsidence, so sustained sea level rise could submerge them really fast.”

The paper is an application of PaleoSTeHM, an open-source software framework for statistically modeling paleo-environmental data that Lin developed as a postdoctoral associate.

Praveen Kumar, a postdoctoral associate in the Department of Earth and Planetary Sciences, also contributed to the study.

The National Science Foundation and NASA supported the research.


 

Study indicates dramatic increase in percentage of US adults who meet new definition of obesity


Mass General Brigham researchers studied a new definition of obesity that moves beyond BMI to include measures of body fat distribution


Mass General Brigham




The prevalence of obesity in the United States could rise sharply under a new definition of obesity released earlier this year by the Lancet Diabetes and Endocrinology Commission. Researchers from Mass General Brigham found that when applying the new criteria, which expand the traditional use of body mass index (BMI) to include measures of body fat distribution, the prevalence of obesity increased from about 40 percent to about 70 percent among the over 300,000 people included in their study. The rise was even more pronounced among older adults. Additionally, the researchers found that those newly added individuals also had a higher risk of adverse health outcomes. Their results are published in JAMA Network Open.

“We already thought we had an obesity epidemic, but this is astounding,” said co-first author Lindsay Fourman, MD, an endocrinologist in the Metabolism Unit in the Endocrinology Division of the Mass General Brigham Department of Medicine. “With potentially 70 percent of the adult population now considered to have excess fat, we need to better understand what treatment approaches to prioritize.”

Traditionally, obesity has been defined by BMI, which estimates body fat based on a person’s weight and height. But other anthropometric measures—such as waist circumference, waist-to-height ratio, or waist-to-hip ratio—may further account for fat distribution and aid in differentiating between muscle and fat mass.

Under the new framework, a person is classified as having obesity if they have a high BMI plus at least one elevated anthropometric measure (a condition the authors term “BMI-plus-anthropometric obesity”), or if they have a normal BMI and at least two elevated anthropometric measures (a condition termed “anthropometric-only obesity”). The new definition also distinguishes between preclinical and clinical obesity with clinical obesity defined as the presence of obesity-related physical impairment or organ dysfunction. At least 76 organizations have endorsed the new guidelines, including the American Heart Association and The Obesity Society.
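The classification rule described above can be expressed as a short decision function. This is a hypothetical sketch of the logic as summarised in this article, not the Commission's reference criteria or a clinical tool; the thresholds for "high BMI" and "elevated" anthropometric measures are left to the caller.

```python
# Hypothetical sketch of the Lancet Commission classification rule as
# described in the text; not the Commission's reference implementation.
def classify_obesity(high_bmi: bool, elevated_measures: int) -> str:
    """`elevated_measures` counts elevated anthropometric measures
    (e.g. waist circumference, waist-to-height or waist-to-hip ratio)."""
    if high_bmi and elevated_measures >= 1:
        return "BMI-plus-anthropometric obesity"
    if not high_bmi and elevated_measures >= 2:
        return "anthropometric-only obesity"
    return "no obesity"

# A person with normal BMI but two elevated anthropometric measures —
# the group newly captured by the definition:
print(classify_obesity(high_bmi=False, elevated_measures=2))
```

Note that under this rule a normal BMI with only one elevated measure does not qualify, which is what separates the new "anthropometric-only" group from people without obesity.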

The study analyzed participants in the National Institutes of Health All of Us Research Program’s cohort of over 300,000 Americans. Obesity prevalence was 68.6 percent with the new definition, versus 42.9 percent under the traditional BMI-based definition. This increase was entirely driven by inclusion of individuals with anthropometric-only obesity. Obesity rates varied by sex, race, and especially by age—affecting nearly 80 percent of adults over 70.

Importantly, the study found that those with anthropometric-only obesity – who would not have been classified as having obesity by the traditional definition – had a higher risk of diabetes, cardiovascular disease, and mortality than people without obesity. About half of all individuals who met the new obesity criteria had clinical obesity, and this proportion was only slightly lower in the anthropometric-only obesity group compared with the BMI-plus-anthropometric obesity group.

“We have always recognized the limitations of BMI as a single marker for obesity because it doesn't take into account body fat distribution,” said senior author Steven Grinspoon, MD, Chief of the Metabolism Unit in the Endocrinology Division of the Mass General Brigham Department of Medicine. “Seeing an increased risk of cardiovascular disease and diabetes in this new group of people with obesity, who were not considered to have obesity before, brings up interesting questions about obesity medications and other therapeutics.”

The researchers emphasize that further studies are needed to better understand the causes of and optimal treatments for anthropometric-only obesity. The research team previously developed a therapeutic that reduces waist circumference and plans to explore the utility of different treatment strategies in this newly defined population.

“Identifying excess body fat is very important as we’re finding that even people with a normal BMI but with abdominal fat accumulation are at increased health risk,” Fourman said. “Body composition matters – it’s not just pounds on a scale.”

Authorship: In addition to Fourman and Grinspoon, Mass General Brigham authors include Aya Awwad, Camille A. Dash, Julia E. Johnson, Allison K. Thistle, Nikhita Chahal, Sara L. Stockman, Mabel Toribio, Chika Anekwe, and Arijeet K. Gattu. Additional authors include Alba Gutiérrez-Sacristán.

Disclosures: Fourman serves as a consultant to Theratechnologies and Chiesi Farmaceutici and receives grant funding to her institution from Chiesi Farmaceutici outside of this work. Grinspoon serves as a consultant to Marathon Assets Management and Exavir Therapeutics and receives grant funding to his institution from Kowa Pharmaceuticals, Gilead Sciences, and Viiv Healthcare, unrelated to this project. For the remaining authors, no conflicts were declared.

Funding: This work was supported by the National Institutes of Health (grants K23HD100266, 1R01AG087809, T32DK007028, K23HL147799, 1R01HL173028, and P30DK040561) as well as the American Heart Association-Harold Amos Medical Research Faculty Development Program, supported by the Robert Wood Johnson Foundation, and the Robert A. Winn Excellence in Clinical Trials Award Program from the Bristol Myers Squibb Foundation. The funding organizations played no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.

Paper cited: Fourman LT et al. “Implications of the Lancet Commission Obesity Definition Among the All of Us Cohort” JAMA Network Open DOI: 10.1001/jamanetworkopen.2025.37619

About Mass General Brigham

Mass General Brigham is an integrated academic health care system, uniting great minds to solve the hardest problems in medicine for our communities and the world. Mass General Brigham connects a full continuum of care across a system of academic medical centers, community and specialty hospitals, a health insurance plan, physician networks, community health centers, home care, and long-term care services. Mass General Brigham is a nonprofit organization committed to patient care, research, teaching, and service to the community. In addition, Mass General Brigham is one of the nation’s leading biomedical research organizations with several Harvard Medical School teaching hospitals. For more information, please visit massgeneralbrigham.org.