Saturday, November 22, 2025

 

Soil carbon decomposition varies widely, with implications for climate models




Iowa State University
Image: Close-up of soil. Organic carbon decomposition in soil varies significantly and in regional patterns, driven in part by factors such as soil minerals and microbial properties that have been underrepresented in the carbon modeling that helps inform climate projections, according to new research led by Iowa State University ecologists. Credit: Bo Yi




AMES, Iowa – Soil stores more carbon than Earth’s atmosphere and plants combined, which makes the speed of soil carbon’s decomposition an important variable in models used to predict changes to our climate.

A new study by a team that includes four Iowa State University researchers found that even under uniform laboratory conditions, the rate of organic carbon decomposition in soil samples collected across the U.S. differed by up to tenfold, in part due to variations in soil mineral and microbial properties – factors that are often underrepresented in current Earth systems models.

Updating models with an improved understanding of the decomposability of organic carbon in soil – and its subsequent carbon dioxide emissions – could improve the accuracy of soil carbon feedback estimates in models, leading to more refined climate projections, said Chaoqun Lu, associate professor of ecology, evolution and organismal biology.

“For modeling simulations, we’ve traditionally simplified these variations by assuming carbon in similar soil types or in similar biomes decomposes at the same base rate, if no environmental changes are present. However, our findings show that the base rate actually varied a lot, even within the same soil or biome type. So this will really change a common practice,” said Lu, the corresponding author of the study recently published in One Earth.

Leveraging lab data

Scientists who work on Earth systems models – complex simulations that estimate the global effects of intertwined biological, geochemical and physical processes – have long known the model estimates of soil carbon decomposition have large uncertainties. 

In hopes of better quantifying those variations, Lu’s colleagues incubated soil samples from 20 sites in the National Ecological Observatory Network, a federally funded program that monitors ecosystems across the U.S. Over an 18-month period, researchers measured carbon dioxide emissions and key soil properties to inform a soil carbon model that estimated each sample’s decay rate (how fast organic matter breaks down) and carbon use efficiency (how much of the decomposed carbon is taken up by microbes).
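To make that modeling step concrete, here is a minimal sketch of fitting a single-pool, first-order decay model to incubation data, one common way decay rates and carbon use efficiencies are estimated; the pool structure, parameter values and measurements below are illustrative assumptions, not the study's actual model or data.

```python
# Minimal sketch: a single-pool, first-order soil carbon decay model fitted
# to incubation CO2 data. All numbers here are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

C0 = 20.0  # assumed initial soil organic carbon in the sample (mg C per g soil)

def cumulative_co2(t, k, cue):
    """Cumulative CO2-C respired by day t.

    Decomposed carbon is C0 * (1 - exp(-k * t)); the fraction (1 - cue) is
    respired as CO2, while the fraction cue is retained as microbial biomass.
    """
    return (1.0 - cue) * C0 * (1.0 - np.exp(-k * t))

# Hypothetical 18-month incubation measurements (days, mg CO2-C per g soil)
t_obs = np.array([30, 90, 180, 270, 360, 450, 540], dtype=float)
co2_obs = np.array([1.1, 2.9, 4.8, 6.0, 6.8, 7.3, 7.6])

(k_fit, cue_fit), _ = curve_fit(
    cumulative_co2, t_obs, co2_obs, p0=[0.005, 0.4],
    bounds=([1e-5, 0.0], [1.0, 0.9]),
)
print(f"decay rate k ~ {k_fit:.4f} per day, carbon use efficiency ~ {cue_fit:.2f}")
```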

Machine learning-assisted analysis helped show which of the 26 types of measurements taken from the soil samples were most strongly associated with decomposition variation, said study co-author Bo Yi, a former postdoctoral research associate in Lu’s lab and first author of the new study.
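As a rough illustration of that kind of analysis, the sketch below ranks a handful of hypothetical soil properties by how strongly a random-forest model relies on them to predict a synthetic decay rate; the predictor names, the synthetic data and the choice of random forest are assumptions for illustration, not the study's actual 26 measurements or method.

```python
# Illustrative machine-learning screen: which soil properties best explain
# variation in a fitted decay rate? Data and predictors are synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 156  # number of incubated soil samples reported in the study

# Hypothetical subset of measured soil properties
X = pd.DataFrame({
    "pH": rng.uniform(4.5, 8.5, n),
    "total_nitrogen": rng.uniform(0.05, 0.6, n),
    "fungal_abundance": rng.uniform(0.1, 5.0, n),
    "oxalate_Fe_Al": rng.uniform(0.1, 3.0, n),
    "clay_fraction": rng.uniform(0.05, 0.6, n),
})
# Synthetic target standing in for the fitted base decay rate
y = (0.002 + 0.003 * X["fungal_abundance"] - 0.001 * X["oxalate_Fe_Al"]
     + rng.normal(0, 0.0005, n))

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
for name, imp in sorted(zip(X.columns, model.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name:>18s}: {imp:.2f}")  # higher = more strongly associated
```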

Some controlling factors were already well established, such as soil type, pH and nitrogen levels. Analyzing the incubation data also revealed a strong connection between decomposition rates and the levels of fungi and certain forms of iron and aluminum. These soil minerals are tightly linked to the long-term stability of mineral-associated organic carbon, the portion of soil carbon that can persist in soil for decades or even hundreds of years.

Researchers combined their soil measurements with estimates of the base rates to build AI models that successfully captured the variations in those rates across 156 soil samples. They then applied that model to the continental U.S., creating maps that project carbon use efficiency and decay rates for individual land tracts measuring roughly 2.5 miles on each side. The maps show large regional variations in soil carbon dynamics across the U.S.
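The mapping step can be sketched under the assumption that a regressor of this kind is simply evaluated on gridded soil properties cell by cell; the stand-in model, predictor names and grid values below are invented, and the grid is coarsened so the example runs quickly.

```python
# Sketch of projecting fitted parameters onto a regular grid with cells
# roughly 2.5 miles (about 4 km) on a side, as the study's maps do.
# The regressor, predictors and gridded values are stand-ins for illustration.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

cols = ["pH", "total_nitrogen", "fungal_abundance", "oxalate_Fe_Al", "clay_fraction"]
rng = np.random.default_rng(1)

# Train a stand-in model on synthetic "samples" (the study used 156 real ones)
X_train = pd.DataFrame(rng.uniform(0.0, 1.0, (156, len(cols))), columns=cols)
y_train = 0.002 + 0.003 * X_train["fungal_abundance"] + rng.normal(0, 3e-4, 156)
model = RandomForestRegressor(n_estimators=100, random_state=1).fit(X_train, y_train)

# Hypothetical gridded predictors; coarsened here so the example runs quickly
ny, nx = 200, 450
grid = pd.DataFrame(rng.uniform(0.0, 1.0, (ny * nx, len(cols))), columns=cols)
decay_rate_map = model.predict(grid).reshape(ny, nx)  # one base rate per cell
print(decay_rate_map.shape)
```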

Implications for models and incentives

Scientists who work with soil carbon models or Earth systems models to project carbon-climate feedback are likely to use the study’s final parameter maps to improve their simulations, Lu said.

“These geochemical and microbial metrics drive a lot of variability, and we haven’t included them adequately in previous modeling work,” she said.

Lu said the study also shows models should account for how different components of soil carbon decompose, as mineral-associated organic carbon lasts much longer than particulate carbon – mostly plant-derived organic matter in soil that decays in years instead of centuries.

Beyond improved modeling, Lu said the research could also inform conservation and carbon market programs by revealing regional differences in soil carbon vulnerability. In the Southwest, organic carbon in soil tends to decompose more rapidly, and once it is decomposed, a greater proportion of that carbon is released into the atmosphere as carbon dioxide. In the Northwest and the East, soil carbon decomposes more slowly, and a larger share of decomposed carbon ends up being retained in the soil as microbial biomass. Most of the Midwest falls somewhere between the extremes.

Those differences suggest that incentives for increasing soil carbon sequestration should account for how long carbon persists in the soil, she said.

“If carbon remains in the soil longer in certain areas, the same amount of carbon sequestration there could be more valuable than in other areas,” she said.

 

Building a sustainable metals infrastructure: NIST report highlights key strategies




National Institute of Standards and Technology (NIST)





The National Institute of Standards and Technology (NIST) has published a report identifying strategies for developing a more efficient, sustainable and resilient U.S. metals processing infrastructure, where metals are used and reused more efficiently throughout the economy. The report highlights key challenges that must be addressed to achieve this goal, including a lack of robust standards for recycled content and supply chain vulnerabilities for critical materials.

Metals processing covers the entire material life cycle, including mining, alloy design, manufacturing, reuse and recycling. Making metals processing more sustainable and resilient against supply chain disruptions and resource scarcity is key to promoting U.S. innovation and industrial competitiveness while strengthening economic and national security.

The report, Material Challenges in Developing a Sustainable Metal Processing Infrastructure, is based on a NIST workshop held in July 2024.

“The workshop brought together a diverse group of experts from industry, academia and the policy world to take on some of the biggest challenges in the metals processing space,” said NIST materials research engineer Andrew Iams, a co-author on the report. “Meeting these challenges requires a new approach in how to source, process, use and recycle metals.”

The report covers various topics related to metals manufacturing, from new technologies for extracting and processing bulk materials, like aluminum and steel, to developing new modeling and data tools to design advanced alloys.

The report highlights the importance of critical materials, including minerals containing lithium and cobalt that are key manufacturing elements for smartphones, batteries, semiconductors and medical devices, as well as superalloys used in military hardware and jet engines.

These materials can be challenging to obtain due to limited availability and the risk of supply chain disruptions. Industries can address these issues by diversifying their supply chains with new sources, identifying substitute materials, and improving recycling methods to enable greater recirculation of existing materials.

The report also highlights the need to improve standards for metals reuse and recycling. Better standards can make the separation of metals for recycling more efficient, reducing industry costs. New certification programs can help ensure that products made with recycled content meet performance standards, which could expand the market for recycled materials.

The report highlights five strategies that would help the industry tackle these and other challenges:

  • Advance measurement science for sustainable metals manufacturing, including new separation techniques for recycling.
  • Develop the technical basis to support standards development, including the data needed to create or improve performance-based standards for highly recycled metals, such as aluminum and steel.
  • Enhance data and modeling tools for addressing supply risks and designing products for improved recyclability.
  • Promote workforce development and education by establishing training programs and creating partnerships between universities, labs and industry.
  • Convene stakeholders to establish collaborations that foster knowledge-sharing and innovation.

The NIST workshop brought together manufacturers, technology companies, researchers and other experts from all stages of the metals processing chain. NIST has a long history of convening stakeholders across industrial sectors to solve shared problems through better technology and standards.

“We are always seeking ways to help industrial partners solve tough engineering or scientific problems,” Iams said. “Part of NIST’s mission is to help keep U.S. industry competitive. We can do that by identifying promising technologies and helping to move them out of the lab so they can be implemented on an industrial scale.”

 

Why does AI struggle with logical thinking? An Emmy Noether Research Group aims to find out



Saarland University
Image: Computational linguistics professor Michael Hahn wants to fundamentally improve large language models such as ChatGPT. Credit: Thorsten Mohr





AI assistants have rapidly become part of everyday life – whether through our interactions with large language models like ChatGPT or in medical applications that help interpret complex datasets. Yet, despite their widespread use, AI systems still make surprisingly simple mistakes that persist even after extensive training. They also lack the ability to think logically or to truly ‘understand’ nested input. For Michael Hahn, Professor of Computational Linguistics at Saarland University, the problem lies in the architecture itself. According to Hahn, the fundamental design of large language models needs to change. But before researchers can make progress on these issues, they first need to gain a deeper theoretical understanding of current structures. Michael Hahn will now be able to pursue this goal with the €1.4 million in funding he has been awarded from the German Research Foundation’s Emmy Noether Programme.

Today’s large language models (LLMs) are built on the transformer architecture – a framework inspired by the human ability to focus attention on relevant information while ignoring less important details. Mathematically, this means prioritizing data that appears most relevant to the question at hand. ‘In doing so, these neural networks mimic another human trait, namely associative thinking, which is our ability to link ideas and memories,’ explains Professor Hahn. The AI searches vast datasets for patterns and connections, learning through continuous training. In this process of ongoing refinement, the system is exposed to new data and given corrective feedback, enabling it to retrieve the right information at the right time for increasingly precise results.
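For readers who want to see the mechanism, here is a minimal single-head sketch of the attention computation that underlies the transformer architecture described above; the dimensions, random weights and single-layer setup are simplifications for illustration, not a description of any particular production model.

```python
# Minimal single-head attention: each position weights ("attends to") other
# positions according to relevance scores, then mixes their values.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (sequence_length, d) arrays of queries, keys and values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over positions
    return weights @ V                                 # relevance-weighted mixture

rng = np.random.default_rng(0)
seq_len, d = 6, 8                                      # six tokens, 8-dim vectors
x = rng.normal(size=(seq_len, d))                      # token representations
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = scaled_dot_product_attention(x @ Wq, x @ Wk, x @ Wv)
print(out.shape)  # (6, 8): one updated representation per token
```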

But this approach has its flaws. ‘Serious errors can occur when the AI forms incorrect associations. These mistakes are compounded by the fact that current neural networks typically operate with a fixed number of layers in which the mathematical operations are carried out – thus limiting the network’s flexibility,’ explains Michael Hahn. Hahn and his team have already demonstrated mathematically that such networks make systematic errors – errors that cannot be eliminated by more training on even larger data sets or by using better prompts, i.e. more precise instructions to the AI.

According to Hahn, today’s large language models are hitting performance ceilings due to three main shortcomings. ‘Today’s LLMs are poor at handling changing conditions. They fail to update when a situation has changed.’ Hahn’s team tested this with a simple scenario in which several people pass two different books around a group. The AI’s job was to determine who holds which book at the end. The more times the books were passed around, the less accurate the AI became. In medical applications, this weakness can have potentially serious implications. ‘Medical AI systems generate connections between different types of data, such as diagnoses, medications and test results. If the AI does not assign the chronological sequence correctly and misinterprets the sequence of symptoms, diagnoses, test results and medication, there are potentially dangerous consequences for patients,’ says Hahn.
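The following sketch reconstructs the flavor of that book-passing test: it generates a sequence of pass events and tracks the ground-truth holders, which is the state a language model would need to maintain to answer correctly. It is an illustrative reconstruction, not the researchers' actual benchmark code.

```python
# Generate a book-passing episode and its ground-truth answer.
import random

def make_episode(people, n_passes, seed=0):
    rng = random.Random(seed)
    holders = {"book A": people[0], "book B": people[1]}   # initial holders
    lines = [f"{p} holds {b}." for b, p in holders.items()]
    for _ in range(n_passes):
        book = rng.choice(list(holders))
        giver = holders[book]
        receiver = rng.choice([p for p in people if p != giver])
        lines.append(f"{giver} passes {book} to {receiver}.")
        holders[book] = receiver                            # ground-truth state update
    prompt = " ".join(lines) + " Who holds each book now?"
    return prompt, holders

prompt, answer = make_episode(["Ana", "Ben", "Cleo", "Dev"], n_passes=12)
print(prompt)
print("ground truth:", answer)
# The article reports that LLM accuracy on questions like this drops as the
# number of passes grows, because the model fails to update its state.
```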

The second shortcoming is that today’s large language models lack logical reasoning. ‘AIs are not yet capable of thinking logically. Looking at the field of medicine again, if an AI is tasked with selecting the right medication for a specific clinical condition from a large database, it must be able to infer which symptoms correspond to that condition. Similarly, when assisting with a diagnosis, the AI must understand the rules doctors use to exclude certain diseases, which means it needs to understand how doctors rule out specific conditions if particular symptoms are absent. But this involves the systematic application of logical rules – something that is currently beyond the reach of today’s neural networks,’ says Hahn.
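By way of contrast, this kind of rule-out step is easy to express as explicit symbolic rules; the toy sketch below excludes any condition whose required symptoms are absent. The conditions, symptoms and rules are invented for illustration and carry no medical meaning.

```python
# Toy rule-based exclusion: rule out conditions whose required symptoms are
# absent, the sort of systematic logical step the article says LLMs struggle with.
required_symptoms = {            # condition -> symptoms it requires
    "condition_X": {"fever", "rash"},
    "condition_Y": {"fever", "cough"},
    "condition_Z": {"headache"},
}

def rule_out(observed_symptoms):
    """Return the conditions excluded because a required symptom is missing."""
    observed = set(observed_symptoms)
    return {c for c, req in required_symptoms.items() if not req <= observed}

print(rule_out({"fever", "headache"}))   # condition_X and condition_Y are ruled out
```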

The third shortcoming, Hahn believes, is that AI systems are unreliable when processing complex, nested inputs. ‘Large language models often fail to process intricate, layered information in a meaningful way. This becomes evident in legal contexts. Determining the liability of a person or company alleged to have harmed some other party requires an understanding of both the underlying legal principles involved and the chronology of the alleged events. Such reasoning chains, challenging even for humans, remain beyond the capabilities of the neural networks available today,’ explains Professor Hahn.

In his Emmy Noether research project, Michael Hahn will initially focus on the theoretical foundations of the transformer architecture. The aim is to gain a better understanding of the mathematical principles that underpin how neural networks arrive at their results. He will also explore how many layers these networks need in order to act more ‘intelligently’. In the next phase, Hahn plans to investigate hybrid systems or even design entirely new architectures that exhibit more predictable capabilities and that are both more reliable and more powerful than current large language models.

The Emmy Noether Programme, which is funded by the German Research Foundation (DFG), supports outstanding early-career researchers who have completed their doctorate within the past four years, have international experience and have completed a postdoctoral phase. With €1.4 million in funding, Hahn will now establish an Emmy Noether Research Group at Saarland University, working with five doctoral researchers on the project ‘Understanding and Overcoming Architectural Limitations in Neural Language Models’. This is already the third Emmy Noether Group to be approved for computer science research in Saarbrücken in 2025. The other two groups were recently launched at the Max Planck Institute for Computer Science (see press release dated 29 October 2025). This is a remarkable result given that nationwide only three Emmy Noether groups focused on computer science research were funded last year (see Gepris database).

Naval Research Laboratory and Florida International University advance non-contact detection of trace fentanyl



Naval Research Laboratory

Image: Ashley Fulton, Ph.D., U.S. Naval Research Laboratory research chemist, uses an ion mobility spectrometer to test a vapor generated from a trace explosive sensor testbed in Washington, D.C., Nov. 12, 2025. Fulton’s research explores the development of non-contact fentanyl detection to increase the safety of first responders. (U.S. Navy photo by Sarah Peterson)





WASHINGTON, D.C.  –  Scientists at the U.S. Naval Research Laboratory (NRL), working with Florida International University’s (FIU) Global Forensic and Justice Center, have demonstrated a new method for detecting trace levels of fentanyl using a silicon nanowire (SiNW) array that concentrates chemical vapors for handheld detection instruments. The breakthrough offers first responders a faster, safer way to identify fentanyl and related synthetic opioids without direct contact with the drug.

Fentanyl, a synthetic opioid roughly 50 times stronger than heroin, remains a leading cause of overdose deaths in the United States. Even trace amounts, as little as two milligrams, can be lethal if inhaled or absorbed through the skin. Current field methods for detecting fentanyl typically require handling bulk samples or destructive preparation, increasing the risk of accidental exposure and adding time to forensic analyses.

“Our sailors could come in contact with this substance, and it could possibly be used as a warfare agent. Having a detection method that’s readily available is important for protection,” said Ashley Fulton, Ph.D., chemistry researcher at NRL and principal investigator on the project.

The NRL–FIU team focused on developing a non-contact detection approach, a way to test for fentanyl without touching or handling it directly, to keep first responders safe. Instead of looking for fentanyl itself, they targeted a chemical marker the drug gives off in air called N-phenylpropanamide (NPPA), a tiny by-product molecule that forms as fentanyl breaks down and acts like a unique “fingerprint.”

The team used a sampling method called solid phase microextraction (SPME): a special fiber essentially “soaks up” trace chemicals from the air, and a gas chromatography–mass spectrometry (GC-MS) instrument, a lab tool that separates and identifies molecules, then analyzes those vapors. The method proved that NPPA is a reliable vapor signal for fentanyl and its close relatives. They then fine-tuned a portable ion mobility spectrometer (IMS), a handheld detector that measures how charged particles move through gas, so it could recognize the NPPA signal at amounts as small as five nanograms (about five-billionths of a gram) without being tricked by common drug fillers such as mannitol, lactose, or acetaminophen.

To further enhance sensitivity, the team integrated a silicon nanowire preconcentration array coated with an acrylate-based polymer. This configuration increased NPPA detection up to 14-fold in laboratory testing and maintained performance even in the presence of street-level adulterants. The silicon nanowire device adsorbs target vapors and then releases them as a concentrated pulse, allowing handheld detectors to register previously undetectable trace amounts.
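As a back-of-envelope illustration of why releasing the collected vapor as a concentrated pulse matters, the sketch below combines the reported roughly 14-fold gain with the roughly five-nanogram IMS threshold mentioned earlier; the sample amount is invented, and treating the gain as acting directly on that threshold is a simplifying assumption for illustration, not a figure from the study.

```python
# Back-of-envelope: effect of a preconcentration gain on a handheld detector.
# Only the ~14-fold gain and ~5 ng IMS threshold come from the article.
ims_threshold_ng = 5.0         # smallest NPPA amount the tuned IMS registered
preconcentration_gain = 14.0   # reported signal boost from the nanowire array

sample_ng = 1.0                # hypothetical trace amount, below the direct threshold
direct = sample_ng
pulsed = sample_ng * preconcentration_gain   # vapor collected, then released as one pulse

print(f"direct sampling: {direct:4.1f} ng-equivalent -> detected: {direct >= ims_threshold_ng}")
print(f"with nanowires:  {pulsed:4.1f} ng-equivalent -> detected: {pulsed >= ims_threshold_ng}")
```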

“This work demonstrates a promising path toward rapid, non-contact detection of fentanyl that could help protect law enforcement officers, customs officials, and emergency responders,” Fulton said. “Fentanyl is a highly potent drug that can cause overdose at very, very low concentrations. Law enforcement officers often feel high anxiety when they encounter unknown substances. This technology gives a preventative, presumptive identification so the proper care can be taken at the scene of a crime.”

Field tests with confiscated fentanyl samples from the Drug Enforcement Administration and Maryland State Police laboratories confirmed the method’s ability to detect NPPA in high-purity samples and complex street mixtures.

“The next step is demonstrating the full capability of the silicon nanowire system for trace detection,” Fulton said. “Our goal is to have a prototype ready by the end of 2026.”

This research was supported through an Interagency Agreement with the National Institute of Justice, Office of Justice Programs, U.S. Department of Justice.

About the U.S. Naval Research Laboratory
NRL is a scientific and engineering command dedicated to research that drives innovative advances for the U.S. Navy and Marine Corps from the seafloor to space and in the information domain. NRL is located in Washington, D.C., with major field sites at Stennis Space Center, Mississippi; Key West, Florida; and Monterey, California, and employs approximately 3,000 civilian scientists, engineers and support personnel.
 
NRL offers several mechanisms for collaborating with the broader scientific community, within and outside of the Federal government. These include Cooperative Research and Development Agreements (CRADAs), LP-CRADAs, Educational Partnership Agreements, agreements under the authority of 10 USC 4892, licensing agreements, FAR contracts, and other applicable agreements.
 
For more information, contact NRL Corporate Communications at NRLPAO@us.navy.mil.

Image: Ashley Fulton, Ph.D., U.S. Naval Research Laboratory research chemist, poses for a photo with a trace explosive sensor testbed system in Washington, D.C., Nov. 12, 2025. Fulton’s research explores the development of non-contact fentanyl detection to increase the safety of first responders. (U.S. Navy photo by Sarah Peterson)