Monday, August 18, 2025

 

New research ferments the perfect recipe for fine chocolate flavor





University of Nottingham

Image: Cacao pod. Credit: Mimi Chu Leung





Researchers have identified key factors that influence the flavour of chocolate during the cocoa bean fermentation process, a discovery that could offer chocolate producers a powerful tool to craft consistently high-quality, flavour-rich chocolate.

Scientists from the University of Nottingham’s School of Biosciences examined how cacao bean temperature, pH, and microbial communities interact during fermentation and how these factors shape chocolate flavour. The team identified key microbial species and metabolic traits associated with fine-flavour chocolate and found that both abiotic factors (such as temperature and pH) and biotic factors (the microbial communities) are strong, consistent indicators of flavour development. The study has been published today in Nature Microbiology.

The quality and flavour of chocolate begin with the cacao bean, which is profoundly influenced by both pre- and post-harvest factors. Among these, fermentation is the first and one of the most critical steps after harvest. It lays the foundation for aroma development, flavour complexity, and the reduction of bitterness in the final chocolate product.

 

Dr David Gopaulchan, the first author of the paper, from the School of Biosciences explains: “Fermentation is a natural, microbe-driven process that typically takes place directly on cocoa farms, where harvested beans are piled in boxes, heaps, or baskets. In these settings, naturally occurring bacteria and fungi from the surrounding environment break down the beans, producing key chemical compounds that underpin chocolate’s final taste and aroma. However, this spontaneous fermentation is largely uncontrolled. Farmers have little influence over which microbes dominate or how the fermentation process unfolds. As a result, fermentation, and thus the flavour and quality of the beans, varies widely between harvests, farms, regions, and countries.”

The researchers wanted to find out whether this unstable, natural process could be replicated and controlled in the lab. Working with Colombian farmers during the fermentation process, they identified the factors that influence flavour. They were then able to use this knowledge to create a lab fermentation process and develop a defined microbial community, a curated mix of bacteria and fungi, capable of replicating the key chemical and sensory outcomes of traditional fermentations. This synthetic community successfully mimicked the dynamics of on-farm fermentations and produced chocolate with the same fine-flavour characteristics.

Dr David Gopaulchan adds: “The discoveries we have made are really important for helping chocolate producers to be able to consistently maximise their cocoa crops as we have shown they can rely on measurable markers such as specific pH, temperature, and microbial dynamics, to reliably predict and achieve consistent flavour outcomes. This research signals a shift from spontaneous, uncontrolled fermentations to a standardized, science-driven process. Just as starter cultures revolutionized beer and cheese production, cocoa fermentation is poised for its own transformation, powered by microbes, guided by data, and tailored for flavour excellence. By effectively domesticating the fermentation process, this work lays the foundation for a new era in chocolate production, where defined starter cultures can standardise fermentation, unlock novel flavour possibilities, and elevate chocolate quality on a global scale.”
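As a rough illustration of how such measurable markers could feed a predictive tool, the sketch below fits a simple classifier to synthetic fermentation records in Python. It shows the general idea only: the variables, thresholds and data are invented, not the authors’ model.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 200  # synthetic fermentation batches

    temp = rng.normal(45, 3, n)         # peak fermentation temperature (deg C)
    ph = rng.normal(4.5, 0.4, n)        # bean pH at the end of fermentation
    microbes = rng.normal(0.5, 0.2, n)  # invented flavour-community abundance score

    # Invented stand-in for "fine flavour develops under the right temperature/pH
    # trajectory with the right microbial community", plus noise.
    signal = 0.8 * (temp - 45) - 2.0 * np.abs(ph - 4.8) + 3.0 * microbes
    fine_flavour = (signal + rng.normal(0, 1, n) > 0).astype(int)

    X = np.column_stack([temp, ph, microbes])
    X_tr, X_te, y_tr, y_te = train_test_split(X, fine_flavour, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")

In practice, the study’s measured markers and defined microbial community would replace the synthetic inputs here.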

 

People with sensitive personalities more likely to experience mental health problems



The first systematic review and meta-analysis of its kind shows highly sensitive people are more likely to experience mental health problems




Queen Mary University of London





The meta-analysis of 33 studies, the first of its kind, looked at the relationship between sensitivity and common mental health problems such as depression and anxiety. Researchers found there was a significant, positive relationship between the two, concluding that highly sensitive people are more likely to experience depression and anxiety compared to those who are less sensitive.
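For readers curious about the mechanics, a meta-analysis of correlations typically converts each study’s correlation to Fisher’s z, weights studies by their precision, and pools them under a random-effects model. The Python sketch below shows that standard recipe with invented numbers; it is not the paper’s data or code.

    import numpy as np

    # Five invented (correlation, sample size) pairs; the review pooled 33 studies.
    r = np.array([0.25, 0.31, 0.18, 0.40, 0.28])
    n = np.array([120, 85, 200, 60, 150])

    z = np.arctanh(r)  # Fisher z transform of each correlation
    v = 1.0 / (n - 3)  # approximate sampling variance of each z
    w = 1.0 / v        # inverse-variance (fixed-effect) weights

    # DerSimonian-Laird estimate of between-study variance tau^2
    z_fe = np.sum(w * z) / np.sum(w)
    Q = np.sum(w * (z - z_fe) ** 2)
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(r) - 1)) / C)

    w_re = 1.0 / (v + tau2)  # random-effects weights
    z_pooled = np.sum(w_re * z) / np.sum(w_re)
    print(f"pooled r = {np.tanh(z_pooled):.2f}")  # back-transform to a correlation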


In the study, sensitivity was defined as a personality trait that reflects people’s capacity to perceive and process environmental stimuli such as bright lights, subtle changes in the environment and other people’s moods. Sensitivity is often overlooked in mental health studies and clinical practice, which tend to focus on neuroticism and its association with mental health conditions; this research shows that understanding a person’s sensitivity level is important and can have therapeutic implications.

For example, people with more sensitive personality traits may be more likely to benefit from treatment plans which involve techniques such as applied relaxation and mindfulness, which can also prevent relapse. 

Tom Falkenstein, a psychotherapist and a PhD student at Queen Mary University of London, said: 
“This is the most extensive systematic review on sensitivity and mental health in adolescents and adults to date, and is the first ever meta-analysis on the topic to estimate the impact of this relationship. We found positive and moderate correlations between sensitivity and various mental health problems such as depression, anxiety, post-traumatic stress disorder, agoraphobia and avoidant personality disorder. Our findings suggest that sensitivity should be considered more in clinical practice which could be used to improve diagnosis of conditions.”

“In addition, our findings could help improve treatment for these individuals. Around 31% of the general population are considered highly sensitive, and, as our findings show, are more likely to respond better to some psychological interventions than less sensitive individuals. Therefore, sensitivity should be considered when thinking about treatment plans for mental health conditions. Our work shows it is crucial that the awareness of sensitivity is improved among mental health care professionals, so clinicians and practitioners can recognise the trait in their patients, and tailor treatment to their sensitivity.”

Michael Pluess, Professor in Developmental Psychology at the University of Surrey and Visiting Professor at Queen Mary University of London, said:

“This is the first meta-analysis providing robust evidence that highly sensitive people are more prone to common mental health problems. However, it is important to remember that highly sensitive people are also more responsive to positive experiences, including psychological treatment. Our results provide further evidence that sensitive people are more affected by both negative and positive experiences and that the quality of their environment is particularly important for their well-being.”


The systematic review and meta-analysis of 33 studies was carried out by an academic team from several universities, including Queen Mary University of London and the University of Surrey.

The paper will be available at 0800 BST on Saturday 16th August.

Authors include Tom Falkenstein (Queen Mary University of London), Luke Satori (King’s College London), Margherita Malanchini (Queen Mary University of London), Kristin Hadfield (Trinity College Dublin) and Michael Pluess (University of Surrey).

 

What do you do if your dog ingests cocaine? How one researcher is trying to protect pets from future accidents



Q&A with Dr Jake Johnson, cardiology resident at North Carolina State University’s College of Veterinary Medicine and author of a recent Frontiers in Veterinary Science article



Frontiers





What inspired you to become a researcher?

I was introduced to the One Health Initiative during my undergraduate studies, which opened my eyes to the power of collaborative science. This initiative promotes collaboration between veterinary medicine, human medicine, environmental disciplines, and other scientific fields to advance our collective well-being. What fascinated me was how interconnected our health challenges really are – diseases don't respect boundaries between species or environments. I realized that the discoveries we make in veterinary medicine often have broader implications, and research provides the platform to share these insights across disciplines. Ultimately, I believe all our knowledge has greater value when it's passed on to others, and research gives me the opportunity to be part of that knowledge-sharing ecosystem that can create ripple effects far beyond any single case I might treat.

 

Can you tell us about the research you’re currently working on?

Throughout my cardiology residency and beyond, I'm focusing my research efforts on bridging the gap between advanced cardiac knowledge and practical clinical application, with a particular emphasis on education. I hope to increase accessibility to both veterinary students and practicing clinicians who may have limited access to specialized cardiac training. My previous research includes a case report involving a cat with an unusual form of heart failure and the use of imaging modalities to help guide different approaches in clinical practice.

 

How do vets study cocaine toxicosis in animals?

Most of our research on this topic has been done in an experimental, laboratory setting. Given the limited data available in canines, veterinary research has had to extrapolate from human medicine. Unfortunately, these experimental scenarios oftentimes do not mimic real-life, clinical scenarios. As such, a limited number of reports exist. Another challenge regarding this topic is that pet owners may not know what their dog consumed off the floor at any time or may be afraid to admit what they know may have happened. This creates a clear gap between our controlled research and messy clinical reality.

 

Which symptoms did the chihuahua – the patient in the current study – present with, and how did you arrive at the diagnosis?

The dog initially arrived at the referring clinic with collapsing episodes and an abnormally low heart rate. Given previously suspected illicit substance ingestion, a point-of-care urine drug screen was performed, which was positive for cocaine; this was later verified by a send-out test.

 

What steps can owners take to prevent their pets from accidentally ingesting such substances?

Dogs are natural scavengers and will investigate anything on the ground, so be vigilant during walks. Pet owners can prevent ingestion by keeping their dog on a leash, watching for any attempts to pick up or consume unknown items from the ground, and using a basket muzzle if necessary. Train your dog with ‘leave it’ and ‘drop it’ commands, which can be lifesaving if they encounter dangerous substances. If you suspect any exposure or feel as though your dog is not acting normally, it’s important to seek immediate veterinary attention - early intervention can be lifesaving.

 

In your opinion, why is your research important?

Case reports are crucial in veterinary medicine because they provide real-world examples. They capture clinical scenarios that larger studies might miss, preserve unusual presentations for future reference, and help build our collective understanding of rare presentations, ultimately improving emergency preparedness and treatment protocols.

 

Are there any common misconceptions about this area of research? How would you address them?

A major misconception in toxicology research, particularly with illicit substances, is that veterinarians will judge owners or report them to authorities if they admit their pet may have ingested drugs. This fear prevents crucial information sharing that could save lives. Being honest about potential exposure can be lifesaving – the more information you can provide about what, when, and how much your pet may have consumed, the better we can tailor treatment and monitor for specific complications. What matters most is getting your pet immediate help, not assigning blame.

 

What are some of the areas of research you’d like to see tackled in the years ahead?

Veterinary medicine is growing at an incredible rate. We now have access to procedures like open heart surgery, kidney transplants, dialysis, and bronchial stents that were once thought of as far-fetched. However, the real challenge lies in making these life-saving advancements accessible and commonplace. We desperately need research focused on reducing barriers to entry and improving access to advanced care. This means developing studies on simplified imaging techniques that can be performed with standard equipment, creating cost-effective screening methods, and designing educational interventions that can translate specialized knowledge into practical tools for everyday veterinary practice. I'd also like to see research that addresses the economic realities of veterinary care - studies on alternative treatment protocols that achieve similar outcomes at lower costs, training programs in basic specialty procedures for general practitioners, and telemedicine applications for specialist consultations. The goal should be democratizing advanced veterinary care so that a dog in a rural clinic has access to similar life-saving treatments as one in a metropolitan specialty hospital. Only then will these incredible medical advances truly fulfill their potential to improve animal welfare on a broader scale.

 

How has open science benefited the reach and impact of your research?

Information is only useful to people if they have access to it. By eliminating barriers to entry, knowledge can reach a wider audience. My goal is to ensure peers have the knowledge base to practice facets of cardiology, even with limited resources.

 

Texas study reveals heat waves can cause more polluted air



American Chemical Society






WASHINGTON, Aug. 18, 2025 — Heat waves are becoming more common, severe and long-lasting. These prolonged periods of hot weather are especially dangerous in already hot places like Texas. In 2023, more than 300 people in Texas died from heat, according to the Texas Department of State Health Services, the most since the state began tracking such deaths in 1989. Researchers have now found that it may not be only high temperatures that make heat waves unsafe, but also a heat-related increase in airborne pollutants.

Bianca Pamela Aridjis-Olivos, a graduate student in aerosol and atmospheric chemistry at Texas A&M University, will present her team’s results at the fall meeting of the American Chemical Society (ACS). ACS Fall 2025 is being held Aug. 17-21; it features about 9,000 presentations on a range of science topics.

In 2023, as intense heat raged across Texas, Aridjis-Olivos started working with faculty mentor Renyi Zhang and colleagues in the Center for Atmospheric Chemistry and the Environment (CACE) at Texas A&M. Because heat waves are predicted to worsen across the U.S. in the next few years, the CACE team was motivated to determine how extreme heat impacts atmospheric chemistry and air quality.

The researchers designed and conducted a pilot study of atmospheric chemistry during Texas’ August 2024 heat wave. They collected air samples throughout the day and night from Aug. 5 to Sept. 3 on the university campus in College Station, TX, where temperatures ranged from 90 to 106 degrees Fahrenheit (32 to 41 degrees Celsius). Additionally, the air sampling occurred when there were no wildfires nearby, isolating the effects of the heat wave itself without the influence of wildfire smoke on air quality.

The researchers analyzed air samples for pollutants of public health concern, including nitrogen oxides, ozone, volatile organic compounds (VOCs) and nanoparticles. For this work, they used a suite of sensitive instruments to detect trace gases and measure aerosol properties, including a proton transfer reaction time-of-flight mass spectrometer (PTR-ToF-4000).

“You can think of it like a super sensitive nose,” explained Aridjis-Olivos, adding that the PTR-ToF-4000 sniffs for VOCs, then “gently marks them so they could be identified and weighed in real time.”

Results showed concerning levels of ozone, oxygenated VOCs and acid-rich nanoparticles that increased in concentration with outdoor temperatures. The researchers also observed increased levels of airborne pollutants created by sunlight-fueled chemical reactions in the air. Notably, they discovered that, during heat waves, trees release more natural VOCs, including isoprene, a precursor to ozone, which could be harmful in heavily wooded places such as College Station.

“It really was surprising, how these emissions from trees increase during heat waves and interact with air pollution,” says Aridjis-Olivos, referring to the region’s ample oak trees. “On their own, the trees’ emissions aren’t dangerous. It’s when they react with other emissions under strong solar radiation that we get elevated ozone and secondary organic aerosols that are dangerous for the public health.”
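For context, the sunlight-fueled chemistry at work is essentially the textbook photochemical ozone cycle, shown here in simplified form (not the study’s full mechanism): sunlight splits NO2, the freed oxygen atom makes ozone, and peroxy radicals (RO2) formed as VOCs such as isoprene oxidize convert NO back to NO2 without consuming ozone, so ozone accumulates.

    \mathrm{NO_2} + h\nu \rightarrow \mathrm{NO} + \mathrm{O}
    \mathrm{O} + \mathrm{O_2} + M \rightarrow \mathrm{O_3} + M
    \mathrm{RO_2} + \mathrm{NO} \rightarrow \mathrm{RO} + \mathrm{NO_2}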

The researchers are currently analyzing additional data from their August 2024 fieldwork. In the meantime, they share recommendations for keeping safe during heat waves:

  • Stay indoors during peak sunlight hours (typically noon to 4 p.m.), when temperatures and ozone levels are highest.
  • Avoid exercising or commuting near major roadways or urban hot spots during extreme heat.
  • Keep an eye on the local air quality index and adjust outdoor plans accordingly, when possible.
  • Keep windows closed to limit exposure to outdoor air pollutants.

While these tips can help in the short term, the researchers say real progress depends on understanding how climate change affects atmospheric chemistry and air quality so scientists can better predict pollutant formation and protect public health.

The research was funded by Texas A&M University’s vice president of research, ASCEND seed grant initiative, and the Center for Atmospheric Chemistry and the Environment.

Visit the ACS Fall 2025 program to learn more about this presentation, “Measuring gas-phase air pollutants during heatwave conditions” and other science presentations.

###

The American Chemical Society (ACS) is a nonprofit organization founded in 1876 and chartered by the U.S. Congress. ACS is committed to improving all lives through the transforming power of chemistry. Its mission is to advance scientific knowledge, empower a global community and champion scientific integrity, and its vision is a world built on science. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, e-books and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio. 

To automatically receive press releases from the American Chemical Society, contact newsroom@acs.org

Note to journalists: Please report that this research was presented at a meeting of the American Chemical Society. ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies.


Title
Measuring gas-phase air pollutants during heatwave conditions

Abstract
Climate change and air pollution are two interconnecting environmental crises that confront humankind in the 21st century and are closely tied to the variations in atmospheric compositions. Since pre-industrial times, rising greenhouse gases and aerosol levels have disrupted the Earth’s radiative balance, altered geochemical cycles, and impacted air quality and health. However, gas-to-particle conversion and changes in gas-phase composition remain poorly understood, as these interconnected atmospheric physicochemical processes continue to challenge efforts to predict future climate. The record-breaking 2023 heat waves in the southern U.S. highlight the urgent need to understand how extreme heat affects atmospheric chemistry, particularly NOx, O3, volatile organic compounds (VOCs), and secondary organic aerosols (SOAs). Recognizing this critical gap, a pilot field campaign program was developed and conducted in August 2024 in College Station, Texas, bringing together multiple research groups from the Center for Atmospheric Chemistry and the Environment (CACE) at Texas A&M. An Ionicon proton transfer reaction time-of-flight mass spectrometer (PTR-ToF-4000), equipped with selective reagent ion sources (SRI), enables detailed gas-phase analysis of VOCs and their aerosol formation potential under heatwave conditions. Additionally, concurrent measurements of O3 and NOx provide insight into photochemical processes under extreme heat conditions, both of which are discussed in this presentation. This study aims to improve air quality management strategies and inform public health responses by deepening our understanding of the interactions between heatwaves, VOC emissions, and pollutant formation, ultimately contributing to enhanced preparedness and mitigation efforts.

 

A potential ‘green’ alternative to formaldehyde and PFAS in fabric finishing




American Chemical Society

Image: Untreated cotton fabric absorbs water droplets (top) and cotton fabric treated with a form of cottonseed oil repels water (bottom). Credit: Richard Venditti




WASHINGTON, Aug. 18, 2025 — More than half of the 7.5 million bales of cotton produced annually in the U.S. is used in clothing manufacturing. The finishing techniques used to make cotton fabric smooth, water-repellent and resistant to wrinkling can be detrimental to the environment and the wearer. Now, researchers propose a method for using cottonseed oil as a “greener” and safer alternative to formaldehyde and per- and polyfluoroalkyl substances (PFAS), often called forever chemicals, when finishing cotton fabrics.

Taylor Kanipe, a graduate student at North Carolina State University (NC State), will present her results at the fall meeting of the American Chemical Society (ACS). ACS Fall 2025 is being held Aug. 17-21; it features more than 9,000 presentations on a range of science topics.

The process for harvesting cotton and creating fabric for textiles includes collecting the wispy cellulose fibers of the cotton boll, removing the cotton seeds interspersed in the fibers, spinning the cotton into yarn, weaving the yarn into fabric and then finishing the fabric with a variety of chemicals that alter its physical properties — for example, making it softer or wrinkle resistant.

Formaldehyde-based resins have traditionally been used as fabric finishing agents. The sticky resin easily binds to cotton’s cellulose fibers, forming chemical bridges to make the long cellulose fibers resistant to wrinkling or stretching. While formaldehyde is cheap, easy to use and highly reactive, at high concentrations it is considered a Class 1 carcinogen. Formaldehyde can also cause skin and respiratory irritation. Fluorine-containing water-repellent coatings create a hydrophobic surface to make cotton fabric water-resistant. However, these coatings contain PFAS, such as perfluorooctane sulfonate and perfluorooctanoic acid, and are being phased out due to their persistent nature and potential link to health conditions.

To eliminate the need for formaldehyde-based resins and PFAS in cotton fabric finishing, a group led by Richard Venditti, a professor of forest biomaterials, paper science and engineering at NC State, set out to create a green alternative by chemically altering seed oil from the cotton plant itself. Drawing on previous research at NC State, Kanipe, Venditti and colleagues took advantage of specific chemical properties in cottonseed oil to insert epoxy groups along the long carbon chains that make up the oil molecules. The epoxide group allows epoxidized cottonseed oil (ECSO) molecules to create strong chemical bonds with the cellulose fibers in cotton fabric and with each other, forming a polymer and making the fabric hydrophobic. The epoxy groups also create oil molecule bridges between the cellulose fibers, making the fabric resistant to wrinkling. 

In addition to fabric finishing, ECSO could provide a use for the cottonseed oil harvested along with the cotton fibers, making it as inexpensive, easy to use and effective as formaldehyde resins.

“Epoxidized vegetable oils have a range of applications,” Kanipe explains. “While native cottonseed oil lacks the reactivity of formaldehyde-based resins, this simple epoxidation process produces a safer, more user-friendly alternative for applications like durable press finishes.”

The researchers weighed and chemically analyzed the ECSO-treated fabric using a type of infrared spectroscopy to ensure the ECSO molecules had successfully bonded to the fabric’s surface. To evaluate the finished fabric’s water-repellent qualities, the researchers used a high-speed camera to measure the contact angle formed where water droplets meet the cotton surface. The larger the angle between the water droplet and the surface of the fabric, the greater the water resistance. Untreated fabric showed no contact angle (in other words, the water was fully absorbed into the fabric), while ECSO-treated fabric showed a contact angle of 125 degrees, indicating a significant increase in water-repelling ability.
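As an aside on the geometry, a static contact angle is often estimated from a droplet photo using the spherical-cap approximation, from the droplet’s height and base radius. The release does not describe the team’s exact image-analysis procedure, so the Python sketch below is a generic illustration with invented measurements.

    import math

    def contact_angle_deg(height: float, base_radius: float) -> float:
        """Spherical-cap estimate of contact angle: theta = 2 * atan(h / a)."""
        return math.degrees(2.0 * math.atan2(height, base_radius))

    # A droplet about 1.9x as tall as its base radius sits near 125 degrees,
    # the hydrophobic regime reported for the ECSO-treated fabric.
    print(f"{contact_angle_deg(1.9, 1.0):.0f} degrees")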

Future studies will measure additional performance factors in ECSO-treated cotton fabric, including tear strength, durability and wrinkle resistance. The team’s ultimate goal is to create a process of treating cotton with an ECSO water emulsion, a green process that does not require hazardous finishing substances.

“If we can achieve our goal of changing the properties of the cotton fabric — making it anti-wrinkle, anti-staining and water-resistant — using a water-based process, we’ll have a green process for putting a bio-based material onto cotton as a replacement for formaldehyde- and PFAS-based finishes,” says Venditti.

This research was funded by Cotton Incorporated and an Agriculture and Food Research Initiative from the U.S. Department of Agriculture’s National Institute of Food and Agriculture.

Visit the ACS Fall 2025 program to learn more about this presentation, “Sustainable cotton fabric finishing: epoxidized cottonseed oil as a bio-based alternative to formaldehyde-based treatments” and other science presentations.


Title
Sustainable cotton fabric finishing: epoxidized cottonseed oil as a bio-based alternative to formaldehyde-based treatments

Abstract
The textile industry is moving away from formaldehyde-based durable press finishes due to regulatory and environmental concerns, creating a need for sustainable alternatives. Cotton fabric, widely used in apparel and home textiles, benefits from treatments that enhance durability, wrinkle resistance, and hydrophobicity. However, conventional finishing agents often rely on synthetic chemicals, which pose potential health and environmental risks. This study explores the use of epoxidized cottonseed oil (ECSO) as a bio-based alternative for modifying cotton fabric and cellulose-based materials. Cotton fabric and filter paper were grafted with ECSO using a solvent-based system, varying catalyst concentrations, reaction times, and ECSO loadings. ATR-FTIR analysis confirmed successful modification through the presence of carbonyl peaks (1740 cm-1), while water contact angle (WCA) measurements increased from 0° to 125°, demonstrating significant hydrophobicity enhancement. Additionally, stable ECSO oil-in-water emulsions were developed using green emulsifiers and applied to cotton fabric via a pad-dry-cure process to assess their performance as fabric softeners and wrinkle-resistant finishes. These results highlight the potential of ECSO as a sustainable textile treatment, offering improved fabric performance without the use of formaldehyde-based chemistry. Continued refinement of curing conditions, including time, temperature, and catalyst loading, seeks to improve treatment effectiveness. Future research will assess additional performance factors, such as tear strength, smoothness, wrinkle resistance, and durability, to validate the feasibility of ECSO-based finishes for industrial use.

 

Nuclear waste could be a source of fuel in future reactors



American Chemical Society





WASHINGTON, Aug. 18, 2025 — From electric cars to artificial intelligence (AI) data centers, the technologies people use every day are driving a growing need for electricity. In theory, nuclear fusion — a process that fuses atoms together, releasing heat that can drive generators — could provide vast energy supplies with minimal emissions. But nuclear fusion is an expensive prospect because one of its main fuels is a rare version of hydrogen called tritium. Now, researchers are developing new systems to use nuclear waste to make tritium.

Terence Tarnowsky, a physicist at Los Alamos National Laboratory (LANL), will present his results at the fall meeting of the American Chemical Society (ACS). ACS Fall 2025 is being held Aug. 17-21; it features about 9,000 presentations on a range of science topics.

Today’s nuclear power plants generate energy through a process called nuclear fission. During nuclear fission, a plutonium or uranium atom splits to release energy and particles called neutrons, which go on to split more atoms. This fission chain reaction provides a steady stream of energy but also results in long-lived nuclear waste. Proposed nuclear fusion power plants would generate energy by combining atomic nuclei. With fusion, forms of hydrogen, called deuterium and tritium, would join to create heavier atoms. This process, which powers stars in the universe, releases a large amount of energy and, unlike fission, produces very little radioactive waste.  

While deuterium is readily available, the U.S. currently lacks a secure and predictable supply of tritium. “Right now, the value of commercial tritium is about $15 million per pound [$33 million per kilogram], and the U.S. doesn’t have any domestic capability to create it,” says Tarnowsky. “So, we have this tritium supply shortage.”

Tritium occurs naturally in the upper atmosphere, and the current major commercial source is fission reactors in Canada. “The total tritium inventory on the planet is about 55 plus or minus 31 pounds [25 plus or minus 14 kilograms],” says Tarnowsky. “Making some assumptions, 55 pounds [25 kilograms] is enough tritium to power more than 500,000 homes for six months. This is more than the residential units in Washington, D.C.”
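As a rough back-of-envelope (our illustration, not a calculation from the presentation), the quoted figure can be sanity-checked in Python from the 17.6 MeV released per D-T fusion reaction; the household consumption figure and the choice to count raw fusion heat, before any conversion losses, are assumptions.

    AVOGADRO = 6.022e23  # atoms per mole
    EV_TO_J = 1.602e-19  # joules per electronvolt

    mev_per_dt_reaction = 17.6                # one tritium atom consumed per reaction
    atoms_per_kg = 1000.0 / 3.016 * AVOGADRO  # tritium molar mass ~3.016 g/mol
    joules_per_kg = atoms_per_kg * mev_per_dt_reaction * 1e6 * EV_TO_J

    inventory_j = 25 * joules_per_kg   # ~1.4e16 J of raw fusion heat in 25 kg
    home_6mo_j = (10_500 / 2) * 3.6e6  # assuming ~10,500 kWh per U.S. home per year

    print(f"homes powered for six months: {inventory_j / home_6mo_j:,.0f}")  # ~745,000

Counting raw fusion heat this way lands in the same ballpark as the quoted “more than 500,000 homes”; a realistic thermal-to-electric conversion efficiency would lower the number proportionally.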

Unlike tritium, nuclear waste is something the U.S. has in abundance: thousands of tons produced by commercial nuclear power plants. This waste contains highly radioactive materials, which require expensive storage to keep them safely contained. Long-term storage raises concerns about radiation leaks into the environment, with the potential to harm plants and wildlife or cause cancer in humans.

So, Tarnowsky saw an opportunity to assess the feasibility of using still-radioactive nuclear waste to generate valuable tritium. He has conducted multiple computer simulations of potential tritium reactors to evaluate the designs’ production and energy efficiency.

The simulated reactor designs use a particle accelerator to jump-start atom-splitting reactions in the nuclear waste. As atoms divide in the simulation, they release neutrons and ultimately produce tritium after a series of other nuclear transitions. The accelerator feature would allow operators to turn these reactions on or off and is considered safer than the chain reactions that take place in a typical nuclear power plant. Although the basic principles of the design are not new, advances in technology could make it more efficient than when it was first considered in the 1990s and early 2000s, says Tarnowsky.

So far, he estimates that this theoretical system, running at 1 gigawatt of power (over a year, the total energy needs of about 800,000 U.S. homes), could produce about 4.4 pounds (2 kilograms) of tritium per year. This amount is on par with the total yearly output from all reactors in Canada. A key advantage of Tarnowsky’s system would be the efficiency of tritium production. He projects that the design would produce more than 10 times as much tritium as a fusion reactor at the same thermal power.
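The power-to-homes equivalence also checks out on a napkin, assuming an average U.S. home uses roughly 10,500 kWh of electricity per year (our assumption):

    1\,\text{GW} \times 1\,\text{yr} \approx 10^{9}\,\text{W} \times 3.15\times10^{7}\,\text{s} \approx 3.2\times10^{16}\,\text{J} \approx 8.8\times10^{9}\,\text{kWh}

    \frac{8.8\times10^{9}\,\text{kWh}}{10{,}500\,\text{kWh per home}} \approx 8.4\times10^{5}\ \text{homes}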

Next, Tarnowsky will generate a dollar cost for tritium production once he has more sophisticated calculations of the reactor’s efficiency. He’ll refine his simulations to more precisely evaluate the efficiency and safety of the reactor design, whose components have mostly been engineered before but have not yet been combined in this way. For example, he plans to develop new code for a model that surrounds the nuclear waste with molten lithium salt, an established design for uranium-fueled reactors that has so far been used only for scientific experiments. The salt’s cooling properties offer a potential safety measure, and the setup would make it difficult to extract the waste for weapons development. The ultimate goal is for the modeling to help decision-makers understand which simulation has the most potential for future implementation.

All of this might seem complex, but to Tarnowsky it’s all part of a plan to use existing technology to lower costs. “Energy transitions are a costly business, and anytime you can make it easier, we should try,” he says.

The research was funded by Los Alamos National Laboratory and the National Nuclear Security Administration.

Visit the ACS Fall 2025 program to learn more about this presentation, “On-ramping the fusion economy with kilogram quantities of commercial tritium” and other science presentations.


Note to journalists: Please report that this research was presented at a meeting of the American Chemical Society. ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies.


Title
On-ramping the fusion economy with kilogram quantities of commercial tritium

Abstract
For many reasons, the US has no commercial, domestic tritium production capabilities. The value (2024 $) of commercial tritium is ~$33,000,000/kg [1]. A 1 GWth D-T fusion energy plant will need more than 55 kg of tritium per full power year (FPY). These power plants are hoping to breed tritium during operation, and the required Tritium Breeding Ratio (TBR), to feed tritium back to the fusion reactor, must be > 1.0 (ideally, 1.1 – 1.2). Small uncertainties (~1%) in system TBRs can still lead to changes of over +/- 500 g per FPY at 1 GWth [2]. Starting a fusion plant with no tritium (using D-D reactions to breed tritium) is not economically viable [3,4].

Currently, commercial tritium supplies are produced in heavy-water reactors such as the 600 MW Canada Deuterium Uranium (CANDU) reactors, at rates of 0.1 kg/yr.

We propose to investigate the design, development, performance requirements and cost of an accelerator-driven system (ADS) using molten salt (MS) technology as the working material for transmuting used reactor fuel and producing a supply of commercial tritium.

Recycling and transmuting used nuclear fuel (UNF) in an ADS satisfies multiple needs: 1) Long-lived transuranic material is destroyed, thereby improving the acceptance of a UNF repository, 2) Energy is produced by fission (offsetting the power used by the accelerator), and 3) The system is operated in a sub-critical configuration, which improves safety while minimizing criticality constraints.

This ADS+MS concept is well-suited for a commercial tritium production mission and the US Department of Energy has the requisite experience with handling, processing, storing, and transporting the products.

An ADS+MS facility can achieve TBRs > 20 with current technology, provide kg quantities of tritium annually, decrease the overall cost of construction and operations at a fusion power plant, and lower proliferation risks.

LA-UR-24-33273