Wednesday, June 25, 2025

Screen saver: Simpler, less costly virus testing in high-risk settings 





Yale University

 

New Haven, Conn. — The COVID-19 pandemic yielded important advances in testing for respiratory viruses, but it also exposed important unmet needs in screening to prevent the spread of infections in high-risk settings.

 

While PCR (polymerase chain reaction) tests are the gold standard for detecting viral infections, they remain a challenge for screening large numbers of people in places vulnerable to outbreak — such as health care centers and nursing homes — due to high costs and the fact that different tests are required for each virus.

 

A new Yale study, however, finds that an alternate strategy — using a nasal swab to screen for an antiviral protein produced by the body as a defense against infection — can be an effective method for ruling out respiratory infections, limiting PCR testing only to those most likely to be infected, at a fraction of the cost. 

 

The study was published online on June 20 in the Lancet journal eBioMedicine.

 

By comparing the results from both PCR tests and screenings for the immunological biomarker in more than 1,000 people (including children and adults, some who had a viral infection, and some who did not), the researchers found that using the less-costly screening test would enable a 10-fold or greater reduction in the need for PCR tests.

 

This new approach, the findings suggest, may offer a valuable strategy for managing outbreaks and for routine screenings to prevent virus transmission in high-risk settings, said Ellen F. Foxman, an associate professor of laboratory medicine and immunobiology at Yale School of Medicine (YSM) and senior author of the study. 

 

“If you wanted to screen every person who walked through the door for every respiratory virus, it would be difficult and prohibitively expensive,” said Foxman. “But we also know that in some settings with close quarters — like military settings, health care and rehab facilities, or assisted living centers — an infected person who might be asymptomatic can spread the infection to others, including people at high risk of getting very sick from the virus.

 

“So the question was, how do you triage testing in these settings? Most people aren’t going to have viruses, so you don’t want to do sophisticated, expensive tests for them. Ideally you want a simple, inexpensive test that allows you to say, ‘OK, you don’t have a virus, you can go on your way,’ and then focus further testing on the people who need it.” 

 

First authors of the study were Julien Amat, a postdoctoral associate in Foxman’s lab, and Sarah Dudgeon, a Ph.D. student in the research group of Wade Schulz, an associate professor of laboratory medicine at Yale. The study also involved a team of other collaborators in Yale’s Department of Laboratory Medicine.

 

For the study, the researchers drew on an observation made by Foxman’s lab in 2017: Nasal swabs taken from patients with suspected respiratory infections revealed that among those testing positive for viruses with PCR tests, antiviral defenses had been activated. This, they concluded, might be useful to indicate the presence of a virus.  

 

In the current study, they focused on the protein CXCL10 (a cytokine produced in the nasal passage in response to multiple respiratory viruses) using 1,088 swab samples collected from COVID-19 screening tests or clinical settings in Yale New Haven Hospital. They then performed a series of analyses — including a comparison with test results from PCR tests conducted using the same swabs, and mathematical modeling to predict resource savings from using the test in different scenarios.

 

The biomarker performed well in predicting viral infections regardless of patient age, sex, or other demographics, Foxman said. 

 

What’s more, the research team used electronic health records to evaluate instances when the results from the biomarker screening and the PCR tests did not align (when, for instance, the biomarker test was negative and the PCR test was positive), and identified factors that might contribute to the mismatches.

 

For example, the biomarker screen was less likely to identify a viral infection in individuals who were taking certain immunosuppressive cancer drugs. Likewise, they found that among individuals who’d tested positive for viral infections with PCR tests, but for whom the viral loads were so low that they were barely detectable, the biomarker screening was less likely to produce a positive test.

 

Overall, the researchers found that if the biomarker method were used to screen populations with low virus prevalence (for instance, hospital staff exhibiting no symptoms of infection), it would eliminate the need for 92% of PCR tests.

 

“So you could do less than one-tenth of the PCR tests and still identify the people with viral infections,” Foxman said. “This strategy could have been useful during the early COVID-19 pandemic, when our hospital was sometimes testing 10,000 or more people per week.” The researchers estimate that the cost of commercial PCR tests is roughly five to 15 times greater than the cost of biomarker tests.
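The triage arithmetic behind these savings can be sketched in a few lines. In a two-stage strategy, PCR is reserved for people whose biomarker screen is positive, so the fraction of PCR tests still needed depends on prevalence and on the screen's sensitivity and specificity. The numbers below are hypothetical illustrations, not the study's fitted values:

```python
def pcr_fraction(prevalence, sensitivity, specificity):
    """Fraction of a screened population still needing PCR when PCR
    is reserved for biomarker-positive screens."""
    true_pos = prevalence * sensitivity               # infected, screen positive
    false_pos = (1 - prevalence) * (1 - specificity)  # uninfected, screen positive
    return true_pos + false_pos

# Hypothetical values for a low-prevalence setting (e.g., asymptomatic staff):
frac = pcr_fraction(prevalence=0.01, sensitivity=0.90, specificity=0.93)
print(f"PCR still needed for {frac:.1%} of those screened")  # prints: PCR still needed for 7.8% of those screened
```

At low prevalence, the remaining PCR burden is dominated by the screen's false-positive rate, which is why the savings (here, a reduction of roughly 92%) are largest in settings like routine screening of asymptomatic staff.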

 

“It’s exciting to consider the possible applications,” Foxman said. 

 

The study was funded by the inaugural competition of the Yale-New Haven Health System (YNHHS) Innovation Award, an initiative that supports YNHHS employees and Yale faculty working on promising ideas with potential for health care impact, and the National Institutes of Health. 

 

Other authors of the study are Nagarjuna R. Cheemarla, a former postdoctoral associate at YSM; Timothy A. Watkins, a graduate student and member of the Foxman lab; Alex B. Green, a former resident at Yale New Haven Hospital who is now a fellow at the Children’s Hospital of Philadelphia; H. Patrick Young, an associate research scientist in the Department of Laboratory Medicine; David R. Peaper, an associate professor of laboratory medicine at YSM; Marie L. Landry, professor of laboratory medicine and of medicine (infectious diseases) at YSM; and Wade Schulz, an associate professor of laboratory medicine at YSM.

 

 

Scientists discover evidence of extensive human fire use 50,000 years ago





Chinese Academy of Sciences Headquarters
Image: Fire history of Europe, East Asia, Southeast Asia and Papua New Guinea–Australia, and age distribution of archaeological sites over the last 300,000 years. (Credit: IOCAS)







Fire is an ancient natural phenomenon that has shaped our planet for over 400 million years. It alters habitats, affects the carbon cycle, and is closely linked to changes in climate and vegetation.

With the arrival of humans, fire began to transition from a purely natural force to one that could be harnessed to achieve specific goals. However, it has long been unclear when humans started using fire extensively to regulate their lives and significantly influence fire occurrences.

To address this question, researchers from the Institute of Oceanology of the Chinese Academy of Sciences (IOCAS), alongside collaborators from China, Germany, and France, analyzed the pyrogenic carbon record in a 300,000-year-old sediment core from the East China Sea.

"Our findings challenge the widely held belief that humans only began influencing environment with fire in the recent past, during the Holocene," said Dr. ZHAO Debo, the study's corresponding author.

This study, published in the Proceedings of the National Academy of Sciences (PNAS), highlights the presence of charred plant remains—known as pyrogenic carbon—formed when vegetation burns but is not completely consumed by fire. The research reveals a notable increase in fire activity across East Asia approximately 50,000 years ago. This finding aligns with earlier reports of heightened fire activity in Europe, Southeast Asia, and the Papua New Guinea–Australia region, suggesting a continental-scale intensification of fire use during this period.

Paleoanthropologists suggest that the common ancestors of all modern humans originated in Africa around 300,000 years ago, with Homo sapiens first emerging during this period. Between 70,000 and 50,000 years ago, Homo sapiens migrated from Africa to Europe, Asia, Southeast Asia, and Australia, eventually replacing local ancient human populations.

The study highlights that this global rise in fire use coincides with the rapid spread of Homo sapiens, increasing population densities, and a greater reliance on fire, particularly amid cold, glacial conditions. During this period, fire not only facilitated cooking—allowing for more efficient absorption of nutrients from food—but also provided protection against predators and helped humans survive in extreme climates. This reliance on fire contributed to cultural advancements, technological innovations, and a significant impact on natural systems, especially the carbon cycle.

Humans likely began shaping ecosystems and the global carbon cycle through their use of fire long before the Holocene. "Even during the Last Glaciation, the use of fire had probably started to reshape ecosystems and carbon fluxes," added Prof. WAN Shiming, another corresponding author.

These conclusions have significant implications for understanding Earth's sensitivity to human impacts. If human fire management altered atmospheric carbon levels tens of thousands of years ago, current climate models may underestimate the historical baseline of human–environment interactions.


Study of ancient rocks helps predict potential for future marine anoxia





Chinese Academy of Sciences Headquarters
Image: Paleozoic marine biodiversity, atmospheric composition, and seafloor oxygenation history. (Credit: Prof. CHEN Jitao's team)





Earth's current climate is considered an "icehouse climate" due to the existence of polar ice caps. Studying previous icehouse climates matters because it can help predict how today's atmospheric oxygen and carbon dioxide (CO2) levels may affect the risk of marine anoxia, and the marine biodiversity loss that can follow, in the future.

To understand the interplay among atmospheric oxygen and CO2 levels and oxygenation conditions in the ocean during an earlier icehouse climate, an international team led by Prof. CHEN Jitao from the Nanjing Institute of Geology and Palaeontology of the Chinese Academy of Sciences studied ancient sedimentary rocks in Naqing, South China, to analyze their chemical compositions.

Specifically, the researchers derived high temporal-resolution records of carbonate uranium isotopes from a marine carbonate slope succession dating from the late Carboniferous to early Permian (310–290 million years ago). This interval is part of the Late Paleozoic Ice Age (LPIA) (360–260 million years ago), which is recognized as the longest icehouse climate since advanced plants and terrestrial ecosystems appeared.

By combining these records with previously published carbonate carbon isotopes, paleo-CO2 data, and records of volcanic activity and plant evolution, the researchers quantitatively explored, through biogeochemical modeling, the global carbon cycle and marine oxygen conditions for this geological period. This work was published in the Proceedings of the National Academy of Sciences of the United States of America (PNAS).

The study revealed rapid drops in levels of carbonate uranium isotopes, which occurred alongside rapid increases in atmospheric CO2 levels. This suggests that seafloor anoxia expanded even during the Phanerozoic maximum of atmospheric oxygen and the glacial peak of the LPIA.

Using a carbon–phosphorus–uranium (C-P-U) biogeochemical model coupled with Bayesian inversion, the researchers quantitatively examined the interactions among marine anoxia, carbon cycling, and climate evolution during this paleo-glacial period. Model results indicated that enhanced burial of marine organic carbon likely drove the overall decline in atmospheric CO2 and the rise in oxygen levels in the atmosphere–ocean system throughout this interval. However, despite these high oxygen levels, episodic massive carbon emissions could have triggered recurrent global warming and seafloor deoxygenation.

Furthermore, the team's model showed an increase of 4–12% in the extent of the anoxic seafloor, which could have led to a pause or decline in marine biodiversity. This study emphasizes that under current icehouse conditions, which mirror the high-oxygen state of the LPIA, ongoing warming may still provoke widespread ocean anoxia.

This study advances our understanding of the processes and feedback mechanisms within the Earth system during icehouse conditions, enabling more accurate projections of the future trajectory of current global warming and marine deoxygenation.

 

UTIA researcher to receive award from the Soil and Water Conservation Society


Sindhu Jagadamma to be recognized for her research on soil health and management




University of Tennessee Institute of Agriculture

Image: Sindhu Jagadamma, associate professor of soil science at the University of Tennessee Institute of Agriculture, who will receive the Soil and Water Conservation Society’s 2025 Conservation Research Award at the society’s annual conference in August. (Credit: Photo by H. Harbin, courtesy UTIA)





Sindhu Jagadamma, associate professor of soil science at the University of Tennessee Institute of Agriculture, will receive the Soil and Water Conservation Society’s 2025 Conservation Research Award at the society’s annual conference in August.

Soil health is critical for sustainable food production, and Jagadamma’s research in her Sustainable Soil Management Lab is developing ways to mitigate soil and environmental problems associated with conventional farm management practices. Her team studies how to maintain soil health through conservation management, such as the use of organic soil amendments and practices like no-till, cover crops and crop rotation.

The Conservation Research Award recognizes Soil and Water Conservation Society members or teams of members whose research has led to exceptional improvements in soil conservation, water conservation and related natural resources research. The members of this international organization include researchers, administrators, Extension specialists, students, producers, policymakers, U.S. Department of Agriculture employees, educators and more.

“This prestigious national award is a strong testament and recognition of Sindhu’s remarkable research accomplishments and impacts among her professional peers,” said UT AgResearch Dean Hongwei Xin.

Jagadamma also collaborates with scientists at Oak Ridge National Laboratory, focusing on understanding the response of ecosystem carbon and nutrient cycles to abiotic and biotic changes. Her team uses stable carbon and nitrogen isotope techniques and fine-scale instrumentation in innovative lab and field experiments. She has secured more than $70 million in total grants, including 10 USDA-NIFA awards for which she is principal investigator or a named co-PI.

Jagadamma’s degrees are all in soil science and agronomy. She received her B.S. in agricultural sciences from Kerala Agricultural University in India and her M.S. and Ph.D. degrees from The Ohio State University. She began her work at the University of Tennessee as a postdoctoral researcher in the College of Engineering in 2014 and then moved to UTIA as an assistant professor in 2016.

“I dedicate this award to my incredible lab members, whose drive for excellence and hard work made this achievement possible. I am also deeply grateful for the continued support from UT AgResearch and the Department of Biosystems Engineering and Soil Science for the growth of my research program,” she said.

The University of Tennessee Institute of Agriculture comprises the Herbert College of Agriculture, UT College of Veterinary Medicine, UT AgResearch and UT Extension. Through its land-grant mission of teaching, research and outreach, the Institute touches lives and provides Real. Life. Solutions. to Tennesseans and beyond. utia.tennessee.edu.

 

Concerned father, statistician develops software to improve skills therapy



Mark Ramos, assistant research professor of health policy and administration at Penn State, created a software tool for use in therapy for autism spectrum disorder and other developmental delays




Penn State






UNIVERSITY PARK, Pa. — Mabel Ramos’s favorite song is “Ghostbusters” by Ray Parker Jr. From morning until night, if Mabel is awake, she is listening and dancing to — or asking to listen to — the number one Billboard hit from 1984. Though some parents might be annoyed by listening to a single song repeatedly, her father, Mark Ramos, said he is delighted by his daughter’s ability to dance, communicate and express her enthusiasm.

Mabel, who is five years old, has autism spectrum disorder. Mark, assistant research professor of health policy and administration at Penn State, said that she is reaching developmental milestones that are typical for a two- or three-year-old child. When she was two years old, her development was almost stagnant — she could not speak or sit up by herself. According to Mark, Mabel’s increased development is largely due to the therapies she receives.

When his daughter was in therapy, Mark discovered that — for a specific type of treatment — statistical methods could be used to help measure whether children were meeting developmental milestones. Mark wrote software code that therapists can use to choose thresholds for verifying that children with autism spectrum disorder and other developmental disabilities are mastering skills. The free software and its application were described in a recent article in Behavior Analysis in Practice.

Therapies for autism spectrum disorder

“When you take your child to their two-year pediatric appointment, the doctor hands you a checklist where you indicate the milestones they are reaching,” Mark explained. “When my wife and I went through the list, we realized that Mabel wasn’t developing her communication or movement abilities at all.”

Shortly after that appointment, Mabel began physical therapy, occupational therapy and Applied Behavior Analysis, which is commonly used to help people with autism spectrum disorder build specific skills. When Mark, who is a trained statistician, visited his daughter’s treatment facility, he was fascinated by Mabel’s treatment, especially a component of her therapy called discrete trial training (DTT).

Discrete trial training — learning step by step

In DTT, every task is broken down into its component parts. The participant performs each component repeatedly until they master it, and only then do they move on to the next component.

Most children learn how to wash their hands as a single process, from approaching the sink to drying their hands, but this may not work for children with autism spectrum disorder.

Taught via DTT, however, hand washing might be broken down into sequential components, including: walk to the sink, turn on the water, wet hands, put soap on hands, rub hands, rub hands under water, turn off water and dry hands. The child would perform a single component of the task a prespecified number of times and reach a certain level of success — like successfully walking to the sink eight out of 10 times — before they would be taught the next component of turning on the water.

“I immediately loved DTT,” Mark said. “For one thing, I could see it was helping Mabel learn. But also, it was very scientific and systematic. DTT generates a lot of data that provides objective measures of progress. As a statistician, that was very exciting for me. But I noticed an issue between the performance thresholds and the level of mastery that the children had actually demonstrated.”

Performance is not mastery

A performance criterion is a specific score — a single data point, according to Mark. For example, it is a measure of whether a child was able to reach 80% on a specific trial of a task. Mastery, on the other hand, is a question of probability — a prediction of how often a child will be able to complete the task at any point in the future.

Mark learned that when children needed to master a skill with 80% success, they were typically expected to complete four out of five or eight out of 10 trials successfully. But as a trained statistician, Mark knew that probability of mastery is not equivalent to performance criteria and that the number of trials used mattered considerably.

“If a child scored 80% on a task, they will not necessarily be 80% successful each time; their actual projected mastery level would be a little lower,” Mark said. “Fortunately, there are basic statistical procedures that can estimate the level that should be set as the performance criterion so students can perform a task to a specified level of mastery.”

Free software for therapists

Mark created a freely available software called Measurement of Individualized, Evidence‑Based Learning (MIEBL). In MIEBL, users — ideally, the clinicians who run DTT for children with developmental disabilities — enter the performance criterion and the number of items in the trial.

MIEBL employs Bayesian estimates — probabilities based on predicted performance and updated with observed data. For example, if the performance criterion is set at 80% on 10 items, the average mastery for students reaching this mark will be 77.27%. This means that children who reach 80% on the 10-item trial can be expected to succeed at the same task only 77.27% of the time in the future. So, if 80% mastery really matters, the performance criterion should be set to 90% on 10 items. Children who achieve 90% would be expected to successfully complete the task at least 80% of the time.
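The 77.27% figure quoted above is consistent with the posterior mean of a binomial success probability under a Jeffreys Beta(1/2, 1/2) prior; whether MIEBL uses exactly this prior is an assumption here, and the function name below is illustrative rather than taken from the software:

```python
def expected_mastery(successes, trials, prior_a=0.5, prior_b=0.5):
    """Posterior mean of the success probability for a binomial outcome
    under a Beta(prior_a, prior_b) prior (Jeffreys prior by default)."""
    return (successes + prior_a) / (trials + prior_a + prior_b)

# A child who exactly meets an 80% criterion on 10 items (8 of 10):
print(round(expected_mastery(8, 10) * 100, 2))  # prints: 77.27

# Raising the criterion to 90% (9 of 10) pushes expected mastery above 80%:
print(round(expected_mastery(9, 10) * 100, 2))  # prints: 86.36
```

Under this prior, (8 + 0.5)/(10 + 1) ≈ 77.27% and (9 + 0.5)/(10 + 1) ≈ 86.36%, which matches the article's point that a 90% criterion is needed when 80% future mastery is the real target.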

Performance criteria are already carefully set in DTT, Mark said. Typically, if a skill is important, but not critical — like color identification — the performance criteria will be set at 80%. For critical skills — like safely crossing the street — the performance criteria will be set at 100%.

According to Mark, the intention of MIEBL is to enable therapists to know exactly what level of mastery they can expect from participants for any given performance criteria and number of trials.

Mark recently shared the software with Mabel’s therapists and said he hopes they — and other therapists — will begin to use it routinely.

The future for Mabel and for Mark’s software

“To be clear, I do not think there are huge gaps in the DTT process,” Mark said. “I simply wanted to create an easy tool to let therapists verify whether children are meeting the targets they think they are hitting. If people use this tool and tweak their standards on certain tasks or simply verify that their current performance criteria are correct, that would be a great outcome.”

Though the project was very different from his typical work as a researcher in the Department of Health Policy and Administration, Mark said he loved working on it because he believes the approach could eventually help his daughter or other children who need support.

“DTT is helping my daughter grow into the fun, capable, music-loving child she is becoming,” Mark said. 

And what will Mabel be focused on?

“She just loves to dance,” Mark said. “And fortunately, she changes her favorite song every week or two.”