Tuesday, June 08, 2021

#BIOPHAGES

Trained viruses prove more effective at fighting antibiotic resistance

Practice boosts phage potency in evolutionary battleground with deadly bacteria

UNIVERSITY OF CALIFORNIA - SAN DIEGO

Research News

The threat of antibiotic resistance rises as bacteria continue to evolve to foil even the most powerful modern drug treatments. By 2050, antibiotic-resistant bacteria threaten to claim more than 10 million lives as existing therapies prove ineffective.

Bacteriophages, or "phages," have become a new source of hope against growing antibiotic resistance. Ignored for decades by Western science, phages have attracted increasing research attention for their ability to infect and kill bacterial threats.

A new project led by University of California San Diego Biological Sciences graduate student Joshua Borin, a member of Associate Professor Justin Meyer's laboratory, has provided evidence that phages that undergo special evolutionary training increase their capacity to subdue bacteria. Like a boxer in training ahead of a title bout, pre-trained phages demonstrated they could delay the onset of bacterial resistance.

The study, which included contributions from researchers at the University of Haifa in Israel and the University of Texas at Austin, is published June 8 in the Proceedings of the National Academy of Sciences.

"Antibiotic resistance is inherently an evolutionary problem, so this paper describes a possible new solution as we run out of antibiotic drug options," said Borin. "Using bacterial viruses that can adapt and evolve to the host bacteria that we want them to infect and kill is an old idea that is being revived. It's the idea of the enemy of our enemy is our friend."

The idea of using phages to combat bacterial infections goes back to the days prior to World War II. But as antibiotic drugs became the leading treatment for bacterial infections, phage research for therapeutic potential was largely forgotten. That mindset has changed in recent years as deadly bacteria continue to evolve to render many modern drugs ineffective.

Borin's project was designed to train specialized phages to fight bacteria before they encounter their ultimate bacterial target. The study, conducted in laboratory flasks, demonstrated classic evolutionary and adaptive mechanisms at play. The bacteria, Meyer said, predictably moved to counter the phage attack. The difference was in preparation. Phages trained for 28 days, the study showed, were able to suppress bacteria 1,000 times more effectively and three to eight times longer than untrained phages.

"The trained phage had already experienced ways that the bacteria would try to dodge it," said Meyer. "It had 'learned' in a genetic sense. It had already evolved mutations to help it counteract those moves that the bacteria were taking. We are using phage's own improvement algorithm, evolution by natural selection, to regain its therapeutic potential and solve the problem of bacteria evolving resistance to yet another therapy."

The researchers are now extending their work to examine how pre-trained phages perform against bacteria important in clinical settings, such as E. coli. They are also working to evaluate how well the training methods work in animal models.

UC San Diego is a leader in phage research and clinical applications. In 2018 the university's School of Medicine established the Center for Innovative Phage Applications and Therapeutics, the first dedicated phage therapy center in North America.

"We have prioritized antibiotics since they were developed and now that they are becoming less and less useful people are looking back to phage to use as therapeutics," said Meyer. "More of us are looking into actually running the experiments necessary to understand the types of procedures and processes that can improve phage therapeutics."

###

The study's full author list includes: Joshua Borin, Sarit Avrani, Jeffrey Barrick, Katherine Petrie and Justin Meyer.



CAPTION

Trained and untrained phages are pitted against bacteria in battleground flasks to evaluate which is more effective at killing.

 

Sensing what plants sense: Integrated framework helps scientists explain biology and predict crop performance

IOWA STATE UNIVERSITY

Research News

AMES, Iowa - Scientists have invested great time and effort into making connections between a plant's genotype, or its genetic makeup, and its phenotype, or the plant's observable traits. Understanding a plant's genome helps plant biologists predict how that plant will perform in the real world, which can be useful for breeding crop varieties that will produce high yields or resist stress.

But environmental conditions play a role as well. Plants with the same genotype will perform differently when grown in different environments. A new study led by an Iowa State University scientist uses advanced data analytics to help scientists understand how the environment interacts with genomics in corn, wheat and oats. The results could lead to more accurate and faster models that will allow plant breeders to develop crop varieties with desirable traits.

The study was published recently in the peer-reviewed academic journal Molecular Plant.

Jianming Yu, a professor of agronomy and the Pioneer Distinguished Chair in Maize Breeding, said the study sheds light on phenotypic plasticity, or the ability of crops to adapt to environmental changes. This could help plant breeders get a better understanding of how "shapable" plant species are, or how much potential they have to perform well in different environments.

"We knew that genetic performance is context dependent. It's not static; It's dependent on environmental conditions," said Xianran Li, an adjunct associate professor and the first author of the study. "Two alleles of a gene perform differently in one environment but the same in another. What is challenging is to understand the interplay between genes and the environment under the natural field conditions. The obvious obstacle is that natural environments are much more complex than controlled laboratory conditions. How can we detect the major signals plants perceive?"

The study made use of previously gathered data on the three crop species from across the globe. A group of 17 scientists from four institutions contributed to the current study, but a much larger group of scientists carried out the initial experiments that generated the data. The dataset included 282 inbred lines of corn evaluated in the United States and Puerto Rico; 288 inbred lines of wheat evaluated in Africa, India and Middle Eastern countries; and 433 inbred populations of oats evaluated in the United States and Canada. The data included environmental conditions such as temperature and availability of sunlight. The phenotypic data analyzed in the study included yields, plant height and flowering time, or the window of time during which the plant reaches the reproductive stage.

Advanced data analytics allowed the researchers to develop an environmental index that extracts the major differentiating pattern among the studied natural field conditions. With this explicit environmental dimension defined, researchers can systematically evaluate how individual genes respond to external signals and collectively shape an organism's final performance.
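
The approach can be illustrated with a toy reaction-norm analysis. The sketch below is not the authors' code; all data are synthetic, and defining the index as the mean performance of all lines in each environment is a common simplification used here only for illustration.

```python
# Toy sketch, not the authors' code: a reaction-norm analysis along an environmental index.
import numpy as np

rng = np.random.default_rng(0)
n_lines, n_envs = 50, 8                       # hypothetical inbred lines and field environments

# Synthetic phenotypes: each line has its own baseline and its own sensitivity (slope)
# to an unobserved environmental gradient.
env_quality = np.sort(rng.normal(0.0, 1.0, n_envs))
baselines = rng.normal(100.0, 5.0, n_lines)
sensitivities = rng.normal(5.0, 2.0, n_lines)
phenotype = (baselines[:, None]
             + sensitivities[:, None] * env_quality[None, :]
             + rng.normal(0.0, 1.0, (n_lines, n_envs)))

# Step 1: define the environmental index as the mean performance of all lines in each
# environment -- a common stand-in when the driving environmental variable is unknown.
env_index = phenotype.mean(axis=0)

# Step 2: regress every line on the index; the fitted slope measures that line's
# phenotypic plasticity along the identified environmental dimension.
X = np.column_stack([np.ones(n_envs), env_index])
coef, *_ = np.linalg.lstsq(X, phenotype.T, rcond=None)    # coef has shape (2, n_lines)
est_intercepts, est_slopes = coef

print("most plastic line:", int(est_slopes.argmax()),
      "least plastic line:", int(est_slopes.argmin()))
```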

"It is like the undiscernible pulses of a plant's perception of the outside conditions now become visible on a monitor screen," said Tingting Guo, a research scientist in agronomy and co-first author of the study.

The study "presents an integrated framework that not only reveals the genetic effect dynamics along an identified environmental index but also enables accurate performance predictions and forecasting," the authors wrote in the paper.

"We are pleased to be able to design such a framework to cover two major research areas, genome-wide association studies and genomic selection (GWAS and GS)," Yu said.

The study found the integrated framework predicted flowering time and plant height accurately, while predictions for yields were more difficult. Li said that's most likely due to how many different environmental parameters, beyond just temperature and sunlight, affect yield at different growth stages. The research team will continue refining its methods to account for more environmental factors in an effort to better predict yields.

Yu and his collaborators first developed their initial data analytics in sorghum but have since expanded their research to include other major global crops. This could help plant scientists design a better plan for finding varieties to test. Yu said applying advanced data analytics to all the available genomic, phenotypic and environmental data can help breeders zero in on varieties they're interested in much faster and more efficiently.

"We believe we have the requisite amount of data to make better predictions about plant performance," Yu said. "Now, we're trying to gain knowledge and wisdom from the data to guide the real-world decision-making process."

###

Funding for the research came from the U.S. Department of Agriculture's National Institute of Food and Agriculture, the U.S. Department of Energy Advanced Research Projects Agency-Energy, the National Science Foundation, the ISU Raymond F. Baker Center for Plant Breeding and the ISU Plant Sciences Institute.

 

Space travel weakens our immune systems: Now scientists may know why

Final study by UCSF astronaut points to Treg cells as the culprit

UNIVERSITY OF CALIFORNIA - SAN FRANCISCO

Research News

Microgravity in space perturbs human physiology and is detrimental for astronaut health, a fact first realized during early Apollo missions when astronauts experienced inner ear disturbances, heart arrhythmia, low blood pressure, dehydration, and loss of calcium from their bones after their missions.

One of the most striking observations from Apollo missions was that just over half of astronauts became sick with colds or other infections within a week of returning to Earth. Some astronauts have even experienced re-activation of dormant viruses, such as the chickenpox virus. These findings stimulated studies on the effects of weak gravity, or "microgravity," on the immune system, which scientists have explored over decades of manned rocket launches, shuttle travel and space station stints, or sometimes by simulating microgravity in earthbound labs.

In the last study led by one of the first women astronauts, Millie Hughes-Fulford, PhD, researchers at UCSF and Stanford University now have shown that the weakening of an astronaut's immune system during space travel is likely due in part to abnormal activation of immune cells called T regulator cells (Tregs).

Tregs normally are triggered to ramp down immune responses when infection no longer threatens, and they are important regulators of immune responses in diseases ranging from cancer to COVID-19. In microgravity conditions, however, the researchers found changes in Tregs that prepared them to go to work even before the immune system was challenged. When the researchers stimulated an immune response in human immune cells from blood samples kept in simulated microgravity, using a chemical often employed in research to mimic a disease pathogen, they found that Tregs helped suppress the immune response that was triggered. This unanticipated discovery was published online June 7 in the journal Scientific Reports.

Hughes-Fulford became the first female payload specialist to orbit Earth with her experiments in 1991, and for decades, until her death due to leukemia in February, she studied the effects of microgravity on health, first with an emphasis on osteoporosis and later with a focus on the immune system. As a researcher at the San Francisco Veterans Affairs Medical Center and a UCSF faculty member long affiliated with the Department of Medicine, Hughes-Fulford mentored aspiring space scientists, including the co-principal investigators of this latest immunology study.

Jordan Spatz, PhD, a space scientist and UCSF medical student who became co-PI of the study after Hughes-Fulford's death, noted that as space travel becomes increasingly commercialized and more common, concerns over the health status of space travelers are likely to grow.

"Early in the space program, most astronauts were young and extremely healthy, but now they tend to have much more training and are older," Spatz said. "In addition, apart from astronauts, with the commercialization of space flight there will be many more older and less healthy individuals experiencing microgravity. From a space medical perspective, we see that microgravity does a lot of bad things to the human body, and we are hoping to gain the ability to mitigate some of the effects of microgravity during space travel."

The new study advanced earlier research led by Hughes-Fulford, confirming some of her previous findings from experiments in space and in simulated microgravity, while contributing additional molecular discoveries. Hughes-Fulford earlier had found weaker responses from T lymphocytes of the immune system, some of which attack specific pathogens directly and some of which help orchestrate the immune response.

"It's a double whammy," said co-PI Brice Gaudilliere, MD, PhD, an associate professor in the Department of Anesthesia at Stanford University School of Medicine. "There is a dampening of T lymphocyte immune activation responses, but also an exacerbation of immunosuppressive responses by Tregs." The researchers also found that "natural killer" lymphocytes were less active under simulated microgravity, while antibody-producing B cells appeared to be unaffected.

The researchers simulated microgravity in blood samples using a specialized, cylindrical cell-culture vessel with motor-driven rotation, a long-established microgravity research tool, but their method of single-cell analysis was unique. The scientists identified individual immune cells by specific type and used metal tags and mass spectrometry to simultaneously detect and quantify dozens of proteins that play a role in immune function, in addition to confirming previously identified patterns of altered gene activation.

###

Funding: The study was funded by the UCSF Department of Biochemistry and Biophysics and the Stanford Department of Anesthesiology, Perioperative and Pain Medicine. No competing interests were declared.

About UCSF: The University of California, San Francisco (UCSF) is exclusively focused on the health sciences and is dedicated to promoting health worldwide through advanced biomedical research, graduate-level education in the life sciences and health professions, and excellence in patient care. UCSF Health, which serves as UCSF's primary academic medical center, includes top-ranked specialty hospitals and other clinical programs, and has affiliations throughout the Bay Area. UCSF School of Medicine also has a regional campus in Fresno. Learn more at ucsf.edu, or see our Fact Sheet.

Follow UCSF
ucsf.edu | Facebook.com/ucsf | YouTube.com/ucsf

 

New insights into survival of ancient Western Desert peoples

UNIVERSITY OF ADELAIDE

Research News

Researchers at the University of Adelaide have used more than two decades of satellite-derived environmental data to form hypotheses about the possible foraging habitats of pre-contact Aboriginal peoples living in Australia's Western Desert.

As one of the most arid and geographically remote regions of Australia, the Western Desert has always presented severe challenges for human survival. Yet despite the harsh conditions, Aboriginal peoples have maintained an enduring presence, continuously adapting to environmental variations through complex socioeconomic strategies.

In the study published in Scientific Reports, the researchers used Earth Observation data to model the most suitable habitats for traditional foraging activities, identifying where surface water was most abundant and vegetation was greenest to infer which areas of the landscape past Aboriginal peoples were likely to have utilised. The study also drew on previous research into traditional subsistence and settlement practices, enabling researchers to estimate daily foraging range in proximity to water.
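
The general idea can be sketched as follows. This is not the study's actual model; it simply combines two hypothetical satellite-derived layers (vegetation greenness and surface-water frequency) with an assumed daily foraging radius from water to produce a simple habitat-suitability surface.

```python
# Illustrative sketch only, not the study's model: a simple habitat-suitability surface
# built from two hypothetical satellite-derived layers and an assumed foraging radius.
import numpy as np
from scipy.ndimage import distance_transform_edt

rng = np.random.default_rng(1)
greenness = rng.random((200, 200))            # 0..1 vegetation greenness (e.g. NDVI)
water_freq = rng.random((200, 200)) ** 4      # 0..1 surface-water frequency (skewed: water is rare)
water_mask = water_freq > 0.995               # cells treated as reliable water sources

cell_km = 1.0                                 # assumed raster resolution (km per cell)
max_forage_km = 10.0                          # assumed one-day foraging radius from water

# Distance (km) from every cell to the nearest water cell.
dist_to_water = distance_transform_edt(~water_mask) * cell_km

# Weighted suitability: greener and wetter cells score higher, and anything beyond a
# day's walk from water is zeroed out. The weights are arbitrary for the illustration.
suitability = (0.6 * greenness + 0.4 * water_freq) * (dist_to_water <= max_forage_km)

print("share of landscape within a day's forage of water:",
      round(float((dist_to_water <= max_forage_km).mean()), 2))
```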

Lead author of the study, Postdoctoral Researcher Dr Wallace Boone Law, says the fine scale of the satellite model the team developed enabled them to depict the highly variable nature of environmental conditions, and hence of potential foraging habitats, in the Western Desert.

"Where earlier studies depicted the Western Desert as a relatively uniform environment, our study shows the region to be highly dynamic and variable, both in its environmental conditions and foraging potential," Dr Law said.

"For example, desert dunefields were once thought to have been a periodic barrier to occupation, but our work shows this is not true for all sandridge deserts. Some dunefield areas offer good foraging habitats, particularly amongst interdunal swale areas.

"However, we also found that there are large, impoverished regions of the Western Desert that would have been extremely challenging for survival, based on terrain ruggedness and access to food and water resources.

"We believe it is likely that some of these poorly-suited foraging areas would have been difficult for survival for the past 21,000 years, and because Aboriginal peoples were highly knowledgeable about the distribution of resources across the Western Desert, we hypothesise those locations would have been rarely used in the past. And further, we predict that the archaeological record of these difficult habitats will point to ephemeral episodes of occupation.

"We suggest that some low-ranked areas of habitat suitability were resource-poor and not economically attractive to foraging activities, even in the best environmental circumstances," said Dr Law.

The researchers hope that archaeologists can use the study to explore many large areas of the Western Desert that have yet to be thoroughly investigated.

"Our findings highlight how future models of forager land use can be integrated with Earth Observation data to better comprehend the environmental complexity and fine scale of resource variability in these vast, remote and diverse places," said Dr Law.

"We hope our research into the changing environment in pre-contact Australia will assist with fostering a new era of research in partnership with Indigenous communities to provide further understanding of the industrious, versatile and resilient Aboriginal peoples of the Western Desert."


CAPTION

Habitat suitability model.

CREDIT

The paper's researchers.


 

From farm to plate: Where do global consumer dollars flow?

CORNELL UNIVERSITY

Research News

CAPTION

Where the consumer dollars go: farm vs. post-farmgate food value chain.

CREDIT

Cornell University

ITHACA, N.Y. - As soon as an ear of corn is taken off its stalk or a potato is pulled from the ground, it travels anywhere from a few miles to across continents and sometimes undergoes processes that transform it into the food we consume.

These miles and processes contribute to what's known as the food value chain (FVC), along which, as one might expect, the value of the product increases. However, most of the research and attention thus far paid to FVCs occurs at the ends of the chain - inside the farm gate and at the consumer's plate.

Less is understood about all of the other links in the FVC, in part due to a lack of standardized data and methods that can be applied universally. Many studies have been done on individual commodities, or in a single country, but coming up with an international method of analyzing FVCs has been elusive until now.

A team of researchers - led by Chris Barrett and Miguel Gómez, Cornell University professors in the Charles H. Dyson School of Applied Economics and Management - has developed the "Global Food Dollar" method, which distributes the consumer's net purchasing dollar across all farm and post-farmgate activities.

"The key insight, from my perspective, is that the overwhelming majority of the value addition is happening after the farm gate," Barrett said. "People fall into thinking of food issues as being farm issues. And farm issues are important, but they're comparatively less important than most people realize. And they're becoming steadily less important over time."

Their methodology expands on the U.S. Department of Agriculture Economic Research Service's (ERS) "food dollar series," published annually since 1947 and updated by Patrick Canning in 2011 to include modern inputs.

"People really don't understand how consumer dollars get apportioned, either between owners of land and intellectual property and workers, or between actors in different stages along the value chain," Barrett said. "And they don't know how that differs across countries. ... How much are you likely to get if you're a Sysco and you're thinking of entering a market to mediate wholesale food delivery?"

"We have lots of data on food production and food consumption," Gómez said, "but not much in between. And it's important, because 80% to 85% of the value is created beyond the farm."

For this research, the team used data collected from 2005-15 from 61 countries, representing 90% of the global economy. They found that farmers receive, on average, 27% of consumer expenditure on foods consumed at home, and a far lower share, just 7%, on food consumed away from home (in restaurants, for instance). And as countries' incomes rise, the farmers' share falls.
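
As a quick worked example of how those two averages combine (the away-from-home share of spending below is assumed for illustration, not reported in the paper):

```python
# Assumed split for illustration: if, say, 40% of food spending happens away from home,
# the overall farm share of the consumer food dollar follows from the two averages above.
farm_share_at_home = 0.27      # farmers' average share of at-home food spending (from the study)
farm_share_away = 0.07         # farmers' average share of away-from-home spending (from the study)
away_fraction = 0.40           # hypothetical fraction of food spending that is away from home

overall_farm_share = (1 - away_fraction) * farm_share_at_home + away_fraction * farm_share_away
print(f"overall farm share of the consumer food dollar: {overall_farm_share:.0%}")   # ~19%
```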

One of the main takeaways, Barrett said, is that more research is needed on all the links in the middle of the value chain.

"The big economic players in the food we consume aren't actually the primary producers on farms," Barrett said. "So when we think about food issues ... maybe we need to spend a little bit more time thinking about what's happening in that post-farmgate value chain, with the processors, manufacturers, wholesalers, retailers and restaurants."

###

"The Overlooked Magnitude of Post-Farmgate Food Value Chains," published in Nature Food, was written, and the method developed, as part of a partnership with the ERS, led by Canning, a co-author on the study.

 

Do customer loyalty programs really help sellers make money?

New study finds that yes, they do, but not in the ways you may think

INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES

Research News

Key Takeaways:

  • Study finds non-tiered customer loyalty programs create a more sustainable customer base.
  • Non-tiered customer loyalty programs are not as likely to generate increases in spending per transaction or accelerate transactions.

CATONSVILLE, MD, June 7, 2021 - Customer loyalty programs have been around for decades and are used to help businesses, marketers and sellers build sustainable relationships with their customers. But do they work? A recent study sought to find out, and the researchers learned that while customer loyalty programs do work, it is perhaps not in the ways most would assume.

There are two basic types of customer loyalty programs, tiered and non-tiered. Airlines and hotels often use tiered customer loyalty programs that increase rewards as program members reach higher thresholds of spending over time. Retailers and service industry businesses are more likely to offer non-tiered customer loyalty programs, in which members are rewarded with frequent, but not increasing rewards, such as "buy 10 get one free."

This research investigated whether those non-tiered customer loyalty programs actually do what they are designed to do.

The study to be published in the June issue of the INFORMS journal Marketing Science, "Can Non-tiered Customer Loyalty Programs Be Profitable?", is authored by Arun Gopalakrishnan of Rice University, Zhenling Jiang of the Wharton School of Business at the University of Pennsylvania, and Yulia Nevskaya and Raphael Thomadsen of the Olin Business School at Washington University in St. Louis.

The authors found that non-tiered customer loyalty programs increase customer value by almost 30% over a five-year time period. They discovered that the program's effectiveness is not so much through increased spending per transaction or frequency of purchasing but rather through the reduction of attrition. In other words, the chief benefit is that the customer loyalty program reduces customer fall-off and turnover.

"We found that a non-tiered customer loyalty program's reduction in attrition accounts for more than 80% of the program's total lift or success," said Thomadsen. "On the other hand, increased frequency accounts for less than 20% of the program's lift or effectiveness."

Jiang added, "One of the more interesting findings was that the impact of the loyalty program does not necessarily contribute to increased spending per transaction or increased frequency of transactions. Rather, the benefit to the business is creating more sustainable and lasting relationships with customers."

To conduct their research, the authors worked with a company to collect data on more than 5,500 new customers who first started purchasing from that company in the same three-month period. This helped ensure that the customers were comparable in terms of the amount of time they had to become acquainted with the selling firm. For the next 30 months, the researchers collected all subsequent transaction data from those customers. During that period, a non-tiered customer loyalty program was introduced.

In the process, some of these new customers were automatically enrolled in the loyalty program. This helped the researchers better gauge pre-program visit frequency and spending and then compare them with post-enrollment visit frequency and spending. "We were able to analyze the behaviors of consumers absent a customer loyalty program, and then after the rollout of the program," said Nevskaya. "We evaluated frequency and actual spending amounts, and whether customers come back for repeat transactions."

Gopalakrishnan summarized, "In the end, the primary value of a non-tiered customer loyalty program is not a means to increase frequency or spending. It's a way to nurture a long-term and lasting relationship with the customer to reduce the defection of loyal customers over time. Non-tiered loyalty programs may provide psychological benefits that help cultivate such loyalty."

###

About INFORMS and Marketing Science

Marketing Science is a premier peer-reviewed scholarly marketing journal focused on research using quantitative approaches to study all aspects of the interface between consumers and firms. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at http://www.informs.org or @informs.

 

Feedback on cafeteria purchases helps employees make healthier food choices

A randomized clinical trial tested an automated, personalized intervention to improve health in hospital staff

MASSACHUSETTS GENERAL HOSPITAL

Research News

BOSTON - Automated emails and letters that provide personalized feedback related to cafeteria purchases at work may help employees make healthier food choices. That's the conclusion of a new study that was led by investigators at Massachusetts General Hospital (MGH) and is published in JAMA Network Open.

As many adults spend half (and sometimes more) of their waking hours working, the workplace provides a unique opportunity to promote health with programs that target obesity, unhealthy diets, and other risk factors for chronic diseases and premature death.

Building on findings from previous studies, researchers designed the ChooseWell 365 clinical trial to test a 12-month automated, personalized behavioral intervention to prevent weight gain and improve diet in hospital employees. For the trial, 602 MGH employees who regularly used the hospital's cafeterias were randomized to an intervention group or a control group. For one year, participants in the intervention group received two emails per week that included feedback on their previous cafeteria purchases and offered personalized health and lifestyle tips. They also received one letter per month with comparisons of their purchases with those of their peers, as well as financial incentives for healthier purchases. Control participants received one letter per month with general healthy lifestyle information.

"This novel workplace strategy was completely automated and did not require that people take time away from work to participate, making it ideal for busy hospital employees," explains lead author Anne N. Thorndike, MD, MPH, an investigator in the Division of General Internal Medicine at MGH and an associate professor of Medicine at Harvard Medical School.

Participants in the intervention group increased their healthy cafeteria food purchases to a greater extent than participants in the control group. They also purchased fewer calories per day. These differences were observed during the one-year intervention as well as during a year of additional assessments. There were no differences between the groups in terms of weight change at 12 or 24 months, however.

"Few if any prior workplace studies have been able to make sustained changes in dietary choices of employees," says Thorndike. "This study provides evidence that food purchasing data can be leveraged for delivering health promotion interventions at scale."

###

Co-authors include Jessica L. McCurley, PhD, Emily D. Gelsomin, MLA, RD, LDN, Emma Anderson, BA, Yuchiao Chang, PhD, Bianca Porneala, MS, Charles Johnson, BA, and Douglas E. Levy, PhD, all of MGH, and Eric B. Rimm, ScD, of the Harvard T.H. Chan School of Public Health.

This work was supported by the National Institutes of Health.

About the Massachusetts General Hospital

Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The Mass General Research Institute conducts the largest hospital-based research program in the nation, with annual research operations of more than $1 billion, and comprises more than 9,500 researchers working across more than 30 institutes, centers and departments. In August 2020, Mass General was named #6 in the U.S. News & World Report list of "America's Best Hospitals."

 

Drop in convalescent plasma use at US hospitals linked to higher COVID-19 mortality rate

Analysis suggests decline in convalescent plasma use in US hospitals from November 2020 to February 2021 may have led to as many as 29,000 excess COVID-19 deaths

JOHNS HOPKINS UNIVERSITY BLOOMBERG SCHOOL OF PUBLIC HEALTH

Research News

A new study from researchers at Johns Hopkins Bloomberg School of Public Health and colleagues suggests a slowdown in the use of convalescent plasma to treat hospitalized COVID-19 patients led to higher COVID-19 mortality during a critical period of this past winter's surge.

U.S. hospitals began treating COVID-19 patients with convalescent plasma therapy--which uses antibody-rich blood from recovered COVID-19 patients--in the summer of 2020 when doctors were looking to identify treatments for the emerging disease. By the spring of 2021, doctors in the United States had treated over 500,000 COVID-19 patients with convalescent plasma. The use of convalescent plasma started declining late in 2020 after several large clinical trials showed no apparent benefit.

The researchers' analysis suggests that the decline in convalescent plasma use might have led to more than 29,000 excess COVID-19 deaths from November 2020 to February 2021.

The study was published online June 4 in the journal eLife.

"Clinical trials of convalescent plasma use in COVID-19 have had mixed results, but other studies, including this one, have been consistent with the idea that it does reduce mortality," says study senior author Arturo Casadevall, MD, PhD, Alfred and Jill Sommer Professor and Chair of the Department of the Molecular Microbiology and Immunology at the Bloomberg School.

The study was done in collaboration with researchers at Michigan State University and the Mayo Clinic. Casadevall and colleagues observed that while plasma use was declining late last year, the reported COVID-19 patient mortality rate was rising. That led them to hypothesize that the two phenomena were related.

In the study, the researchers compared the number of units of plasma distributed to U.S. hospitals from blood banks, on a per patient basis, to the number of reported COVID-19 deaths per hospital admission across the country.

One finding was that while the total use of plasma peaked last December and January during the winter surge in new COVID-19 patients, the use per hospitalized patient peaked in early October 2020--just as deaths per COVID-19 hospital admission bottomed. Thereafter, in the wake of reports of negative results from clinical trials, use of plasma per hospitalized patient fell sharply--and deaths per COVID-19 hospital admission rose.

The researchers analyzed the relationship between these two datasets and found a strong negative correlation, with higher use rates associated with lower mortality and vice versa. They also grouped periods of plasma use into five "quintile" groupings, from the lowest-use weeks to the highest, and found a graded relationship between less use and higher mortality.

A model the researchers generated to fit the data suggested that the COVID-19 case fatality rate decreased by 1.8 percentage points for every 10-percentage point increase in the rate of plasma use. That model implied that there would have been 29,018 fewer deaths, from November 2020 to February 2021, if the peak use rate of early October had held. Moreover, it suggested that the use of plasma on the whole, as limited as it was, prevented about 95,000 deaths through early March of this year.
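
The counterfactual arithmetic implied by such a linear model can be sketched as follows; the weekly admission counts and use rates below are invented for illustration and are not the study's data.

```python
# Back-of-the-envelope sketch of the counterfactual implied by a linear use-fatality model.
# Weekly admissions and plasma use rates below are hypothetical.
SLOPE = 0.018 / 0.10     # drop in case fatality rate per unit increase in plasma use rate
PEAK_USE_RATE = 0.40     # hypothetical early-October peak fraction of admissions treated

weeks = [                # (COVID-19 hospital admissions, observed plasma use rate)
    (90_000, 0.35),
    (110_000, 0.30),
    (140_000, 0.25),
    (150_000, 0.22),
]

averted = sum(admissions * SLOPE * (PEAK_USE_RATE - use_rate)
              for admissions, use_rate in weeks)
print(f"deaths potentially averted had the peak use rate held: {averted:,.0f}")
```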

The researchers analyzed, and then rejected, the possibility that several other factors could explain away the link between less plasma use and more mortality. These factors included changes in the average age of hospitalized patients, and the emergence of new variants of the COVID-19-causing coronavirus.

As for why some clinical trials found no benefit for plasma use, the researchers note in their paper that many of the clinical trials with negative results had used plasma--mainly considered an antiviral treatment--relatively late in the course of COVID-19, when patients may have been too ill to benefit, and when the disease is driven mainly by immune-related responses rather than the coronavirus itself.

Casadevall notes that convalescent plasma remains under FDA Emergency Use Authorization in the U.S., and that it is readily available. "We hope that physicians, policymakers, and regulators will consider the totality of the available evidence, including our findings, when making decisions about convalescent plasma use in individual COVID-19 patients," Casadevall says.

###

"Convalescent Plasma Use in the United States was inversely correlated with COVID-19 Mortality" was co-authored by Arturo Casadevall, Quigly Dragotakes, Patrick Johnson, Jonathon Senefeld, Stephen Klassen, R. Scott Wright, Michael Joyner, Nigel Paneth, and Rickey Carter.

There was no specific funding for this study. Individual authors have been supported by the National Institutes of Health (R01 HL059842; R01 AI152078 9; 5R35HL139854). This project has been funded in whole or in part by the Department of Health and Human Services; Office of the Assistant Secretary for Preparedness and Response; Biomedical Advanced Research and Development Authority under Contract No. 75A50120C00096.

 

Largest-ever pre-adolescent brain activation study reveals cognitive function maps

Data from largest study of its kind will clarify risk factors for mental health challenges

LARNER COLLEGE OF MEDICINE AT THE UNIVERSITY OF VERMONT

Research News

Youth brain activation data from the largest longitudinal neuroimaging study to date provides valuable new information on the cognitive processes and brain systems that underlie adolescent development and might contribute to mental and physical health challenges in adulthood. The study was published online today in Nature Neuroscience.

Because of the notable brain, cognitive, and emotional maturation - and emergence of many mental health disorders - that occurs between the ages of 10 and 20, understanding neurodevelopment and how it is impacted by the numerous risk factors that emerge during this timeframe is a critical area of interest. However, to date, most human neuroimaging studies have focused on adult functioning.

The Adolescent Brain Cognitive Development (ABCD) Study, which launched in 2016, is a multisite, 10-year longitudinal study that has enrolled nearly 12,000 youth aged 9 to 10 at 21 research sites around the country.

These latest findings demonstrate which brain regions are involved in a range of important psychological processes, including cognitive control, reward processing, working memory, and social/emotional function.

Using functional magnetic resonance imaging (fMRI) technology, the researchers observed brain activation during a battery of three different tasks and identified how differences in the patterns of activity related to individual differences in these processes.

"This study - likely the biggest task activation paper ever - shows the brain regions activated by each task, how well they capture individual differences, and will likely serve as a baseline for all the subsequent papers that will track the kids as they age," says Hugh Garavan, Ph.D., professor of psychiatry at the University of Vermont, and a senior author on the study.

The brain maps aim to improve scientists' understanding of the psychological processes that put young people at higher risk for developing mental and physical health challenges and, by identifying the brain correlates of factors that influence development, can give guidance on which interventions could help improve outcomes.

"These brain activation maps and spatial reproducibility findings will serve as a gold standard for the neuroscientific community and could help inform study design," says Bader Chaarani, Ph.D., assistant professor of psychiatry at the University of Vermont and the study's first author.

The study's authors state that these brain activation maps will allow for "cross-sectional analyses of inter-individual and group differences," as well as "offer the potential for examining baseline predictors of future development and behavior and for quantifying changes in brain function that may arise from the numerous influences expected to affect development and behavior."

###

Most of the findings are available online at ABCDStudy.org for all scientists to access and use in their research.

NOTE: Data used in the preparation of this article were obtained from the Adolescent Brain Cognitive Development (ABCD) Study, held in the National Institute of Mental Health Data Archive. ABCD consortium investigators designed and implemented the study and/or provided data but did not necessarily participate in the analysis or writing of this report. The views expressed in this manuscript are those of the authors and do not necessarily reflect the official views of the National Institutes of Health, the Department of Health and Human Services, the United States federal government, or ABCD consortium investigators.