Tuesday, June 08, 2021

 

Sensing what plants sense: Integrated framework helps scientists explain biology and predict crop performance

IOWA STATE UNIVERSITY

Research News

AMES, Iowa - Scientists have invested great time and effort into making connections between a plant's genotype, or its genetic makeup, and its phenotype, or the plant's observable traits. Understanding a plant's genome helps plant biologists predict how that plant will perform in the real world, which can be useful for breeding crop varieties that will produce high yields or resist stress.

But environmental conditions play a role as well. Plants with the same genotype will perform differently when grown in different environments. A new study led by an Iowa State University scientist uses advanced data analytics to help scientists understand how the environment interacts with genomics in corn, wheat and oats. The results could lead to more accurate and faster models that will allow plant breeders to develop crop varieties with desirable traits.

The study was published recently in the peer-reviewed academic journal Molecular Plant.

Jianming Yu, a professor of agronomy and the Pioneer Distinguished Chair in Maize Breeding, said the study sheds light on phenotypic plasticity, or the ability of crops to adapt to environmental changes. This could help plant breeders get a better understanding of how "shapable" plant species are, or how much potential they have to perform well in different environments.

"We knew that genetic performance is context dependent. It's not static; it's dependent on environmental conditions," said Xianran Li, an adjunct associate professor and the first author of the study. "Two alleles of a gene can perform differently in one environment but the same in another. What is challenging is to understand the interplay between genes and the environment under natural field conditions. The obvious obstacle is that natural environments are much more complex than controlled laboratory conditions. How can we detect the major signals plants perceive?"

The study made use of previously gathered data on the three crop species from across the globe. A group of 17 scientists from four institutions contributed to the current study, but a much larger group of scientists carried out the initial experiments that generated the data. The dataset included 282 inbred lines of corn evaluated in the United States and Puerto Rico; 288 inbred lines of wheat evaluated in Africa, India and Middle Eastern countries; and 433 inbred populations of oats evaluated in the United States and Canada. The data included environmental conditions such as temperature and availability of sunlight. The phenotypic data analyzed in the study included yields, plant height and flowering time, or the window of time during which the plant reaches the reproductive stage.

Advanced data analytics allowed the researchers to develop an environmental index that extracts the major differentiating pattern among the studied natural field conditions. With this explicit environmental dimension defined, researchers can systematically evaluate how individual genes respond to external signals and collectively shape the varied final performance of an organism.
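The article does not spell out how the environmental index is constructed. Purely as an illustration, one generic way to extract a single dominant differentiating pattern across environments is the leading principal component of standardized environmental descriptors; the sketch below uses made-up numbers and is not the study's method or data.

```python
import numpy as np

# Hypothetical sketch: reduce several environmental descriptors to a
# single "environmental index" via the leading principal component.
# The matrix (environments x variables) is made-up, not study data.
env = np.array([
    [22.1, 6.8],   # mean temperature (C), sunlight hours/day
    [25.4, 7.9],
    [18.3, 5.2],
    [28.0, 8.5],
])

# Standardize each variable, then project onto the first principal axis.
z = (env - env.mean(axis=0)) / env.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
index = z @ vt[0]   # one scalar score per environment

# Environments can now be ordered along this single axis, and genotype
# performance regressed against it.
print(index.round(2))
```

Once each environment has a scalar index value, gene-by-environment effects can be modeled as responses along this one axis rather than against many raw weather variables.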

"It is as if the indiscernible pulses of a plant's perception of the outside conditions now become visible on a monitor screen," said Tingting Guo, a research scientist in agronomy and co-first author of the study.

The study "presents an integrated framework that not only reveals the genetic effect dynamics along an identified environmental index but also enables accurate performance predictions and forecasting," the authors wrote in the paper.

"We are pleased to be able to design such a framework to cover two major research areas, genome-wide association studies and genomic selection (GWAS and GS)," Yu said.

The study found the integrated framework predicted flowering time and plant height accurately, while predictions for yields were more difficult. Li said that's most likely due to how many different environmental parameters, beyond just temperature and sunlight, affect yield at different growth stages. The research team will continue refining its methods to account for more environmental factors in an effort to better predict yields.

Yu and his collaborators first developed their initial data analytics in sorghum but have since expanded their research to include other major global crops. This could help plant scientists design a better plan for finding varieties to test. Yu said applying advanced data analytics to all the available genomic, phenotypic and environmental data can help breeders zero in on varieties they're interested in much faster and more efficiently.

"We believe we have the requisite amount of data to make better predictions about plant performance," Yu said. "Now, we're trying to gain knowledge and wisdom from the data to guide the real-world decision-making process."

###

Funding for the research came from the U.S. Department of Agriculture's National Institute of Food and Agriculture, the U.S. Department of Energy Advanced Research Projects Agency-Energy, the National Science Foundation, the ISU Raymond F. Baker Center for Plant Breeding and the ISU Plant Sciences Institute.

 

Space travel weakens our immune systems: Now scientists may know why

Final study by UCSF astronaut points to Treg cells as the culprit

UNIVERSITY OF CALIFORNIA - SAN FRANCISCO

Research News

Microgravity in space perturbs human physiology and is detrimental to astronaut health, a fact first realized during early Apollo missions, when astronauts experienced inner ear disturbances, heart arrhythmia, low blood pressure, dehydration, and loss of calcium from their bones after their missions.

One of the most striking observations from Apollo missions was that just over half of astronauts became sick with colds or other infections within a week of returning to Earth. Some astronauts have even experienced re-activation of dormant viruses, such as the chickenpox virus. These findings stimulated studies on the effects of weak gravity, or "microgravity," on the immune system, which scientists have been exploring for decades through manned rocket launches, shuttle travel and space station stints, or sometimes by simulating microgravity in earthbound labs.

In the last study led by one of the first women astronauts, Millie Hughes-Fulford, PhD, researchers at UCSF and Stanford University now have shown that the weakening of an astronaut's immune system during space travel is likely due in part to abnormal activation of immune cells called T regulator cells (Tregs).

Tregs normally are triggered to ramp down immune responses when infection no longer threatens, and they are important regulators of immune responses in diseases ranging from cancer to COVID-19. In microgravity conditions, however, the researchers found changes in Tregs that prepared them to go to work even before the immune system was challenged. When they stimulated an immune response in human immune cells from blood samples under microgravity, using a chemical often employed in research to mimic a disease pathogen, they found that Tregs helped suppress the immune response that was triggered. This unanticipated discovery was published online June 7 in the journal Scientific Reports.

Hughes-Fulford became the first female payload specialist to orbit Earth with her experiments in 1991, and for decades, until her death due to leukemia in February, she studied the effects of microgravity on health, first with an emphasis on osteoporosis and later with a focus on the immune system. As a researcher at the San Francisco Veterans Affairs Medical Center and a UCSF faculty member long affiliated with the Department of Medicine, Hughes-Fulford mentored aspiring space scientists, including the co-principal investigators of this latest immunology study.

Jordan Spatz, PhD, a space scientist and UCSF medical student who became co-PI of the study after Hughes-Fulford's death, noted that as space travel becomes increasingly commercialized and more common, concerns over the health status of space travelers are likely to grow.

"Early in the space program, most astronauts were young and extremely healthy, but now they tend to have much more training and are older," Spatz said. "In addition, apart from astronauts, with the commercialization of space flight there will be many more older and less healthy individuals experiencing microgravity. From a space medical perspective, we see that microgravity does a lot of bad things to the human body, and we are hoping to gain the ability to mitigate some of the effects of microgravity during space travel."

The new study advanced earlier research led by Hughes-Fulford, confirming some of her previous findings from experiments in space and in simulated microgravity, while contributing additional molecular discoveries. Hughes-Fulford earlier had found weaker responses from T lymphocytes of the immune system, some of which attack specific pathogens directly and some of which help orchestrate the immune response.

"It's a double whammy," said co-PI Brice Gaudilliere, MD, PhD, an associate professor in the Department of Anesthesia at Stanford University School of Medicine. "There is a dampening of T lymphocyte immune activation responses, but also an exacerbation of immunosuppressive responses by Tregs." The researchers also found that "natural killer" lymphocytes were less active under simulated microgravity, while antibody-producing B cells appeared to be unaffected.

The researchers simulated microgravity in blood samples with a specialized, cylindrical, cell-culture vessel with motor-driven rotation, a long-established microgravity research tool, but the method of single-cell analysis was unique. The scientists identified individual immune cells by specific type and used metal tags and mass spectrometry to simultaneously detect and quantify dozens of proteins that play a role in immune function, in addition to confirming previously identified patterns of altered gene activation.

###

Funding: The study was funded by the UCSF Department of Biochemistry and Biophysics and the Stanford Department of Anesthesiology, Perioperative and Pain Medicine. No competing interests were declared.

About UCSF: The University of California, San Francisco (UCSF) is exclusively focused on the health sciences and is dedicated to promoting health worldwide through advanced biomedical research, graduate-level education in the life sciences and health professions, and excellence in patient care. UCSF Health, which serves as UCSF's primary academic medical center, includes top-ranked specialty hospitals and other clinical programs, and has affiliations throughout the Bay Area. UCSF School of Medicine also has a regional campus in Fresno. Learn more at ucsf.edu, or see our Fact Sheet.


 

New insights into survival of ancient Western Desert peoples

UNIVERSITY OF ADELAIDE

Research News

Researchers at the University of Adelaide have used more than two decades of satellite-derived environmental data to form hypotheses about the possible foraging habitats of pre-contact Aboriginal peoples living in Australia's Western Desert.

As one of the most arid and geographically remote regions of Australia, the Western Desert has always presented severe challenges for human survival. Yet despite the harsh conditions, Aboriginal peoples have maintained an enduring presence, continuously adapting to environmental variations through complex socioeconomic strategies.

In the study published in Scientific Reports, the researchers used Earth Observation data to model the most suitable habitats for traditional foraging activities, identifying where surface water was most abundant and vegetation was greenest to infer which areas of the landscape past Aboriginal peoples were likely to have utilised. The study also drew on previous research into traditional subsistence and settlement practices, enabling researchers to estimate daily foraging range in proximity to water.
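The article does not give the model's parameters. Purely as an illustration of the kind of habitat-suitability surface described, the sketch below combines a greenness measure (e.g., NDVI) with distance to surface water inside a daily foraging radius. The weights, thresholds, and the 10 km radius are hypothetical assumptions, not the study's published values.

```python
import numpy as np

# Hypothetical habitat-suitability score: greener vegetation and closer
# surface water both raise suitability; cells beyond the daily foraging
# radius of water score zero. All parameters are illustrative.

def suitability(ndvi, water_dist_km, forage_radius_km=10.0):
    """Score in [0, 1]; zero outside the daily foraging range of water."""
    ndvi = np.clip(ndvi, 0.0, 1.0)
    within_range = water_dist_km <= forage_radius_km
    water_score = 1.0 - water_dist_km / forage_radius_km
    return np.where(within_range, 0.5 * ndvi + 0.5 * water_score, 0.0)

ndvi = np.array([0.10, 0.40, 0.65])   # one value per landscape cell
dist = np.array([2.0, 8.0, 15.0])     # km to nearest surface water
print(suitability(ndvi, dist))        # third cell is beyond water range
```

Scoring every satellite-image cell this way yields a map of higher- and lower-ranked foraging habitats of the sort the study describes.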

Lead author of the study, Postdoctoral Researcher Dr Wallace Boone Law, says the fine scale of the satellite-based model enabled the team to depict the highly variable nature of environmental conditions, and hence of potential foraging habitats, in the Western Desert.

"Where earlier studies depicted the Western Desert as a relatively uniform environment, our study shows the region to be highly dynamic and variable, both in its environmental conditions and foraging potential," Dr Law said.

"For example, desert dunefields were once thought to have been a periodic barrier to occupation, but our work shows this is not true for all sandridge deserts. Some dunefield areas offer good foraging habitats, particularly amongst interdunal swale areas.

"However, we also found that there are large, impoverished regions of the Western Desert that would have been extremely challenging for survival, based on terrain ruggedness and access to food and water resources.

"We believe it is likely that some of these poorly suited foraging areas would have made survival difficult for the past 21,000 years, and because Aboriginal peoples were highly knowledgeable about the distribution of resources across the Western Desert, we hypothesise those locations were rarely used in the past. Further, we predict that the archaeological record of these difficult habitats will point to ephemeral episodes of occupation.

"We suggest that some low-ranked areas of habitat suitability were resource-poor and not economically attractive to foraging activities, even in the best environmental circumstances," said Dr Law.

The researchers hope that archaeologists can use the study to explore many large areas of the Western Desert that have yet to be thoroughly investigated.

"Our findings highlight how future models of forager land use can be integrated with Earth Observation data to better comprehend the environmental complexity and fine scale of resource variability in these vast, remote and diverse places," said Dr Law.

"We hope our research into the changing environment in pre-contact Australia will assist with fostering a new era of research in partnership with Indigenous communities to provide further understanding of the industrious, versatile and resilient Aboriginal peoples of the Western Desert."


CAPTION: Habitat suitability model.

CREDIT: The paper's researchers.


 

From farm to plate: Where do global consumer dollars flow?

CORNELL UNIVERSITY

Research News

IMAGE: Where the consumer dollars go: farm vs. post-farmgate food value chain.

CREDIT: Cornell University

ITHACA, N.Y. - As soon as an ear of corn is taken off its stalk or a potato is pulled from the ground, it travels anywhere from a few miles to across continents and sometimes undergoes processes that transform it into the food we consume.

These miles and processes contribute to what's known as the food value chain (FVC), along which, as one might expect, the value of the product increases. However, most of the research and attention thus far paid to FVCs occurs at the ends of the chain - inside the farm gate and at the consumer's plate.

Less is understood about all of the other links in the FVC, in part due to a lack of standardized data and methods that can be applied universally. Many studies have examined individual commodities or single countries, but an internationally comparable method of analyzing FVCs has been elusive until now.

A team of researchers - led by Chris Barrett and Miguel Gómez, Cornell University professors in the Charles H. Dyson School of Applied Economics and Management - has developed the "Global Food Dollar" method, which distributes the consumer's net purchasing dollar across all farm and post-farmgate activities.

"The key insight, from my perspective, is that the overwhelming majority of the value addition is happening after the farm gate," Barrett said. "People fall into thinking of food issues as being farm issues. And farm issues are important, but they're comparatively less important than most people realize. And they're becoming steadily less important over time."

Their methodology expands on the U.S. Department of Agriculture's Economic Research Service (ERS) "food dollar series," published annually since 1947 and updated by Patrick Canning in 2011 to include modern inputs.

"People really don't understand how consumer dollars get apportioned, either between owners of land and intellectual property and workers, or between actors in different stages along the value chain," Barrett said. "And they don't know how that differs across countries. ... How much are you likely to get if you're a Sysco and you're thinking of entering a market to mediate wholesale food delivery?"

"We have lots of data on food production and food consumption," Gómez said, "but not much in between. And it's important, because 80% to 85% of the value is created beyond the farm."

For this research, the team used data collected from 2005 to 2015 in 61 countries, representing 90% of the global economy. They found that farmers receive, on average, 27% of consumer expenditure on foods consumed at home, and a far lower share, just 7%, on food consumed away from home (in restaurants, for instance). And as countries' incomes rise, the farmers' share declines.

One of the main takeaways, Barrett said, is that more research is needed on all the links in the middle of the value chain.

"The big economic players in the food we consume aren't actually the primary producers on farms," Barrett said. "So when we think about food issues ... maybe we need to spend a little bit more time thinking about what's happening in that post-farmgate value chain, with the processors, manufacturers, wholesalers, retailers and restaurants."

###

"The Overlooked Magnitude of Post-Farmgate Food Value Chains," published in Nature Food, was written, and the method developed, as part of a partnership with the ERS, led by Canning, a co-author on the study.

 

Do customer loyalty programs really help sellers make money?

New study finds that yes, they do, but not in the ways you may think

INSTITUTE FOR OPERATIONS RESEARCH AND THE MANAGEMENT SCIENCES

Research News

Key Takeaways:

  • Study finds non-tiered customer loyalty programs create a more sustainable customer base.
  • Non-tiered customer loyalty programs are not as likely to generate increases in spending per transaction or accelerate transactions.

CATONSVILLE, MD, June 7, 2021 - Customer loyalty programs have been around for decades and are used to help businesses, marketers and sellers build a sustainable relationship with their customers. But do they work? A recent study sought to find out, and the researchers learned that while customer loyalty programs do work, they may not work in the ways most people assume.

There are two basic types of customer loyalty programs, tiered and non-tiered. Airlines and hotels often use tiered customer loyalty programs that increase rewards as program members reach higher thresholds of spending over time. Retailers and service industry businesses are more likely to offer non-tiered customer loyalty programs, in which members are rewarded with frequent, but not increasing rewards, such as "buy 10 get one free."

This research investigated if those non-tiered customer loyalty programs actually do what they are designed to do.

The study to be published in the June issue of the INFORMS journal Marketing Science, "Can Non-tiered Customer Loyalty Programs Be Profitable?", is authored by Arun Gopalakrishnan of Rice University, Zhenling Jiang of the Wharton School of Business at the University of Pennsylvania, and Yulia Nevskaya and Raphael Thomadsen of the Olin Business School at Washington University in St. Louis.

The authors found that non-tiered customer loyalty programs increase customer value by almost 30% over a five-year time period. They discovered that the program's effectiveness is not so much through increased spending per transaction or frequency of purchasing but rather through the reduction of attrition. In other words, the chief benefit is that the customer loyalty program reduces customer fall-off and turnover.

"We found that a non-tiered customer loyalty program's reduction in attrition accounts for more than 80% of the program's total lift or success," said Thomadsen. "On the other hand, increased frequency accounts for less than 20% of the program's lift or effectiveness."
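As a back-of-the-envelope illustration of why reduced attrition dominates (with made-up numbers, not the study's data), a modest cut in annual attrition compounds into a large lift in multi-year customer value even when spending per year is unchanged:

```python
# Made-up numbers, not the study's data: a constant-attrition model of
# expected customer spend over a five-year horizon.

def five_year_value(annual_spend, annual_attrition):
    """Expected spend over 5 years with a constant attrition hazard."""
    retention = 1.0 - annual_attrition
    return sum(annual_spend * retention**year for year in range(5))

base = five_year_value(annual_spend=100.0, annual_attrition=0.30)
with_program = five_year_value(annual_spend=100.0, annual_attrition=0.20)

lift = (with_program - base) / base
print(f"{lift:.0%}")   # roughly a 21% lift from retention alone
```

The point of the sketch: no per-transaction spending increase is needed for a sizable value lift, consistent with the attrition-driven effect the authors describe.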

Jiang added, "One of the more interesting findings was that the impact of the loyalty program does not necessarily contribute to increased spending per transaction or increased frequency of transactions. Rather, the benefit to the business is creating more sustainable and lasting relationships with customers."

To conduct their research, the authors worked with a company to collect data on more than 5,500 new customers who first started purchasing from that company in the same three-month period. This helped to ensure that the customers were comparable in terms of the amount of time they had to become acquainted with the selling firm. For the next 30 months, the researchers collected all subsequent transaction data from those consumers. During that period, a non-tiered customer loyalty program was introduced.

In the process, some of these new customers were automatically enrolled into the loyalty program. This helped researchers better gauge pre-program visit frequency and spending and then compare it to post-enrollment visit frequency and spending. "We were able to analyze the behaviors of consumers absent a customer loyalty program, and then after the rollout of the program," said Nevskaya. "We evaluated frequency and actual spending amounts, and whether customers come back for repeat transactions."

Gopalakrishnan summarized, "In the end, the primary value of a non-tiered customer loyalty program is not a means to increase frequency or spending. It's a way to nurture a long-term and lasting relationship with the customer to reduce the defection of loyal customers over time. Non-tiered loyalty programs may provide psychological benefits that help cultivate such loyalty."

###

About INFORMS and Marketing Science

Marketing Science is a premier peer-reviewed scholarly marketing journal focused on research using quantitative approaches to study all aspects of the interface between consumers and firms. It is published by INFORMS, the leading international association for operations research and analytics professionals. More information is available at http://www.informs.org or @informs.

 

Feedback on cafeteria purchases helps employees make healthier food choices

A randomized clinical trial tested an automated, personalized intervention to improve health in hospital staff

MASSACHUSETTS GENERAL HOSPITAL

Research News

BOSTON - Automated emails and letters that provide personalized feedback related to cafeteria purchases at work may help employees make healthier food choices. That's the conclusion of a new study that was led by investigators at Massachusetts General Hospital (MGH) and is published in JAMA Network Open.

As many adults spend half (and sometimes more) of their waking hours working, the workplace provides a unique opportunity to promote health with programs that target obesity, unhealthy diets, and other risk factors for chronic diseases and premature death.

Building on findings from previous studies, researchers designed the ChooseWell 365 clinical trial to test a 12-month automated, personalized behavioral intervention to prevent weight gain and improve diet in hospital employees. For the trial, 602 MGH employees who regularly used the hospital's cafeterias were randomized to an intervention group or a control group. For one year, participants in the intervention group received two emails per week that included feedback on their previous cafeteria purchases and offered personalized health and lifestyle tips. They also received one letter per month with comparisons of their purchases with those of their peers, as well as financial incentives for healthier purchases. Control participants received one letter per month with general healthy lifestyle information.

"This novel workplace strategy was completely automated and did not require that people take time away from work to participate, making it ideal for busy hospital employees," explains lead author Anne N. Thorndike, MD, MPH, an investigator in the Division of General Internal Medicine at MGH and an associate professor of Medicine at Harvard Medical School.

Participants in the intervention group increased their healthy cafeteria food purchases to a greater extent than participants in the control group. They also purchased fewer calories per day. These differences were observed during the one-year intervention as well as during a year of additional assessments. There were no differences between the groups in terms of weight change at 12 or 24 months, however.

"Few if any prior workplace studies have been able to make sustained changes in dietary choices of employees," says Thorndike. "This study provides evidence that food purchasing data can be leveraged for delivering health promotion interventions at scale."

###

Co-authors include Jessica L. McCurley, PhD, Emily D. Gelsomin, MLA, RD, LDN, Emma Anderson, BA, Yuchiao Chang, PhD, Bianca Porneala, MS, Charles Johnson, BA, and Douglas E. Levy, PhD, all of MGH, and Eric B. Rimm, ScD, of the Harvard T.H. Chan School of Public Health.

This work was supported by the National Institutes of Health.

About the Massachusetts General Hospital

Massachusetts General Hospital, founded in 1811, is the original and largest teaching hospital of Harvard Medical School. The Mass General Research Institute conducts the largest hospital-based research program in the nation, with annual research operations of more than $1 billion, and comprises more than 9,500 researchers working across more than 30 institutes, centers and departments. In August 2020, Mass General was named #6 in the U.S. News & World Report list of "America's Best Hospitals."

 

Drop in convalescent plasma use at US hospitals linked to higher COVID-19 mortality rate

Analysis suggests decline in convalescent plasma use in US hospitals from November 2020 to February 2021 may have led to as many as 29,000 excess COVID-19 deaths

JOHNS HOPKINS UNIVERSITY BLOOMBERG SCHOOL OF PUBLIC HEALTH

Research News

A new study from researchers at Johns Hopkins Bloomberg School of Public Health and colleagues suggests a slowdown in the use of convalescent plasma to treat hospitalized COVID-19 patients led to higher COVID-19 mortality during a critical period of this past winter's surge.

U.S. hospitals began treating COVID-19 patients with convalescent plasma therapy--which uses antibody-rich blood from recovered COVID-19 patients--in the summer of 2020 when doctors were looking to identify treatments for the emerging disease. By the spring of 2021, doctors in the United States had treated over 500,000 COVID-19 patients with convalescent plasma. The use of convalescent plasma started declining late in 2020 after several large clinical trials showed no apparent benefit.

The researchers' analysis suggests that the decline in convalescent plasma use might have led to more than 29,000 excess COVID-19 deaths from November 2020 to February 2021.

The study was published online June 4 in the journal eLife.

"Clinical trials of convalescent plasma use in COVID-19 have had mixed results, but other studies, including this one, have been consistent with the idea that it does reduce mortality," says study senior author Arturo Casadevall, MD, PhD, Alfred and Jill Sommer Professor and Chair of the Department of Molecular Microbiology and Immunology at the Bloomberg School.

The study was done in collaboration with researchers at Michigan State University and the Mayo Clinic. Casadevall and colleagues observed that while plasma use was declining late last year, the reported COVID-19 patient mortality rate was rising. That led them to hypothesize that the two phenomena were related.

In the study, the researchers compared the number of units of plasma distributed to U.S. hospitals from blood banks, on a per patient basis, to the number of reported COVID-19 deaths per hospital admission across the country.

One finding was that while the total use of plasma peaked last December and January during the winter surge in new COVID-19 patients, the use per hospitalized patient peaked in early October 2020--just as deaths per COVID-19 hospital admission bottomed. Thereafter, in the wake of reports of negative results from clinical trials, use of plasma per hospitalized patient fell sharply--and deaths per COVID-19 hospital admission rose.

The researchers analyzed the relationship between these two datasets and found a strong negative correlation: higher use rates were associated with lower mortality, and vice versa. They also grouped periods of plasma use into five "quintile" groupings from lowest-use weeks to highest, and found a graded relationship between less use and higher mortality.

A model the researchers generated to fit the data suggested that the COVID-19 case fatality rate decreased by 1.8 percentage points for every 10-percentage point increase in the rate of plasma use. That model implied that there would have been 29,018 fewer deaths, from November 2020 to February 2021, if the peak use rate of early October had held. Moreover, it suggested that the use of plasma on the whole, as limited as it was, prevented about 95,000 deaths through early March of this year.
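The dose-response arithmetic above can be sketched as a simple linear model. The 1.8-per-10 slope is the figure quoted from the paper; the baseline rates and admission counts below are hypothetical, for illustration only.

```python
# Linear dose-response sketch: a 10-percentage-point rise in plasma use
# per admission is associated with a 1.8-percentage-point drop in case
# fatality rate (CFR), per the model described in the article.
SLOPE = -1.8 / 10.0   # change in CFR (pp) per 1 pp change in plasma use

def projected_cfr(baseline_cfr_pp, baseline_use_pp, new_use_pp):
    """Project CFR (percentage points) under a different plasma use rate."""
    return baseline_cfr_pp + SLOPE * (new_use_pp - baseline_use_pp)

# Hypothetical example: 9% CFR at 20% plasma use; what if use had held
# at a 40% peak?
cfr_at_peak_use = projected_cfr(9.0, 20.0, 40.0)

def excess_deaths(admissions, actual_cfr_pp, counterfactual_cfr_pp):
    """Deaths attributable to the CFR difference across all admissions."""
    return admissions * (actual_cfr_pp - counterfactual_cfr_pp) / 100.0

print(round(cfr_at_peak_use, 2))                        # 5.4
print(round(excess_deaths(100_000, 9.0, cfr_at_peak_use)))  # 3600
```

Applied week by week to real admission counts and use rates, this is the style of counterfactual that yields the paper's 29,018-death estimate; the numbers here are only an illustration of the arithmetic.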

The researchers analyzed, and then rejected, the possibility that several other factors could explain away the link between less plasma use and more mortality. These factors included changes in the average age of hospitalized patients, and the emergence of new variants of the COVID-19-causing coronavirus.

As for why some clinical trials found no benefit for plasma use, the researchers note in their paper that many of the clinical trials with negative results had used plasma--mainly considered an antiviral treatment--relatively late in the course of COVID-19, when patients may have been too ill to benefit, and when the disease is driven mainly by immune-related responses rather than the coronavirus itself.

Casadevall notes that convalescent plasma remains under FDA Emergency Use Authorization in the U.S., and that it is readily available. "We hope that physicians, policymakers, and regulators will consider the totality of the available evidence, including our findings, when making decisions about convalescent plasma use in individual COVID-19 patients," Casadevall says.

###

"Convalescent Plasma Use in the United States was inversely correlated with COVID-19 Mortality" was co-authored by Arturo Casadevall, Quigly Dragotakes, Patrick Johnson, Jonathon Senefeld, Stephen Klassen, R. Scott Wright, Michael Joyner, Nigel Paneth, and Rickey Carter.

There was no specific funding for this study. Individual authors have been supported by the National Institutes of Health (RO1 HL059842; R01 AI152078 9; 5R35HL139854). This project has been funded in whole or in part by the Department of Health and Human Services; Office of the Assistant Secretary for Preparedness and Response; Biomedical Advanced Research and Development Authority under Contract No. 75A50120C00096.

 

Largest-ever pre-adolescent brain activation study reveals cognitive function maps

Data from largest study of its kind will clarify risk factors for mental health challenges

LARNER COLLEGE OF MEDICINE AT THE UNIVERSITY OF VERMONT

Research News

Youth brain activation data from the largest longitudinal neuroimaging study to date provides valuable new information on the cognitive processes and brain systems that underlie adolescent development and might contribute to mental and physical health challenges in adulthood. The study was published online today in Nature Neuroscience.

Because of the notable brain, cognitive, and emotional maturation - and emergence of many mental health disorders - that occurs between the ages of 10 and 20, understanding neurodevelopment and how it is impacted by the numerous risk factors that emerge during this timeframe is a critical area of interest. However, to date, most human neuroimaging studies have focused on adult functioning.

The Adolescent Brain Cognitive Development (ABCD) Study, which launched in 2016, is a multisite, 10-year-long longitudinal study that has enrolled nearly 12,000 youth aged 9 to 10 at 21 research sites around the country.

These latest findings demonstrate which brain regions are involved in a range of important psychological processes, including cognitive control, reward processing, working memory, and social/emotional function.

Using functional magnetic resonance imaging (fMRI) technology, the researchers observed brain activation during a battery of three different tasks and identified how differences in the patterns of activity related to individual differences in these processes.

"This study - likely the biggest task activation paper ever - shows the brain regions activated by each task, how well they capture individual differences, and will likely serve as a baseline for all the subsequent papers that will track the kids as they age," says Hugh Garavan, Ph.D., professor of psychiatry at the University of Vermont, and a senior author on the study.

The brain maps aim to improve scientists' understanding of the psychological processes that put young people at higher risk for developing mental and physical health challenges and, by identifying the brain correlates of factors that influence development, can give guidance on which interventions could help improve outcomes.

"These brain activation maps and spatial reproducibility findings will serve as a gold standard for the neuroscientific community and could help inform study design," says Bader Chaarani, Ph.D., assistant professor of psychiatry at the University of Vermont and the study's first author.

The study's authors state that these brain activation maps will allow for "cross-sectional analyses of inter-individual and group differences," as well as "offer the potential for examining baseline predictors of future development and behavior and for quantifying changes in brain function that may arise from the numerous influences expected to affect development and behavior."

###

Most of the findings are available online at ABCDStudy.org for all scientists to access and use in their research.

NOTE: Data used in the preparation of this article were obtained from the Adolescent Brain Cognitive Development (ABCD) Study, held in the National Institute of Mental Health Data Archive. ABCD consortium investigators designed and implemented the study and/or provided data but did not necessarily participate in analysis or writing of this report. The views expressed in this manuscript are those of the authors and do not necessarily reflect the official views of the National Institutes of Health, the Department of Health and Human Services, the United States federal government, or ABCD consortium investigators.


Climate change a bigger threat to landscape biodiversity than emerald ash borer

PENN STATE

Research News

"We really wanted to focus on isolating the impact of the emerald ash borer on biodiversity, forest composition, biomass and other factors," said Stacey Olson, program coordinator and legal assistant at Resources Legacy Fund. Olson completed the research as part of her master's thesis at Penn State. "We found that emerald ash borer and its impact on ash trees has serious implications for forest change at the site level, but at the broad landscape level, the climatic changes over the next century were much more important in terms of forest composition and species diversity."

The researchers used a forest simulation model to examine the effects of the emerald ash borer and climate change on a forested area of northeast Wisconsin through the year 2100. The area includes the Menominee Reservation. The Menominee Indian Tribe of Wisconsin has been sustainably harvesting timber from the forest for more than 150 years. The scientists reported their findings in the journal Ecosystems.

The model took into account how trees grow, disperse seeds, die and interact with disturbances such as climatic changes. It also accounted for emerald ash borer infestation and pre-emptive ash tree removal, an Indigenous forest management strategy.

"When we run the model, all of these components work simultaneously and interact with one another across the landscape and across time," said Olson, who was also an Environmental Scholar in the Earth and Environmental Systems Institute (EESI) at Penn State. "The model gives us a picture of what all these interacting disturbance and succession processes look like."

The researchers looked at moderate and high-level climate change scenarios based on a business-as-usual approach that fails to curb greenhouse gas emissions within the next decade. The landscape itself partly drove the team's decision to study these scenarios, Olson said. The forest sits in what scientists call a tension zone, where vegetation, soil type and climate variability can shift quickly as one moves across the landscape. In these areas, changes in climate can result in drastic changes on the ground.

The model showed a shift in the types of trees present in the study area by 2100. Northern hardwoods, like beech and birch trees, decreased by approximately 12%, to 35% of the total biomass of all species in the forest. Under climate change conditions, southern hardwoods, like black cherry trees, increased from 4% to 23% by mid-century and became the dominant species in the southern part of the study area.

The model showed that in some areas, the emerald ash borer would completely remove ash trees, clearing the way for other species to replace them. Ash, however, is not the dominant species in the forest, and its removal cannot account for the large shift in tree composition from northern to southern hardwoods. The researchers attributed this shift to climatic changes, such as warmer temperatures and periods of drought or water scarcity, identified in the models.

The study led by Olson is part of a larger project, Visualizing Forest Futures (ViFF), led by Erica Smithwick, distinguished professor of geography and EESI associate, and done in collaboration with the Menominee. The project combines Indigenous forest-management practices with cutting-edge modeling and visualization techniques to better understand the connections between human values and forests and how to sustainably manage forest resources.

"Our project seeks to guide decision-making about forest management strategies while also accounting for uncertainties in forest changes under future climates," Smithwick said. "Stacey's paper provides key information to help guide that process."

The team's research demonstrates the importance of focusing more resources on climate change mitigation rather than on a very specific, targeted threat like the emerald ash borer.

"Some of the research I found said that the ash will likely become functionally extinct within the next couple of decades," said Olson. "While this is clearly a serious problem that deserves attention, a big takeaway for me is that climate is an even larger driver of forest change over the next century, and addressing these challenges must be a top priority."

###

Also participating in this research were Melissa Lucash, University of Oregon; Robert Scheller, North Carolina State University; Robert Nicholas, Penn State; Kelsey Ruckert, RPS Ocean Science and formerly a scientific programmer at Penn State when the research was conducted; and Christopher Caldwell, College of Menominee Nation.

The National Science Foundation; NASA Pennsylvania Space Grant Consortium; and Penn State, through EESI, the Center for Landscape Dynamics, the E. Willard Miller Award in Geography, the Herbert and Mary B. Hughes Fund and the Department of Geography, supported this research.

 

Monoclonal antibody prevents HIV infection in monkeys, study finds

Leronlimab to be studied as potential HIV PrEP drug in an early human clinical trial

OREGON HEALTH & SCIENCE UNIVERSITY

Research News

An experimental, lab-made antibody can completely prevent nonhuman primates from being infected with the monkey form of HIV, new research published in Nature Communications shows.

The results will inform a future human clinical trial evaluating leronlimab as a potential pre-exposure prophylaxis, or PrEP, therapy to prevent human infection from the virus that causes AIDS.

"Our study findings indicate leronlimab could be a new weapon against the HIV epidemic," said the study's lead researcher and co-corresponding author of this paper, Jonah Sacha, Ph.D., an Oregon Health & Science University professor at OHSU's Oregon National Primate Center and Vaccine & Gene Therapy Institute.

"The results of this pre-clinical study, targeting the HIV co-receptor CCR5, have the potential to be groundbreaking as we essentially have a tool that can mimic the genetic mutations of CCR5 that render some individuals immune to infection and have led in part to two cases of a cure of HIV," said the other co-corresponding author, Lishomwa Ndhlovu, M.D., Ph.D., a professor of immunology in medicine at Weill Cornell Medicine in New York.

Made by Vancouver, Washington-based CytoDyn, the monoclonal antibody blocks HIV from entering immune cells through a surface protein called CCR5. The injectable drug has already been studied in a Phase 3 trial as a potential treatment for people living with HIV when used in combination with standard antiretroviral medications. CytoDyn is in the process of submitting information to the FDA to request its approval for that use. This study, however, specifically examined preventing HIV infection to begin with.

Some PrEP drugs are already available, but they can lead to adverse side effects such as liver, heart and bone problems, and genetic mutations in HIV can render the virus resistant to them. Existing PrEP options also typically require frequent use, such as a daily pill, or are infusions that must be given in a clinic. Leronlimab is designed to be a self-administered injection.

To study leronlimab's effectiveness as a potential PrEP drug, the research team created three groups of six rhesus macaques at OHSU's Oregon National Primate Research Center. Two groups received different doses of leronlimab, while the third served as a control that didn't receive the experimental drug.

Macaques that received the higher dose of 50 milligrams per kilogram of the animal's weight every other week were completely protected from the monkey form of HIV. In contrast, two of the animals that received the lower dose of 10 milligrams per kilogram per week became infected, and every animal in the control group became infected. Researchers concluded the low-dose group's partial protection was likely due to monkey immune responses against the human antibody.

Following this study's results, CytoDyn is planning to conduct an early clinical trial investigating leronlimab as a potential PrEP drug in people within the next year. Human doses would likely be lower than those given in this study, as rhesus macaque cells have more surface CCR5 protein than humans.

In the meantime, Sacha is already trying to make leronlimab easier to use. He received a five-year, $3 million NIH grant in August 2020 to develop a concentrated, longer-lasting formulation of leronlimab that could allow it to be injected every three months. Less-frequent injections can increase adherence to a drug regimen and thereby improve drug effectiveness.

The research team dedicated this study to Timothy Ray Brown, who died Sept. 29, 2020, and was known as the Berlin patient for being the first person to be cured of HIV. While living in Berlin in 2007, Brown underwent a bone marrow transplant to treat his blood cancer. The procedure eliminated HIV in Brown because the transplanted bone marrow came from a donor who had a rare mutation that eliminated the CCR5 gene, which makes the surface protein through which HIV enters cells. Sacha became friends with Brown after meeting him at an AIDS conference in 2015. Brown is also a co-author on the paper and inspired the scientists working on this research.

###

REFERENCE: Xiao L. Chang, Gabriela M. Webb, Helen L. Wu, Justin M. Greene, Shaheed Abdulhaqq, Kathrine B. Bateman, Jason S. Reed, Cleiton Pessoa, Whitney C. Weber, Nicholas Maier, Glen M. Chew, Roxanne M. Gilbride, Lina Gao, Rebecca Agnor, Travis Giobbi, Jeffrey Torgerson, Don Siess, Nicole Burnett, Miranda Fischer, Oriene Shiel, Cassandra Moats, Bruce Patterson, Kush Dhody, Scott Kelly, Nader Pourhassan, Diogo M. Magnani, Jeremy Smedley, Benjamin N. Bimber, Nancy L. Haigwood, Scott G. Hansen, Timothy R. Brown, Lishomwa C. Ndhlovu, Jonah B. Sacha, Antibody-based CCR5 Blockade Protects Macaques from Mucosal SHIV Transmission, Nature Communications, June 7, 2021, DOI: 10.1038/s41467-021-23697-6, https://www.nature.com/articles/s41467-021-23697-6.

This research was supported by the National Institute of Allergy and Infectious Diseases (grants R01 AI129703, R01 AI54559, R21 AI54559, K01 OD026561, U24 AI126683) and the National Institutes of Health's Office of the Director (Oregon National Primate Research Center Core grant P51 OD011092).

In the interest of ensuring the integrity of our research and as part of our commitment to public transparency, OHSU actively regulates, tracks and manages relationships that our researchers may hold with entities outside of OHSU. Regarding this research, Jonah Sacha has a significant financial interest in CytoDyn, a company that may have a commercial interest in the results of this research and technology. Additionally, Dr. Lishomwa Ndhlovu receives an annual stock option grant to purchase CytoDyn common stock as a member of CytoDyn's Scientific Advisory Board.
