Saturday, August 17, 2024

SPACE

Galaxies in dense environments tend to be larger, settling one cosmic question and raising others




University of Washington

Image: Images of galaxies of a variety of shapes and sizes. New research shows that galaxies with more nearby neighbors tend to be larger. Credit: NAOJ/NASA/ESA/CSA




Link to release:

https://www.washington.edu/news/2024/08/14/galaxy-size/

 

FROM: James Urton

University of Washington

206-543-2580

jurton@uw.edu 

(Note: researcher contact information at the end)

For decades, scientists have known that some galaxies reside in dense environments with lots of other galaxies nearby. Others drift through the cosmos essentially alone, with few or no other galaxies in their corner of the universe.

A new study has found a major difference between galaxies in these divergent settings: Galaxies with more neighbors tend to be larger than counterparts of similar shape and mass that reside in less dense environments. In a paper published Aug. 14 in the Astrophysical Journal, researchers at the University of Washington, Yale University, the Leibniz Institute for Astrophysics Potsdam in Germany and Waseda University in Japan report that galaxies found in denser regions of the universe are as much as 25% larger than isolated galaxies.

The research, which used a new machine-learning tool to analyze millions of galaxies, helps resolve a long-standing debate among astrophysicists over the relationship between a galaxy’s size and its environment. The findings also raise new questions about how galaxies form and evolve over billions of years.

“Current theories of galaxy formation and evolution cannot adequately explain the finding that clustered galaxies are larger than their identical counterparts in less dense regions of the universe,” said lead author Aritra Ghosh, a UW postdoctoral researcher in astronomy and an LSST-DA Catalyst Fellow with the UW’s DiRAC Institute. “That’s one of the most interesting things about astrophysics. Sometimes what the theories predict we should find and what a survey actually finds are not in agreement, and so we go back and try to modify existing theories to better explain the observations.”

Past studies that looked into the relationship between galaxy size and environment came up with contradictory results. Some determined that galaxies in clusters were smaller than isolated galaxies. Others came to the opposite conclusion. The studies were generally much smaller in scope, based on observations of hundreds or thousands of galaxies.

In this new study, Ghosh and his colleagues utilized a survey of millions of galaxies conducted using the Subaru Telescope in Hawaii. This endeavor, known as the Hyper Suprime-Cam Subaru Strategic Program, took high-quality images of each galaxy. The team selected approximately 3 million galaxies with the highest-quality data and used a machine learning algorithm to determine the size of each one. Next, the researchers essentially placed a circle — one with a radius of 30 million light years — around each galaxy. The circle represents the galaxy’s immediate vicinity. They then asked a simple question: How many neighboring galaxies lie within that circle?

The answer showed a clear general trend: Galaxies with more neighbors were also on average larger.

There could be many reasons why. Perhaps densely clustered galaxies are simply larger when they first form, or are more likely to undergo efficient mergers with close neighbors. Perhaps dark matter — that mysterious substance that makes up most of the matter in the universe, yet cannot be detected directly by any current means — plays a role. After all, galaxies form within individual “halos” of dark matter and the gravitational pull from those halos plays a critical role in how galaxies evolve.

“Theoretical astrophysicists will have to perform more comprehensive studies using simulations to conclusively establish why galaxies with more neighbors tend to be larger,” said Ghosh. “For now, the best we can say is that we’re confident that this relationship between galaxy environment and galaxy size exists.”

Utilizing an incredibly large dataset like the Hyper Suprime-Cam Subaru Strategic Program helped the team reach a clear conclusion. But that’s only part of the story. The novel machine learning tool they used to help determine the size of each individual galaxy also accounted for inherent uncertainties in the measurements of galaxy size.

“One important lesson we had learned prior to this study is that settling this question doesn’t just require surveying large numbers of galaxies,” said Ghosh. “You also need careful statistical analysis. A part of that comes from machine learning tools that can accurately quantify the degree of uncertainty in our measurements of galaxy properties.”

The machine learning tool that they used is called GaMPEN — or Galaxy Morphology Posterior Estimation Network. As a doctoral student at Yale, Ghosh led development of GaMPEN, which was unveiled in papers published in 2022 and 2023 in the Astrophysical Journal. The tool is freely available online and could be adapted to analyze other large surveys, said Ghosh.

Though this new study focuses on galaxies, it also forecasts the types of research — centered on complex analyses of incredibly large datasets — that will soon take astronomy by storm. When a new generation of telescopes with powerful cameras, including the Vera C. Rubin Observatory in Chile, comes online, these observatories will collect massive amounts of data on the cosmos every night. In anticipation, scientists have been developing new tools like GaMPEN that can utilize these large datasets to answer pressing questions in astrophysics.

“Very soon, large datasets will be the norm in astronomy,” said Ghosh. “This study is a perfect demonstration of what you can do with them — when you have the right tools.”

Co-authors on the study are Meg Urry, professor of physics and of astronomy at Yale; Meredith Powell, a research fellow with the Leibniz Institute; Rhythm Shimakawa, associate professor at Waseda University; Frank van den Bosch, a Yale professor of astronomy; Daisuke Nagai, professor of physics and of astronomy at Yale; Kaustav Mitra, a doctoral student at Yale; and Andrew Connolly, professor of astronomy at the UW and faculty member in the DiRAC Institute and the eScience Institute. The research was funded by NASA, the Yale Graduate School of Arts & Sciences, the John Templeton Foundation, the Charles and Lisa Simonyi Fund for Arts and Sciences, the Washington Research Foundation and the UW eScience Institute.

Grant numbers

  • NASA: 80NSSC23K0488
  • John Templeton Foundation: 62192

Chicxulub impactor was a carbonaceous-type asteroid from beyond Jupiter




American Association for the Advancement of Science (AAAS)





Scientists have pinpointed the origin and composition of the asteroid that caused the mass extinction 66 million years ago, revealing it was a rare carbonaceous asteroid from beyond Jupiter, according to a new study. The findings help resolve long-standing debates about the nature of the Chicxulub impactor, reshaping our understanding of Earth's history and the extraterrestrial rocks that have collided with it. Earth has experienced several mass extinction events. The most recent occurred 66 million years ago at the boundary between the Cretaceous and Paleogene periods (K-Pg boundary) and resulted in the loss of roughly 60% of the planet’s species, including non-avian dinosaurs. The Chicxulub impactor, a massive asteroid that collided with Earth in what is now the Gulf of Mexico, is thought to have played a key role in this extinction event. Evidence includes high levels of platinum-group elements (PGEs) like iridium, ruthenium, osmium, rhodium, platinum, and palladium in K-Pg boundary layers, which are rare on Earth but common in meteorites. These elevated PGE levels have been found globally, suggesting the impact spread debris worldwide. While some propose large-scale volcanic activity from the Deccan Traps igneous province as an alternative source of PGEs, the specific PGE ratios at the K-Pg boundary align more with asteroid impacts than volcanic activity. However, much about the nature of the Chicxulub impactor – its composition and extraterrestrial origin – remains poorly understood.

To address these questions, Mario Fischer-Gödde and colleagues evaluated ruthenium (Ru) isotopes in samples taken from the K-Pg boundary. For comparison, they also analyzed samples from five other asteroid impacts from the last 541 million years, samples from ancient Archean-age (3.5 to 3.2 billion years old) impact-related spherule layers, and samples from two carbonaceous meteorites. Fischer-Gödde et al. found that the Ru isotope signatures in samples from the K-Pg boundary were uniform and closely matched those of carbonaceous chondrites (CCs), not Earth or other meteorite types, suggesting that the Chicxulub impactor likely came from a C-type asteroid that formed in the outer Solar System. They also rule out a comet as the impactor. Ancient Archean samples also suggest impactors with a CC-like composition, indicating a similar outer Solar System origin and perhaps representing material that impacted during Earth’s final stages of accretion. In contrast, other impact sites from different periods showed Ru isotope compositions consistent with S-type (siliceous) asteroids from the inner Solar System.

Podcast: A segment of Science's weekly podcast, related to this research, will be available on the Science.org podcast landing page after the embargo lifts. Reporters are free to make use of the segments for broadcast purposes and/or quote from them – with appropriate attribution (i.e., cite "Science podcast"). Please note that the file itself should not be posted to any other Web site.

 

Methamphetamine-involved psychiatric hospitalizations have increased, study says



While most psychiatric hospitalizations did not involve substances, methamphetamine-related encounters increased while opioid-involved encounters decreased



University of Colorado Anschutz Medical Campus





AURORA, Colo. (August 16, 2024) – A new study, out now in Drug and Alcohol Dependence, detailing trends in psychiatric hospitalizations between 2015 and 2019 finds that while most hospitalizations did not involve any substances, methamphetamine-related hospitalizations increased even as the overall number of psychiatric hospitalizations remained stable.

Additionally, the researchers detail that psychiatric hospitalizations caused by methamphetamine use were highest in the Mountain West region but were also shifting geographically. “Rates of methamphetamine-involved psychiatric hospitalizations were by far the highest in the Mountain West. As expected, this mirrors rates of self-reported methamphetamine use and methamphetamine-related overdose deaths in the Mountain West,” says Susan Calcaterra, MD, MPH, professor at the University of Colorado Anschutz Medical Campus and study lead author. “Psychiatric hospitalizations involving methamphetamine use are really taking off in the Midwest and Northeast, in particular.”

While rates of methamphetamine-related psychiatric hospitalizations increased 68% over the study period, opioid-related hospitalizations decreased by 22%. The methamphetamine increases may be attributed to the drug’s ubiquity and affordability, as well as the lack of resources available to manage methamphetamine use. Why opioid-involved psychiatric hospitalizations declined is less clear, but it may be related to the lethality of fentanyl.

“An important takeaway from this study is the need for resources to address the mental and physical treatment of methamphetamine use,” says Calcaterra.

“While the vast majority of psychiatric hospitalizations in this timeframe did not involve substance use, the significant increase in methamphetamine use means we have to better consider harm reduction in clinical settings. Evidence-based interventions such as contingency management, which involves offering incentives for abstinence; harm reduction education; provision of naloxone for overdose reversal; and access to expanded mental health treatments are proven to help mitigate dangerous effects from methamphetamine use, especially when it is contaminated with fentanyl, much like the campaigns aimed at raising public awareness around opioid use.”

About the University of Colorado Anschutz Medical Campus

The University of Colorado Anschutz Medical Campus is a world-class medical destination at the forefront of transformative science, medicine, education and patient care. The campus encompasses the University of Colorado health professional schools, more than 60 centers and institutes, and two nationally ranked independent hospitals – UCHealth University of Colorado Hospital and Children’s Hospital Colorado – which see more than 2 million adult and pediatric patient visits yearly. Innovative, interconnected and highly collaborative, the University of Colorado Anschutz Medical Campus delivers life-changing treatments, patient care and professional training and conducts world-renowned research fueled by $705 million in research grants. For more information, visit www.cuanschutz.edu.

 

Knocking out one key gene leads to autistic traits


RFK JR. WAS WRONG, AGAIN!

Rockefeller University
Image: Purkinje cells in the cerebellum, stained and magnified 63 times, revealing fine details of the dendritic spines. Credit: Laboratory of Developmental Neurobiology at The Rockefeller University




More than 70 genes have been linked to autism spectrum disorder (ASD), a developmental condition in which differences in the brain lead to a host of altered behaviors, including issues with language, social communication, hyperactivity, and repetitive movements. Scientists are attempting to tease out those specific associations gene by gene, neuron by neuron.

One such gene is Astrotactin 2 (ASTN2). In 2018, researchers from the Laboratory of Developmental Neurobiology at Rockefeller University discovered how defects in the protein produced by the gene disrupted circuitry in the cerebellum in children with neurodevelopmental conditions.

Now the same lab has found that knocking out the gene entirely leads to several hallmark behaviors of autism. As they describe in a new paper in PNAS, mice that lacked ASTN2 showed distinctly different behaviors from their wild-type nestmates in four key ways: they vocalized and socialized less but were more hyperactive and repetitive in their behavior.

“All of these traits have parallels in people with ASD,” says Michalina Hanzel, first author of the paper. “Alongside these behaviors, we also found structural and physiological changes in the cerebellum.”

“It’s a big finding in the field of neuroscience,” says lab lead Mary E. Hatten, whose work has focused on this brain region for decades. “It also underscores this emerging story that the cerebellum has cognitive functions that are quite independent of its motor functions.”

An unexpected role

In 2010, Hatten’s lab discovered that proteins produced by the ASTN2 gene help guide neurons as they migrate during the development of the cerebellum and form its structure. In the 2018 study, they examined a family in which three children had both neurodevelopmental disorders and ASTN2 mutations. They found that in a developed brain, the proteins have a similar guiding role: they keep the chemical conversation between neurons going by ushering receptors off the neural surfaces to make room for new receptors to rotate in. When the gene is mutated, the proteins fail to act and the receptors pile up, resulting in a traffic jam that hinders neuronal connections and communication. This impact could be seen in the children’s afflictions, which included intellectual disability, language delays, ADHD, and autism.

The find was part of a growing body of evidence that the cerebellum—the oldest cortical structure in the brain—is important not just for motor control but also for language, cognition, and social behavior.

For the current study, Hanzel wanted to see what effects a total absence of the ASTN2 gene might have on cerebellar structure and on behavior. Collaborating with study co-author Zachi Horn, a former postdoc in the Hatten lab, and with assistance from Shiaoching Gong of Weill Cornell Medicine, Hanzel spent two years creating a knockout mouse that lacked ASTN2, and then studied the brains and activity of both infant and adult mice.

Behavioral parallels

The knockout mice participated in several noninvasive behavioral experiments to see how they compared to their wild-type nestmates. The knockout mice showed distinctly different characteristics in all of them.

In one experiment, the researchers briefly isolated baby mice, then measured how frequently they called out for their mothers using ultrasonic vocalizations. These sounds are a key part of a mouse’s social behavior and communication, and they’re one of the best proxies researchers have for assessing parallels to human language skills.

The wild-type pups were quick to call for their mothers using complex, pitch-shifting sounds, while the knockout pups gave fewer, shorter calls within a limited pitch range.

Similar communication issues are common in people with ASD, Hanzel says. “It’s one of the most telling characteristics, but it exists along a spectrum,” she says. “Some autistic people don’t understand metaphor, while others echo language they’ve overheard, and still others do not speak at all.”

In another experiment, the researchers tested how the ASTN2 knockout mice interacted with both familiar and unfamiliar mice. The knockouts preferred to interact with a mouse they knew rather than one they didn’t. In contrast, wild-type mice always chose the social novelty of a new face.

This, too, has parallels in human ASD behavior, with a reluctance towards unfamiliar environments and people being common, Hanzel adds. “That’s a very important result, because it shows that mice with the knockout mutation do not like social novelty and prefer to spend time with mice they know, which corresponds to people with ASD, who tend to like new social interactions less than familiar ones.”

In a third experiment, both types of mice were given free rein to explore an open space for an hour. The ASTN2 mice traveled a significantly longer distance than the other mice, and engaged in repetitive behaviors, such as circling in place, 40% more. Both hyperactivity and repetitive behaviors are well-known hallmarks of ASD.

Miscommunication between brain regions

When the researchers analyzed the brains of the ASTN2 knockout mice, they found a few small but apparently potent structural and physiological changes in the cerebellum. One was that large neurons called Purkinje cells had a higher density of dendritic spines, the structures studded with synapses through which neurons receive signals. But they only detected this change in distinct areas of the cerebellum. “For example, we found the biggest difference in the posterior vermis region, where repetitive and inflexible behaviors are controlled,” Hanzel says.

The scientists also found a decrease in the number of immature dendritic spines known as filopodia, and in the volume of Bergmann glial fibers, which help with cell migration.

“The differences are quite subtle, but they are clearly affecting how the mice are behaving,” Hatten says. “The changes are probably altering the communication between the cerebellum and the rest of the brain.”

In the future, the researchers plan to study human cerebellar cells, which they’ve been developing for a half-dozen years from stem cells, as well as cells with ASTN2 mutations that were donated by the family in the 2018 study.

“We’d like to see if we can find parallel differences to what we found in mice in human cells,” Hatten says.

She continues, “We also want to look at the detailed biology of other genes that are associated with autism. There are dozens of them, but there’s no agreed-upon commonality that binds them together. We’re very excited that we’ve been able to show in detail what ASTN2 does, but there are a lot more genes to investigate.”

 

Nitrogen interventions as a key to better health and robust ecosystems



International Institute for Applied Systems Analysis
Image (Fig. 1): An integrated modeling framework evaluates how ambitious nitrogen interventions can reduce ammonia and nitrogen emissions, consequently improve air quality by lowering fine particulate matter and ozone levels, and decrease nitrogen deposition, ultimately benefiting the environment and public health. Credit: Guo Y. et al. (2024) Aspirational Nitrogen Interventions Accelerate Air Pollution Abatement and Ecosystem Protection, Science Advances, doi.org/10.1126/sciadv.ado0112.




The Earth’s nitrogen cycle is among the most heavily exceeded planetary boundaries. Agricultural production and fossil fuel burning release nitrogen pollutants like ammonia (NH3), nitrogen oxides (NOx), and nitrous oxide (N2O), which contribute to air pollution and damage ecosystems. These pollutants harm human health, crops, and ecosystems. Given the growing global energy and food demand, this damage is expected to increase even further.

The potential of nitrogen pollution mitigation technologies and policies – so-called “nitrogen interventions” – for improving air quality and reducing impacts on ecosystems, has been underexplored. A gap exists between traditional nitrogen budget research, which tracks nitrogen flows across air, water, and soil, but lacks detail on biogeochemical transformations, and Earth science research, which models these transformations but usually focuses on a single environmental medium.  

To address this knowledge gap, an international research team combined multidisciplinary methods to evaluate how nitrogen interventions could improve air quality and reduce nitrogen deposition. Their study, published in Science Advances, found that interventions such as improving fuel combustion conditions, increasing agricultural nitrogen use efficiency, and reducing food loss and waste could significantly lower premature deaths attributed to air pollution, crop losses, and ecosystem risks. While nitrogen management is typically considered for individual goals like air or water quality, recognizing its broad co-benefits is critical for future policy design and effective pollution control.

“We established an integrated assessment framework combining future nitrogen policy scenarios with integrated assessment models, air quality models and dose-response relationships to assess how ambitious measures can reduce air pollution and ecosystems damage at detailed geographic levels,” explains lead author Yixin Guo, a postdoctoral researcher jointly appointed by Peking University and IIASA.
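To give a concrete flavor of the dose-response step that such a framework chains onto emissions and air-quality modeling, here is a simplified log-linear concentration-response sketch; the functional form is a standard epidemiological device, and all parameter values below are hypothetical rather than taken from the study:

```python
# Simplified log-linear concentration-response sketch; not the study's
# actual framework, and all parameter values are hypothetical.
import math

def deaths_avoided(baseline_deaths: float, beta: float,
                   pm25_before: float, pm25_after: float) -> float:
    """Premature deaths avoided when PM2.5 falls, via attributable fractions."""
    def paf(conc: float) -> float:
        # Population attributable fraction under relative risk exp(beta * conc).
        return 1.0 - math.exp(-beta * conc)
    return baseline_deaths * (paf(pm25_before) - paf(pm25_after))

# Hypothetical region: 200,000 baseline deaths; PM2.5 cut from 35 to 20 ug/m3.
print(f"{deaths_avoided(200_000, 0.008, 35.0, 20.0):,.0f} deaths avoided")
```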

The study shows that by 2050, high-ambition nitrogen interventions could cut global ammonia and nitrogen oxide emissions by 40% and 52%, respectively, compared to 2015 levels. This would reduce air pollution, preventing 817,000 premature deaths, lower ground-level ozone concentrations, and cut crop yield losses. Without these interventions, environmental damage will worsen by 2050, with Africa and Asia affected the most; conversely, if the measures are implemented, Africa and Asia would benefit from them the most.

“We found that nitrogen interventions offer increasing benefits over time, with greater effects by 2050 than by 2030. The biggest reductions in ammonia and nitrogen oxides are expected in East and South Asia, mainly through improved crop practices and technology adoption in industrial sectors. These reductions will help lower air pollution levels, making it easier for many regions to achieve the World Health Organization interim targets. Additionally, as populations grow, the health benefits of these interventions will increase, especially in developing areas,” adds Yixin.

“Our results highlight that nitrogen interventions can significantly help achieve multiple SDGs, including Good Health and Well-being (SDG3), Zero Hunger (SDG2), Responsible Consumption and Production (SDG12), and Life on Land (SDG15),” says Lin Zhang, a study coauthor and tenured associate professor in the Department of Atmospheric and Oceanic Sciences at Peking University.

“This collaborative research shows how IIASA research can be rolled out globally. Solutions for environmental impacts will vary by region, enabling customized policy recommendations, even for complex issues like nitrogen pollution,” concludes Wilfried Winiwarter, a study coauthor and senior researcher in the Pollution Management Research Group of the IIASA Energy, Climate, and Environment Program.   

 

Reference:

Guo Y., Zhao H., Winiwarter W., Chang J., Wang X., Zhou M., Havlik P., Leclere D., Pan D., Kanter D., Zhang L. (2024) Aspirational Nitrogen Interventions Accelerate Air Pollution Abatement and Ecosystem Protection, Science Advances. https://doi.org/10.1126/sciadv.ado0112

 

About IIASA:

The International Institute for Applied Systems Analysis (IIASA) is an international scientific institute that conducts research into the critical issues of global environmental, economic, technological, and social change that we face in the twenty-first century. Our findings provide valuable options to policymakers to shape the future of our changing world. IIASA is independent and funded by prestigious research funding agencies in Africa, the Americas, Asia, and Europe.

 

 

ASU researchers help to control cancer-causing poison in corn



Study introduces X-ray irradiation as sterilization method for contaminated corn



Arizona State University

Image: Hannah Glesener. Credit: The Biodesign Institute at Arizona State University




Corn, a staple food crop consumed by billions of people and animals worldwide, is frequently contaminated by the fungal toxin aflatoxin B1, a highly potent carcinogen produced by the fungus Aspergillus flavus.

Exposure to aflatoxin poses serious health risks to humans and other animals, and presents economic challenges to agricultural industries. However, the highly transmissible nature of the fungus, coupled with the potency of the toxin, can make studying and developing control techniques in a laboratory setting difficult.

In a new study published in the journal Toxins, researchers at Arizona State University and their international colleagues have demonstrated a promising sterilization technique that uses X-ray irradiation to reduce Aspergillus flavus viability in contaminated corn. This method achieves sterilization without degrading the harmful aflatoxin B1 (AFB1) produced by the fungus.

By disabling Aspergillus flavus, the method stops the fungus from spreading spores and producing more aflatoxins. This is crucial for allowing more laboratories to join efforts in fungal toxin prevention and control. Stabilizing toxin levels lets scientists develop and test additional remediation techniques that target aflatoxin degradation without the complication of ongoing fungal growth. Results showed that a small dose of radiation shut down the growth of Aspergillus flavus.

This work is part of a larger effort from Arizona State University researchers and international partners, supported by the National Institutes of Health, to identify low-cost approaches to mitigate aflatoxin transmission and exposure among marginalized communities.

“We have known about aflatoxin since the 1960s, yet it is still a pervasive problem,” says Hannah Glesener, lead author of the new study. “X-ray irradiating naturally contaminated corn is an exciting step that supports our research team’s work on developing solutions for aflatoxin-related challenges, such as chronic malnutrition.”

Glesener is a graduate research assistant in the Biodesign Center for Health Through Microbiomes and a biological design PhD student in ASU’s School for Engineering of Matter, Transport and Energy.

The team is now evaluating household-level cooking strategies for controlling this fungal toxin as well as the role of the human gut microbiome in potentially detoxifying foods prior to absorption into the bloodstream.

The global challenge of mycotoxin contamination

Aflatoxins are a type of mycotoxin, which are naturally occurring toxic compounds produced by molds or fungi that can grow on a variety of crops. Mycotoxins, including aflatoxins, have potent carcinogenic properties.

Aflatoxins are produced by Aspergillus species and are commonly found in crops including corn, cottonseed and nuts, particularly in warm and humid environments where mold thrives. Aflatoxin-producing fungi can contaminate crops at various stages, including in the field, during harvest and while in storage.

Aflatoxin contamination is a significant global concern, especially in humid, tropical and subtropical regions. It is most prevalent in Africa, Asia and parts of South America, where warm conditions can jump-start the growth of Aspergillus species.

An estimated 25% of the world’s food crops are affected by mycotoxins, including aflatoxins, according to the Food and Agriculture Organization of the United Nations. Nigeria, Kenya, India and China are particularly affected due to their climate and agricultural practices.

Dangerous health effects of aflatoxin contamination

Acute aflatoxin poisoning, called aflatoxicosis, can occur when large quantities of contaminated food are consumed. Symptoms include liver damage, nausea, vomiting, abdominal pain and, in severe cases, death.

Aflatoxins are particularly associated with an increased risk of liver cancer. The International Agency for Research on Cancer classifies aflatoxins as Group 1 carcinogens, scientifically proven to cause cancer in humans. Chronic exposure can also lead to stunted growth in children and immunosuppression, increasing susceptibility to infectious diseases.

The World Health Organization estimates that aflatoxins contribute to approximately 5% to 28% of global cases of liver cancer, with the highest burden in sub-Saharan Africa, Southeast Asia and China. Every year, aflatoxin exposure is estimated to cause 25,000 to 155,000 liver cancer deaths worldwide. Further, the effects of aflatoxin exposure, even at lower levels, are more severe in animals.

Climate change is believed to exacerbate the threat posed by aflatoxins by expanding the geographic range of aflatoxin-producing fungi, potentially increasing contamination risks in new regions. Additionally, the economic burden caused by aflatoxin contamination is considerable, particularly in the developing world.

Study overview

The main objective of the study, led by corresponding author and Assistant Research Professor Lee Voth-Gaeddert, was to determine the optimal irradiation dose needed to eliminate fungal viability while preserving aflatoxin B1 concentrations for subsequent detoxification studies.

These results open new avenues for safely handling and researching contaminated food products without compromising the structural and chemical properties essential for scientific analysis. The work could ultimately lead to scalable and effective solutions to mycotoxin contamination applicable across various regions, particularly in developing countries where food safety measures are often limited.

  

Image: Corn at a market in Guatemala. Aflatoxin, produced by the fungus Aspergillus flavus, contaminates corn in many parts of the world, posing a serious risk to public health. Credit: The Biodesign Institute at Arizona State University

 

Why do researchers often prefer safe over risky projects? Explaining risk aversion in science



Reward schemes that motivate effort inherently discourage scientific risk-taking


PLOS




A mathematical framework that builds on the economic theory of hidden-action models provides insight into how the unobservable nature of effort and risk shapes investigators’ research strategies and the incentive structures within which they work, according to a study published August 15th in the open-access journal PLOS Biology by Kevin Gross from North Carolina State University, U.S., and Carl Bergstrom from the University of Washington, U.S. 

Scientific research requires taking risks, as the most cautious approaches are unlikely to lead to the most rapid progress. Yet much funded scientific research plays it safe, and funding agencies bemoan the difficulty of attracting high-risk, high-return research projects. Gross and Bergstrom adapted an economic contracting model to explore how the unobservability of risk and effort discourages risky research.

The model considers a hidden-action problem, in which the scientific community must reward discoveries in a way that encourages effort and risk-taking while simultaneously protecting researchers’ livelihoods against the unpredictability of scientific outcomes. The challenge is that incentives to motivate effort clash with incentives to motivate risk-taking, because a failed project may be evidence of a risky undertaking but could also be the result of simple sloth. As a result, the incentives needed to encourage effort actively discourage risk-taking.
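A toy expected-utility calculation makes the clash concrete. In this illustrative sketch (a construction for this summary, not the authors’ model, with all numbers hypothetical), a risk-averse scientist who is paid a bonus only on success prefers the safe project even though the risky one has the higher expected scientific value:

```python
# Toy illustration of the effort/risk clash; not the authors' model.
import math

def utility(wage: float) -> float:
    return math.sqrt(wage)  # concave utility => risk aversion

BASE_PAY, BONUS = 50.0, 60.0  # reward schedule: bonus paid only on success
EFFORT_COST = 1.0             # disutility of exerting full effort

projects = {
    "safe":  {"p_success": 0.8, "scientific_value": 1.0},
    "risky": {"p_success": 0.3, "scientific_value": 4.0},
}

for name, proj in projects.items():
    p = proj["p_success"]
    eu = p * utility(BASE_PAY + BONUS) + (1 - p) * utility(BASE_PAY) - EFFORT_COST
    ev = p * proj["scientific_value"]  # expected scientific payoff
    print(f"{name:>5}: expected utility = {eu:.2f}, expected value = {ev:.2f}")

# The safe project maximizes the scientist's expected utility (8.80 vs 7.10)
# even though the risky project has the higher expected scientific value
# (1.20 vs 0.80).
```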

Scientists respond by working on safe projects that generate evidence of effort but that don’t move science forward as rapidly as riskier projects would. A social planner who prizes scientific productivity above researchers’ well-being could remedy the problem by rewarding major discoveries richly enough to induce high-risk research, but in doing so would expose scientists to a degree of livelihood risk that ultimately leaves them worse off. Because the scientific community is approximately self-governing and constructs its own reward schedule, the incentives that researchers are willing to impose on themselves are inadequate to motivate the scientific risks that would best expedite scientific progress.

In deciding how to reward discoveries, the scientific community must contend with the fact that reward schemes that motivate effort inherently discourage scientific risk-taking, and vice versa. Because the community must motivate both effort and scientific risk-taking, and because effort is costly to investigators, the community inevitably establishes a tradition that encourages more conservative science than would be optimal for maximizing scientific progress, even when risky research is no more onerous than safer lines of inquiry.

The authors add, “Commentators regularly bemoan the dearth of high-risk, high-return research in science and suppose that this state of affairs is evidence of institutional or personal failings. We argue here that this is not the case; instead, scientists who don't want to gamble with their careers will inevitably choose projects that are safer than scientific funders would prefer.”

#####

In your coverage, please use this URL to provide access to the freely available paper in PLOS Biology: http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3002750

Citation: Gross K, Bergstrom CT (2024) Rationalizing risk aversion in science: Why incentives to work hard clash with incentives to take risks. PLoS Biol 22(8): e3002750. https://doi.org/10.1371/journal.pbio.3002750

Author Countries: United States

Funding: This work was supported by National Science Foundation awards SMA 19-52343 and SMA 19-52069 to KG and CTB, respectively, and by Templeton World Charity Foundation grant 20650 to CTB. No funder played any role in the study design, analysis, decision to publish, or preparation of the manuscript.

 


Four billion people in low- and middle-income countries lack safe drinking water




American Association for the Advancement of Science (AAAS)





More than 4.4 billion people in low- and middle-income countries lack access to safely managed drinking water, with fecal contamination affecting almost half the population of these regions, according to a new geospatial analysis. The findings reveal that previous global estimates greatly understated the number of people without safe drinking water – particularly among some of the most vulnerable populations – and highlight an urgent need for targeted investments to improve water quality monitoring and infrastructure in these regions. Access to safe drinking water is a human right and is central to the United Nations (UN) 2030 Agenda for Sustainable Development. However, data on safely managed drinking water services (SMDWS) are lacking for much of the global population, especially at the subnational scale in low- and middle-income countries (LMICs). What’s more, the primary limiting factors affecting access to safe drinking water are largely unknown.

 

By combining household survey data with global Earth observation data and geospatial modeling techniques, Esther Greenwood and colleagues created detailed maps of SMDWS use across 135 LMICs. Greenwood et al. found that only one in three people in these countries had access to safely managed drinking water in 2020. This leaves an estimated 4.4 billion people in LMICs lacking SMDWS – roughly twice the estimate of 2 billion people in 2020 given by the World Health Organization and United Nations Children’s Fund Joint Monitoring Programme for Water Supply, Sanitation and Hygiene (JMP), the official UN program tasked with monitoring progress towards the Sustainable Development Goal on clean water access. The findings also show that SMDWS use in LMICs is primarily limited by fecal contamination, indicated by E. coli in the primary drinking water source, which affects almost half the population of these regions. Detection of fecal contamination in drinking water is concerning, as ingestion of fecal pathogens is a major public health risk and a driver of young child mortality globally. In a Perspective, Rob Hope discusses the study and its findings in greater detail.

 

Equity weighting increases the social cost of carbon, warrants careful dialogue




American Association for the Advancement of Science (AAAS)





In a Policy Forum, Brian Prest and colleagues discuss how new regulatory guidelines from the U.S. Office of Management and Budget (OMB), known as Circular A-4, could impact the social cost of carbon (SCC). The new equity weighting approach recommended by the OMB, they say, leads to a dramatic increase in SCC estimates, and thus requires careful dialogue and discussion. The social cost of carbon is an estimate of the economic damage caused by emitting an additional ton of carbon dioxide into the atmosphere. It helps guide decisions about balancing the costs of reducing emissions with the benefits of mitigating climate change. Traditionally, the SCC is influenced by the discount rate, which determines how future versus current economic impacts are valued. However, a new method called “equity weighting,” which places greater weight on benefits and costs affecting low-income groups, has been introduced. Using the Greenhouse Gas Impact Value Estimator (GIVE) model – one of the three models used by the Environmental Protection Agency (EPA) to estimate the SCC – Prest et al. assessed the impact of equity weighting on SCC estimates and show that this approach, which was proposed in 2023, increases estimates by nearly a factor of eight. According to the authors, this is because it gives more weight to the impact on poorer populations, who are more vulnerable to climate change. The new guidelines also lowered the discount rate, which results in an even greater SCC. While these policy changes reflect a shift towards considering the ethical implications of climate policies, Prest et al. argue that they have major implications for cost-benefit analyses. “Given its newfound importance,” say the authors, “the proper role of distributional weighting warrants careful and judicious dialogue in the scientific and policy communities.”
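To illustrate the mechanics behind that jump, here is a minimal equity-weighting sketch with hypothetical numbers; this is not the GIVE model, only the weighting idea, in which regional damages are scaled by (reference consumption / regional consumption) raised to an elasticity:

```python
# Minimal sketch of equity weighting with hypothetical numbers; not the
# GIVE model. Damages in poorer regions receive weights greater than one.
regions = {
    # region: (per-capita consumption, $1000s; marginal damages, $/tCO2)
    "high_income":   (45.0, 10.0),
    "middle_income": (12.0, 15.0),
    "low_income":    (3.0, 12.0),
}

ETA = 1.0     # elasticity of marginal utility of consumption
C_REF = 45.0  # reference consumption level (here, the richest region's)

unweighted = sum(d for _, d in regions.values())
weighted = sum((C_REF / c) ** ETA * d for c, d in regions.values())

print(f"unweighted SCC:      ${unweighted:.0f}/tCO2")   # $37/tCO2
print(f"equity-weighted SCC: ${weighted:.0f}/tCO2")     # $246/tCO2
```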