Thursday, September 25, 2025

 

Different color strategies better protect prey, depending on conditions




Summary author: Walter Beckwith



American Association for the Advancement of Science (AAAS)






A global experiment tracking how birds respond to 15,000 paper “moths” reveals that no antipredator color strategy is universally effective; both camouflage and warning coloration succeed under different ecological conditions, the study shows. Predation is a powerful force shaping evolution, driving the development of two major antipredator color strategies: camouflage, which helps prey blend into their surroundings to avoid detection, and aposematism, in which prey advertise genuine defenses – or, in the case of mimics, deceptive protection – using bright, conspicuous warning colors. Both strategies can be effective under different ecological conditions, yet the environmental factors that favor one adaptation over the other are not well understood, nor have the two strategies been evaluated together or at large scales.

 

To address these questions, Iliana Medina and colleagues conducted a large-scale, global field experiment to test how birds respond to different antipredator color strategies. Medina et al. placed 15,018 paper moth models – some camouflaged brown, some with typical orange-black warning colors, and some with unusual turquoise-black warning patterns – across 21 forests on six continents. Each model included a mealworm to attract predators, and its consumption was monitored to record successful predation. According to the findings, no strategy was universally superior; instead, the protective value of each depended on ecological context. Camouflage was highly context-dependent, offering early protection under low light or high predator competition, but it lost effectiveness as predators adapted or as cryptic prey became common. In contrast, warning coloration was generally more reliable, though its success was shaped by predator pressure, the frequency and similarity of warning-colored prey in the community, and other ecological factors, especially at lower latitudes. Medina et al. show that predator competition is the most influential factor for both strategies and suggest that camouflage may be the less stable defense, more susceptible to environmental and human-driven changes. This instability could explain why camouflage is gained and lost more frequently than warning coloration over evolutionary time.
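The release does not include the underlying data, but a minimal sketch of the kind of comparison such a deployment supports might look like the following. The counts, treatment labels, and the simple normal-approximation confidence intervals are purely illustrative assumptions, not the study's results.

```python
# Illustrative only: made-up attack counts for the three model types,
# with simple normal-approximation 95% confidence intervals.
import math

deployments = {
    # model type: (models attacked, models deployed) -- hypothetical numbers
    "camouflaged brown":       (420, 5000),
    "orange-black warning":    (310, 5000),
    "turquoise-black warning": (335, 5018),
}

for model_type, (attacked, total) in deployments.items():
    rate = attacked / total
    se = math.sqrt(rate * (1 - rate) / total)
    low, high = rate - 1.96 * se, rate + 1.96 * se
    print(f"{model_type:>24}: attack rate {rate:.3f} (95% CI {low:.3f}-{high:.3f})")
```

In the actual study, comparisons of this kind were further conditioned on ecological context such as light level and predator competition, which is where the context dependence emerges.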

How research support has helped create life-changing medicines

A new study finds over half the drugs approved this century cite federally funded research in their patents.



Massachusetts Institute of Technology





Gleevec, a cancer drug first approved for sale in 2001, has dramatically changed the lives of people with chronic myeloid leukemia. This form of cancer was once regarded as very difficult to combat, but survival rates of patients who respond to Gleevec now resemble those of the population at large.

Gleevec is also a medicine developed with the help of federally funded research. That support helped scientists better understand how to create drugs targeting the BCR-ABL oncoprotein, the cancer-causing protein behind chronic myeloid leukemia.  

A new study co-authored by MIT researchers quantifies how many such examples of drug development exist. The current administration is proposing a 40 percent budget reduction to the National Institutes of Health (NIH), which sponsors a significant portion of biomedical research. The study finds that over 50 percent of the small-molecule drugs approved this century have patents citing at least one piece of NIH-backed research that would likely have been vulnerable to a funding cut of that size.

“What we found was quite striking,” says MIT economist Danielle Li, co-author of a newly published paper outlining the study’s results. “More than half of the drugs approved by the FDA since 2000 are connected to NIH research that would likely have been cut under a 40 percent budget reduction.”

Or, as the researchers write in the paper: “We found extensive connections between medical advances and research that was funded by grants that would have been cut if the NIH budget was sharply reduced.”

The paper, “What if NIH funding had been 40% smaller?” will be published as a Policy Article in the journal Science. The authors are Pierre Azoulay, the China Program Professor of International Management at the MIT Sloan School of Management; Matthew Clancy, an economist with the group Open Philanthropy; Li, the David Sarnoff Professor of Management of Technology at MIT Sloan; and Bhaven N. Sampat, an economist at Johns Hopkins University. (Biomedical researchers at both MIT and Johns Hopkins could be affected by adjustments to NIH funding.)

To conduct the study, the researchers leveraged the fact that the NIH uses priority lists to determine which projects get funded. That makes it possible to discern which projects fell in the lower 40 percent of NIH-backed projects, priority-wise, for a given period; the researchers call these “at-risk” pieces of research. Applying this approach to data from 1980 through 2007, the scholars examined the patents of the New Molecular Entities – drugs with a new active ingredient – approved by the Food and Drug Administration since 2000, since there is typically a lag of years between academic research and the related drug development it informs.
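As a rough sketch of how an "at-risk" flag like this could be operationalized, the snippet below marks the lowest-priority 40 percent of funded grants in each year as at-risk. The grants, scores, and the cut-by-count rule are invented for illustration and are not the authors' exact procedure.

```python
# Hedged sketch (not the authors' exact method): flag the lowest-priority
# 40 percent of funded grants in a given year as "at-risk", i.e. grants that
# would plausibly have gone unfunded under a 40 percent smaller budget.
# Grants and scores are invented; for NIH priority scores, lower is better.

grants_by_year = {
    1995: [("R01_a", 110), ("R01_b", 145), ("R01_c", 170),
           ("R01_d", 200), ("R01_e", 230)],
}

at_risk = set()
for year, grants in grants_by_year.items():
    ranked = sorted(grants, key=lambda g: g[1])   # best priority score first
    n_cut = round(0.4 * len(ranked))              # bottom 40 percent by count
    at_risk.update(grant_id for grant_id, _ in ranked[-n_cut:])

print(at_risk)   # grants flagged as at-risk under a 40 percent cut
```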

The study focuses on small-molecule drugs — compact organic compounds, often taken orally as medicine — whereas NIH funding supports a wider range of advancements in medicine generally. Based on how many of these FDA-approved small-molecule medicines were linked to at-risk research from the prior period, the researchers estimated what kinds of consequences a 40 percent cut in funding would have generated going forward. 

The study distinguishes between two types of links new drugs have to NIH funding. Some drug patents have what the researchers call “direct” links to NIH-backed projects that generated new findings relevant to the development of those particular drugs. Other patents have “indirect” links to the NIH, when they cite prior NIH-funded studies that contributed to the overall body of knowledge used in drug development.

The analysis finds that 40 of the FDA-approved medications, or 7.1 percent, have direct links to NIH-supported studies cited in their patents. Of these, 14 patents cite at-risk pieces of NIH research.

When it comes to indirect links: of the 557 drugs approved by the FDA from 2000 to 2023, the study found that 59.4 percent have a patent citing at least one NIH-supported research publication, and 51.4 percent cite at least one NIH-funded study from the at-risk category of projects.

“The indirect connection is where we see the real breadth of NIH's impact,” Li says. “What the NIH does is fund research that forms the scientific foundation upon which companies and other drug developers build.”

As the researchers emphasize in the paper, there are many nuances involved in the study. A single citation of an NIH-funded study could appear in a patent for a variety of reasons, and does not necessarily mean “that the drug in question could never have been developed in its absence,” as they write in the paper. To reckon with this, the study also analyzes how many patents had at least 25 percent of their citations fall in the category of at-risk NIH-backed research. By this metric, they found that 65 of the 557 FDA-approved drugs, or 11.7 percent, met the threshold. 
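As a rough sketch of how that 25 percent threshold metric could be computed, the snippet below walks through hypothetical patent-citation records. The data structure, drug names, and entries are invented for illustration and are not the study's dataset.

```python
# Hypothetical sketch: for each drug, compute the share of its patents'
# cited NIH publications that fall in the "at-risk" (lowest-priority 40%)
# group, and flag drugs at or above the 25 percent threshold.
# All records below are invented for illustration.

drug_citations = {
    # drug: list of (cited NIH publication, citation is "at-risk") pairs
    "drug_A": [("pub1", True), ("pub2", False), ("pub3", False), ("pub4", False)],
    "drug_B": [("pub5", True), ("pub6", True), ("pub7", False)],
    "drug_C": [("pub8", False), ("pub9", False)],
}

THRESHOLD = 0.25
flagged = []

for drug, citations in drug_citations.items():
    at_risk_share = sum(flag for _, flag in citations) / len(citations)
    if at_risk_share >= THRESHOLD:
        flagged.append((drug, round(at_risk_share, 2)))

print(flagged)  # drugs whose patents lean heavily on at-risk NIH research
```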

On the other hand, as the researchers state in the paper, it is possible the study “understates the extent to which medical advances are connected to NIH research.” For one thing, as the study’s endpoint for examining NIH data is 2007, there could have been more recent pieces of research informing medications that have already received FDA approval. The study does not quantify “second-order connections,” in which NIH-supported findings may have led to additional research that directly led to drug development. Again, NIH funding also supports a broad range of studies beyond the type examined in the current paper. 

It is also likely, the scholars suggest, that NIH cuts would curtail the careers of many promising scientists and, in so doing, slow down medical progress. For these reasons, in addition to the core data itself, the scholars say the study indicates how broadly NIH-backed research has helped advance medicine.

“The worry is that these kinds of deep cuts to the NIH risk that foundation and therefore endanger the development of medicines that might be used to treat us, or our kids and grandkids, 20 years from now,” Li says. 

Azoulay and Sampat have received past NIH funding. They also serve on an NIH working group about the empirical analysis of the scientific enterprise.

###

Written by Peter Dizikes, MIT News

 

Carbon cycle flaw can plunge Earth into an ice age



Study: how Earth may overcorrect for warming



University of California - Riverside

[Image: Climate overcorrection – computer simulation of Earth's climate evolving over one million years in response to a sudden release of carbon dioxide into the atmosphere. Credit: Andy Ridgwell/UCR]






UC Riverside researchers have discovered a piece that was missing in previous descriptions of the way Earth recycles its carbon. As a result, they believe that global warming can overcorrect into an ice age.  

The traditional view among researchers is that Earth’s climate is kept in check by a slow-moving but reliable natural system of rock weathering. 

In this system, rain captures carbon dioxide from the air, hits exposed rocks on land – especially silicate rocks like granite – and slowly dissolves them. When this captured CO₂ reaches the ocean together with calcium dissolved from the rocks, the two combine to form seashells and limestone reefs, locking the carbon away on the sea floor for many hundreds of millions of years.

“As the planet gets hotter, rocks weather faster and take up more CO₂, cooling the planet back down again,” said Andy Ridgwell, UCR geologist and co-author of the paper published today in Science.

However, geological evidence suggests that ice ages earlier in Earth’s history were so extreme that the entire surface of the planet was covered in snow and ice. Therefore, the researchers say, gentle regulation of planetary temperature cannot be the whole story.

The missing piece also involves carbon burial in the ocean. As CO₂ increases in the atmosphere and the planet warms, more nutrients like phosphorus get washed into the sea. These nutrients fuel the growth of plankton that take in carbon dioxide when they photosynthesize. Then when they die, they sink to the sea floor, taking that carbon with them. 

However, in a warmer world with more algal activity, oceans lose oxygen, causing phosphorus to get recycled instead of buried. This creates a feedback loop where more nutrients in the water create more plankton, whose decay removes even more oxygen, and more nutrients get recycled. At the same time, massive amounts of carbon are buried, and the Earth cools. 

This system doesn’t gently stabilize the climate, but instead overshoots, cooling Earth far below its starting temperature. In the study’s computer model, this could trigger an ice age.
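To make the overshoot intuition concrete, here is a minimal toy sketch, not the authors' Earth-system model: a CO₂ anomaly (standing in for warming) is drawn down both by fast weathering and by an extra carbon-burial flux that builds up slowly while the planet stays warm. Because the burial feedback lags, the system swings past equilibrium. All variable names and rate constants are invented for illustration.

```python
# Toy sketch only -- NOT the authors' published model. The delayed burial
# feedback makes the CO2 anomaly dip below zero (a cooler-than-start state)
# before recovering, i.e. an overshoot.
import numpy as np

dt, n_steps = 100.0, 10_000            # 100-year steps, ~1 million years
co2 = np.zeros(n_steps)                # CO2 anomaly (arbitrary units)
burial = np.zeros(n_steps)             # extra carbon-burial flux (per year)
co2[0] = 1.0                           # sudden CO2 release at t = 0

k_weathering = 1e-5                    # fast rock-weathering feedback (1/yr)
k_burial_gain = 1e-9                   # burial flux grows while CO2 is high
k_burial_relax = 1e-5                  # burial flux slowly returns to normal

for t in range(1, n_steps):
    removal = k_weathering * co2[t-1] + burial[t-1]
    co2[t] = co2[t-1] - removal * dt
    burial[t] = burial[t-1] + (k_burial_gain * co2[t-1]
                               - k_burial_relax * burial[t-1]) * dt

print("minimum CO2 anomaly:", round(co2.min(), 3))   # negative: an overshoot
```

The point of the sketch is only the shape of the response: a fast damping term alone relaxes smoothly, whereas adding a slow, lagging removal term produces the pendulum-like swing past the starting point that the study describes.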

Ridgwell compares all this to a thermostat working overtime to cool a house. 

“In summer, you set your thermostat around 78F. As the air temperature climbs outside during the day, the air conditioning removes the excess heat inside until the room temperature comes down to 78 and then it stops,” Ridgwell said. 

Using his analogy, Earth’s thermostat isn’t broken, but Ridgwell suggests it might not be in the same room as the air conditioning unit, making performance uneven. 

In the study, lower atmospheric oxygen in the geological past made the thermostat much more erratic, which explains those ancient extreme ice ages.

As humans add more CO₂ to the atmosphere today, the planet will continue to warm in the short term. The authors’ model predicts that a cooling overshoot will eventually follow. However, this overshoot will likely be milder than those of the distant past, because there is more oxygen in the atmosphere now, which dampens the nutrient feedback.

“Like placing the thermostat closer to the AC unit,” Ridgwell added. Still, it could be enough to bring forward the start of the next ice age. 

“At the end of the day, does it matter much if the start of the next ice age is 50, 100, or 200 thousand years into the future?” Ridgwell wondered. “We need to focus now on limiting ongoing warming. That the Earth will eventually cool back down, in however wobbly a way, is not going to happen fast enough to help us out in this lifetime.”

 

 

Capturing 100 years of antibiotic resistance evolution




Wellcome Trust Sanger Institute




The genetic culprits responsible for the spread of multidrug resistance (MDR) in bacteria have been identified by new research mapping 100 years of bacterial evolution.

Experts at the Wellcome Sanger Institute, the University of Bath, the UK Health Security Agency (UKHSA) and their collaborators1 analysed over 40,000 plasmids from historical and present-day bacterial samples taken across six continents – the largest dataset of its kind.

Plasmids are transferable structures in bacteria that allow different strains to share genetic information. In this study, published today (25 September) in Science, researchers found that a minority of plasmids causes most of the multidrug resistance in the world. In the future, developing ways to target these could lead to new therapies to combat treatment-resistant infections worldwide.

Currently, treatment-resistant infections cause at least one million deaths worldwide every year, with this number expected to rise2. While some bacteria and fungi carry antimicrobial resistance (AMR) genes naturally, the emergence and spread of MDR and AMR genes has been consistently linked to the use of antibiotics. 

In this study, researchers were able to analyse bacterial samples from as far back as 1917, a time before the discovery of antibiotics. The team found that the ancestral plasmids that went on to become global spreaders of AMR genes did not contain resistance genes initially. They evolved to gain this resistance as the use of antibiotics grew. Their descendants, a relatively small group of modern plasmids, now confer resistance to both first-line and last-resort antibiotics, making them a huge threat to human health.

By diving into the last 100 years of bacterial history, the team also developed a model for plasmid evolution, highlighting the three distinct pathways plasmids can take.

The modern plasmids that contain MDR genes arose via two of these pathways: either AMR genes were inserted into an existing plasmid structure, or the plasmid resulted from fusion with another plasmid. The plasmids formed by fusion are also highly transferable between different species of bacteria.

As these plasmids are found across multiple different bacterial species, developing ways to target the AMR-carrying plasmids could lead to new therapies to combat multiple treatment-resistant infections worldwide.

Additionally, the model developed to explain the last 100 years of plasmid evolution could help predict how the next 100 years may unfold. This could aid in anticipating outbreaks and patterns of infectious disease and inform public health strategies to stop its spread.

Dr Adrian Cazares, first author at the Wellcome Sanger Institute, said: “Plasmids are key drivers of bacterial survival against antibiotics. By going back in time through unique historical collections, we reveal how plasmids adapted to the antibiotic era and the evolutionary journey that transformed a minority of them into the global multidrug-resistant vectors driving the antimicrobial resistance crisis we face today. Our findings show that the widespread use of antibiotics fundamentally reshaped the genetic landscape of plasmids, and the way bacteria fight against antimicrobial treatment, and are a stark reminder that our actions have a profound and lasting impact on bacterial evolution.”

Professor Zamin Iqbal, co-senior author at the Milner Centre for Evolution, University of Bath, said: “Our research combines historical and modern-day data to give a new perspective on the evolutionary lifestyles that different plasmids can have, whether this be changing slowly, completely merging with other plasmids, or disappearing and leaving genetic fragments that are then ‘recycled for parts’. Through our research, we can see the impact of human antibiotic use and the shadow of other selective forces, which also challenge bacteria and their plasmids.”

Dr Sarah Alexander, UKHSA lead for the study, said: “This groundbreaking study builds on a longstanding and successful collaboration between The National Collection of Type Cultures (NCTC) and the Wellcome Sanger Institute. The generous donation of the Murray Collection to the NCTC enabled us to conserve these irreplaceable bacterial strains. DNA analysis of our strains has shown how NCTC’s stringent preservation methods have ensured the bacteria have not changed since they were deposited, even over many decades. These exciting findings help us to better understand the origins of antimicrobial resistance and could play a role in helping us tackle the global spread of resistant infections over years to come.”

Professor Nick Thomson, co-senior author at the Wellcome Sanger Institute, said: “This research builds on the rich and leading role the UK has played in the discovery and development of antibiotics for the treatment of human and animal infections. Here, we combined expertise across institutes, teaming up with the longest-running publicly owned biological archive in the world. Thanks to this, we were able to use unique historical samples to glimpse at the molecular events behind one of the most significant challenges facing us today — untreatable infections. By uncovering the rules of plasmid evolution, it could be possible, in the future, to develop rational approaches to target the spread of antibiotic resistance genes, the consequences of which touch all our lives at one point or another.”

ENDS

Contact details:

Rachael Smith

Press Office

Wellcome Sanger Institute

Cambridge, CB10 1SA

07827979492

Email: press.office@sanger.ac.uk 

Notes to Editors:

  1. This was a highly collaborative project including experts at the Wellcome Sanger Institute, EMBL’s European Bioinformatics Institute (EMBL-EBI), the University of Cambridge, the University of Oxford, the UK Health Security Agency, the University of Bath, and the London School of Hygiene and Tropical Medicine. A full author and affiliations list can be found in the publication.
  2. Naghavi, Mohsen et al. (2024) ‘Global burden of bacterial antimicrobial resistance 1990–2021: a systematic analysis with forecasts to 2050’. The Lancet. Available at: https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(24)01867-1/fulltext 

Publication:

A. Cazares, W. Figueroa, D. Cazares, et al. (2025) ‘Pre and Post antibiotic epoch: the historical spread of antimicrobial resistance’. Science. DOI: 10.1126/science.adr1522

Funding:

This research was part-funded by Wellcome. A full acknowledgement list can be found in the publication.

Selected websites:

About the University of Bath

The University of Bath is one of the UK's leading universities, recognised for high-impact research, excellence in education, an outstanding student experience and strong graduate prospects.

  • We are ranked among the top 10% of universities globally, placing 132nd in the QS World University Rankings 2026.
  • We are ranked in the top 10 in all of the UK’s major university guides.
  • The University achieved a triple Gold award in the last Teaching Excellence Framework 2023, the highest awards possible, for both the overall assessment and for student outcomes and student experience. The Teaching Excellence Framework (TEF) is a national scheme run by the Office for Students (OfS).
  • We are The Times and The Sunday Times Sport University of the Year 2026

Research from Bath is helping to change the world for the better. Across the University’s three Faculties and School of Management, our research is making an impact in society, leading to low-carbon living, positive digital futures, and improved health and wellbeing. Find out all about our Research with Impact: https://www.bath.ac.uk/campaigns/research-with-impact/

The Wellcome Sanger Institute

The Wellcome Sanger Institute is a world leader in genomics research. We apply and explore genomic technologies at scale to advance understanding of biology and improve health. Making discoveries not easily made elsewhere, our research delivers insights across health, disease, evolution and pathogen biology. We are open and collaborative; our data, results, tools, technologies and training are freely shared across the globe to advance science.

Funded by Wellcome, we have the freedom to think long-term and push the boundaries of genomics. We take on the challenges of applying our research to the real world, where we aim to bring benefit to people and society.

Find out more at www.sanger.ac.uk or follow us on Twitter, Instagram, Facebook, LinkedIn and on our Blog.

About Wellcome

Wellcome supports science to solve the urgent health challenges facing everyone. We support discovery research into life, health and wellbeing, and we’re taking on three worldwide health challenges: mental health, infectious disease and climate and health. https://wellcome.org/ 

 

Proven quantum advantage: Researchers cut the time for a learning task from 20 million years to 15 minutes


Amid high expectations for quantum technology, a new paper in Science reports proven quantum advantage. In an experiment, entangled light lets researchers learn a system's noise with very few measurements




Technical University of Denmark

[Image: Squeezer 1 – the squeezer, an optical parametric oscillator (OPO) that uses a nonlinear crystal inside an optical cavity to manipulate the quantum fluctuations of light, is responsible for the entanglement. Photo: Jonas Schou Neergaard-Nielsen]






Researchers at the Technical University of Denmark (DTU) and international partners have demonstrated that entangled light can cut the number of measurements needed to learn the behaviour of a complex, noisy quantum system by an enormous factor.

"This is the first proven quantum advantage for a photonic system," says corresponding author Ulrik Lund Andersen, a professor at DTU Physics.

"Knowing that such an advantage is possible with a straightforward optical setup should help others look for areas where this approach would pay off, such as sensing and machine learning."

The work appears in Science under the title 'Quantum learning advantage on a scalable photonic platform' and was carried out in collaboration with colleagues from the US, Canada, and South Korea.

Entanglement is key

At the heart of the study is a problem that shows up across science and engineering: when you want to understand or characterise a physical system, such as a device, you make repeated measurements and use them to work out, for instance, the device's "noise fingerprint".

In quantum devices, however, it is not as straightforward. For one, quantum noise is part of the measurements. Also, the number of experiments required for complex systems can scale exponentially with the system's size, so it quickly becomes impractical or even impossible. The researchers set out to find another way using entangled light.

Entanglement is a key concept in quantum mechanics where two particles or light beams are so strongly linked that measuring one instantly tells you something about the other.

"We built a process we could control and asked a simple question: Does entanglement reduce the number of measurements you need to learn such a system? And the answer is yes, by a lot. We learned the behaviour of our system in 15 minutes, while a comparable classical approach would take around 20 million years," says Ulrik Lund Andersen.

Something no classical system can do

After laying the theoretical groundwork in the 2024 paper 'Entanglement-Enabled Advantage for Learning a Bosonic Random Displacement Channel', the researchers knew that entangled light would likely solve the issue.

The experiment was set up in the basement at DTU Physics and runs at telecom wavelengths with well-known optical parts. It works even with ordinary losses in the setup. That matters, the researchers say, because it shows that the gain comes from how you measure, not from having a perfect measuring device.

In more detail, the system consisted of an optical channel in which multiple light pulses shared the same noise pattern. Two beams of light were prepared – or, more precisely, squeezed – so that they became entangled. One beam probed the system; the other served as a reference. A joint measurement compared them in a single shot, and that comparison cancels much of the measurement noise, pulling out more information per trial than examining the probe alone.
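The following is a purely classical analogy, not the entangled-light protocol and not a demonstration of quantum advantage: it only illustrates the general idea that a reference sharing the probe's dominant noise lets a differential measurement pin down a small shift with far fewer trials. The signal values, noise levels, and trial counts are invented for illustration.

```python
# Classical common-mode-rejection analogy only -- NOT the quantum protocol.
# A reference channel carrying the same dominant noise as the probe lets the
# difference measurement estimate a small shift far more precisely per trial.
import numpy as np

rng = np.random.default_rng(0)
true_shift = 0.05          # hypothetical quantity to learn
shared_noise_std = 1.0     # large noise common to probe and reference
readout_noise_std = 0.1    # small independent noise on each detector

def estimate(n_trials, use_reference):
    shared = rng.normal(0.0, shared_noise_std, n_trials)
    probe = true_shift + shared + rng.normal(0.0, readout_noise_std, n_trials)
    if use_reference:
        reference = shared + rng.normal(0.0, readout_noise_std, n_trials)
        samples = probe - reference      # the shared noise cancels
    else:
        samples = probe
    return samples.mean(), samples.std(ddof=1) / np.sqrt(n_trials)

for flag in (False, True):
    mean, err = estimate(10_000, use_reference=flag)
    label = "with reference" if flag else "probe alone   "
    print(f"{label}: estimate {mean:+.4f} +/- {err:.4f}")
```

In the experiment itself, the "shared noise" is quantum fluctuation and the cancellation comes from entanglement and a joint measurement, which is what makes the reduction in required measurements provably beyond any classical strategy.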

Jonas Schou Neergaard‑Nielsen, an associate professor at DTU Physics and co-author of the paper, stresses that the researchers have not targeted a concrete real-world system yet:

"Even though a lot of people are talking about quantum technology and how they outperform classical computers, the fact remains that today, they don't. So, what satisfies us is primarily that we have finally found a quantum mechanical system that does something no classical system will ever be able to do."

Fact box

The paper 'Quantum learning advantage on a scalable photonic platform' is published in Science on 25 September 2025.

The research project was led by DTU's bigQ centre, headed by Ulrik Lund Andersen, with Jonas Schou Neergaard‑Nielsen as a co-PI.

The paper's lead authors are postdoc Zhenghao Liu and Ph.D. student Romain Brunel, who are also from DTU's bigQ centre and DTU Physics.

Apart from DTU, the partners behind the paper are researchers from the University of Chicago, Perimeter Institute, University of Waterloo, Caltech, MIT, and KAIST.
