Thursday, May 22, 2025

 

Ancient remains reveal how a pathogen began to use lice – not ticks – to infect humans



Summary author: Walter Beckwith


American Association for the Advancement of Science (AAAS)




Most relapsing fever bacteria that infect humans are spread by ticks, but Borrelia recurrentis is unique in being transmitted between humans via body lice. Now, new genomic evidence from ancient British remains suggests that B. recurrentis diverged from its tick-borne relatives and began adapting to transmission by lice between 6,000 and 4,000 years ago – coinciding with the widespread use of wool textiles by humans. The findings underscore how ancient DNA can illuminate the origins and evolution of infectious diseases, and how pathogens like B. recurrentis have been shaped by human social transformations. Several pathogenic bacterial species that once relied on ticks for transmission have independently evolved to use lice as vectors instead, including B. recurrentis, which has no known animal reservoir. Moreover, this louse-adapted pathogen tends to show higher virulence than its tick-borne relatives, suggesting a process of specialization. However, the precise timeline and genetic mechanisms behind its adaptation to lice and its increased virulence in humans remain uncertain.

 

Using advanced ancient DNA techniques optimized for degraded genetic material, Pooja Swali and colleagues recovered and analyzed four ancient B. recurrentis genomes from human remains in Britain, dating from roughly 2300 to 600 years ago. Through phylogenetic and pan-genome analysis, Swali et al. estimate that B. recurrentis diverged from its closest relative, B. duttonii, approximately 4,700–5,600 years ago. This period coincided with shifts in human behavior during the Neolithic-Bronze Age transition, such as the rise of sedentary lifestyles, the advent of wool textiles, and densely populated settlements. According to the authors, these changes may have facilitated the adaptation of B. recurrentis to the human body louse and also promoted genome reduction and host specialization. Over time, B. recurrentis underwent substantial genome reduction, particularly in plasmid-encoded genes. These genetic changes were accompanied by gains and losses of surface proteins that help the bacterium evade the host immune system.

Ancient DNA used to map evolution of fever-causing bacteria





The Francis Crick Institute

Image: Aerial shot of the medieval chapel in Poulton, Cheshire. Credit: Steve Potvin





Researchers at the Francis Crick Institute and UCL have analysed ancient DNA from Borrelia recurrentis, a type of bacteria that causes relapsing fever, pinpointing when it evolved to spread through lice rather than ticks, and how it gained and lost genes in the process.

This transition may have coincided with changes in human lifestyles, like living closer together and the beginning of the wool trade.

Borrelia recurrentis bacteria cause relapsing fever, an illness with many recurring episodes of fever, which is typically found today in areas with poor sanitation or overcrowding, such as refugee camps. It is a distant cousin of the bacteria that today cause Lyme disease.

Historical records in Britain have referred to periods of a ‘sweating sickness’ or ‘epidemic fever’ which may have been caused by B. recurrentis, but limited data means the likely cause of these outbreaks remains unknown.

Only three known bacterial species, including B. recurrentis, have transitioned from being carried primarily by ticks to being carried by lice, changing the potential severity of the disease. Until now it was unknown when B. recurrentis made the jump from ticks to lice and what impact this had on disease transmission and severity in humans.

In research published today in Science, the scientists sequenced whole genomes from four samples of B. recurrentis. Their samples, which range from 2,300 to 600 years old, include the oldest B. recurrentis genome sequenced to date. These ancient samples were obtained from the skeletons of people who were infected hundreds of years ago. The DNA is a shadow of the bacteria that once circulated in their blood and has been captured in bones and teeth.

The individuals’ teeth contained traces of B. recurrentis DNA. Two samples had relatively high amounts of the pathogen, suggesting these individuals may have died from a severe, acute infection, or that the DNA was particularly well preserved.

Becoming adapted to the human louse

The researchers compared the ancient genomes with modern-day B. recurrentis to map how the bacterium has changed over time, finding that the species likely diverged from its nearest tick-borne cousin, B. duttonii, about 6,000 to 4,000 years ago.

They compared the B. recurrentis genomes with B. duttonii, finding that much of the genome was lost during the tick-to-louse transition, but that new genes were also gained over time. These genetic changes affected the bacterium’s ability to hide from the immune system and to share DNA with neighbouring bacteria, suggesting B. recurrentis had specialised to survive within the human louse.

The perfect conditions

Based on these ancient and modern genomes, the divergence from the bacteria’s tick-borne ancestor happened during the transition from the Neolithic period to the Early Bronze Age. This was a time of change in human lifestyles, as people began to domesticate animals and live in more dense settlements. This may have helped B. recurrentis spread from person to person more easily.

The researchers also raise the possibility that the development of sheep farming for wool at this time may have given an advantage to louse-borne pathogens, as wool has better conditions for lice to lay eggs.

They conclude that the evolution of B. recurrentis highlights that a combination of genetic and environmental changes can help pathogens spread and infect populations more easily.

Pooja Swali, Research Fellow at UCL, former Crick PhD student and first author, said: “Louse-borne relapsing fever is a neglected disease with limited modern genomes, making it difficult to study its diversity. Adding four ancient B. recurrentis genomes to the mix has allowed us to create an evolutionary time series and shed light on how the genetics of the bacteria have changed over time. Although there’s a trend towards genome decay as it adapted to the human louse vector, we’ve shown that the evolution of B. recurrentis was dynamic until about 1,000 years ago, when it looks similar to present-day genomes.”

Pontus Skoglund, Group Leader of the Ancient Genomics Laboratory at the Crick, and co-senior author, said: “Ancient DNA can enhance our understanding of significant but understudied diseases like relapsing fever. Understanding how bacteria such as B. recurrentis became more severe in the past may help us understand how diseases could change in the future. The time points we’ve identified suggest that changes in human societies, such as new clothing material or living in larger groups, may have allowed B. recurrentis to jump vectors and become more lethal, an example of how pathogens and humans have co-evolved.”

Lucy van Dorp, Group Leader at UCL, and co-senior author, said: “Genetic analysis of these infections in ancient humans has allowed us to directly track how B. recurrentis has juggled loss and gain of genes during its evolution. Its ability to spread and cause disease appears to be context-dependent, with ancient DNA allowing us to speculate on the important role of past human interactions and behaviour in creating conditions conducive to disease spread. More samples will help us to narrow down the events which led to this tick-to-louse transition and the genetic mechanisms which have helped the bacteria thrive using either vector.”

-ENDS-

For further information, contact: press@crick.ac.uk or +44 (0)20 3796 5252

Notes to Editors

Reference: Swali, P. et al. (2025). Ancient Borrelia genomes document the evolutionary history of louse-borne relapsing fever. Science. 10.1126/science.adr2147.

The four Borrelia recurrentis genomes were sequenced from samples from different time points across England:

  1. A female skeleton in Wetwang Slack, an Iron Age barrow cemetery in East Yorkshire.
  2. A human jawbone in Fishmonger’s Swallet, an Iron Age cave in South Gloucestershire.
  3. A tooth from a cranium in an Augustinian cemetery in late medieval Canterbury.
  4. A tooth from an adult male buried in a medieval chapel in Poulton, Cheshire.

The Francis Crick Institute is a biomedical discovery institute with the mission of understanding the fundamental biology underlying health and disease. Its work helps improve our understanding of why disease develops which promotes discoveries into new ways to prevent, diagnose and treat disease.

An independent organisation, its founding partners are the Medical Research Council (MRC), Cancer Research UK, Wellcome, UCL (University College London), Imperial College London and King’s College London.

The Crick was formed in 2015, and in 2016 it moved into a brand new state-of-the-art building in central London which brings together 1500 scientists and support staff working collaboratively across disciplines, making it the biggest biomedical research facility under a single roof in Europe.

http://crick.ac.uk/

About UCL – London’s Global University

UCL is a diverse global community of world-class academics, students, industry links, external partners, and alumni. Our powerful collective of individuals and institutions work together to explore new possibilities.

Since 1826, we have championed independent thought by attracting and nurturing the world's best minds. Our community of more than 50,000 students from 150 countries and over 16,000 staff pursues academic excellence, breaks boundaries and makes a positive impact on real world problems.

The Times and Sunday Times University of the Year 2024, we are consistently ranked among the top 10 universities in the world and are one of only a handful of institutions rated as having the strongest academic reputation and the broadest research impact.

We have a progressive and integrated approach to our teaching and research – championing innovation, creativity and cross-disciplinary working. We teach our students how to think, not what to think, and see them as partners, collaborators and contributors.

For almost 200 years, we have been proud to open higher education to students from a wide range of backgrounds and to change the way we create and share knowledge.

We were the first in England to welcome women to university education and that courageous attitude and disruptive spirit is still alive today. We are UCL.

www.ucl.ac.uk | Read news at www.ucl.ac.uk/news/ | Find out what’s on at UCL Minds

Image: Richard Madgwick (left), Jack Randell (middle) and Jessica Peto (right) in the Bone Idle Chamber of Fishmongers Swallet. Credit: Adelle Bricking

Image: Map showing where the four Borrelia recurrentis genomes were sampled from and during which time period. Credit: Pooja Swali, adapted from Swali, P. (2025). Science

 

New standards in nuclear physics



Paul Scherrer Institute
Image: PSI physicist Aldo Antognini is pleased that he and his team, within an international collaboration, have achieved yet another fundamental result in atomic physics. Credit: © Scanderbeg Sauer Photography






An international research team led by the Paul Scherrer Institute PSI has measured the radius of the nucleus of muonic helium-3 with unprecedented precision. The results are an important stress test for theories and future experiments in atomic physics.

1.97007 femtometres (a femtometre is a quadrillionth of a metre): that’s how unimaginably tiny the radius of the atomic nucleus of helium-3 is. This is the result of an experiment at PSI that has now been published in the journal Science. More than 40 researchers from international institutes collaborated to develop and implement a method that enables measurements with unprecedented precision. This sets new standards for theories and further experiments in nuclear and atomic physics.

This demanding experiment is only possible with the help of PSI’s proton accelerator facility. There Aldo Antognini’s team generates so-called muonic helium-3, in which the two electrons of the helium atom are replaced by an elementary particle called a muon. This allows the nuclear radius to be determined with high precision. With the measurement of helium-3, the experiments on light muonic atoms have now been completed for the time being. The researchers had previously measured muonic helium-4 and, a few years ago, the atomic nucleus of muonic hydrogen and deuterium.

Muonic helium-3: Twice as slimmed-down

Helium-3 is the lighter cousin of ordinary helium, helium-4. Its atomic nucleus has two protons and two neutrons (hence the 4 after the abbreviation for the element); in helium-3, one of the neutrons is missing. The simplicity of this slimmed-down atomic nucleus is very interesting to Aldo Antognini and other physicists. The helium-3 that PSI physicist and ETH Zurich professor Antognini is using in the current experiment lacks not only a neutron in the nucleus, but also both electrons that orbit this nucleus. The physicists replace the electrons with a negatively charged muon – hence the name muonic helium-3. The muon is around 200 times heavier than an electron and gets much closer to the nucleus. Thus the nucleus and the muon “sense” each other much more intensely, and their wave functions overlap more strongly, as physicists say. That makes the muon the perfect probe for measuring the nucleus and its charge radius, which indicates the region over which the positive charge of the nucleus is distributed. Ideal for the researchers: this charge radius does not change when the electrons are replaced by a muon.
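The scaling behind this sensitivity can be sketched with the textbook Bohr model (a rough approximation; the actual analysis relies on full QED calculations): the orbit radius is inversely proportional to the mass of the orbiting particle, and the probability of finding the particle at the nucleus grows as the inverse cube of that radius.

```latex
% Bohr-model estimate (reduced-mass and QED corrections neglected)
a \;=\; \frac{\hbar}{m\,c\,\alpha\,Z},
\qquad
\frac{a_\mu}{a_e} \;=\; \frac{m_e}{m_\mu} \;\approx\; \frac{1}{207},
\qquad
\frac{|\psi_\mu(0)|^2}{|\psi_e(0)|^2}
  \;=\; \left(\frac{a_e}{a_\mu}\right)^{\!3}
  \;\approx\; 207^{3} \;\sim\; 10^{7}
```

This roughly ten-million-fold increase in overlap with the nucleus is why the energy levels of a muonic atom are so much more sensitive to the finite size of the nuclear charge distribution than those of an ordinary atom.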

Antognini has experience in measuring muonic atoms. A few years ago, he carried out the same experiment with muonic hydrogen, which contains only one proton in the nucleus and whose one electron was replaced by a negatively charged muon. The results caused quite a commotion at the time, because the deviation from measurements based on other methods was surprisingly large. Some critics even considered them wrong. It has now been confirmed many times over: The results were correct.

Globally unique facility enables the experiments

This time Antognini will not need to exercise as much persuasive power. For one thing, he has established himself as the leading expert in this area of research. Another factor is that there was no big surprise this time. The current results from muonic helium-3 fit well with those from previous experiments in which other methods were used. However, the PSI team’s measurements are around 15 times more precise.

Negatively charged muons, and plenty of them, are the most important ingredient for the experiment. These must have a very low energy – that is, they must be very slow, at least by the standards of particle physics. At PSI, around 500 muons per second with energies of one kiloelectron-volt can be generated. This makes the PSI proton accelerator facility, with its beamline developed in-house, the only one in the world that can deliver such slow negative muons in such large numbers.

Laser developed in-house was crucial for success

A crucial share of the success is due to a laser system that the researchers developed themselves. The challenge is that the laser must fire immediately when a muon flies into the experimental setup. To make this possible, Antognini and his team installed an extremely thin foil detector in front of the evacuated experimental chamber. It detects when a muon passes through the foil and signals the laser to emit a pulse of light immediately and at full power. The researchers determine the charge radius indirectly by measuring the frequency of the laser light. When the laser frequency precisely matches the resonance of a specific atomic transition, the muonic atom is briefly excited to a higher energy state before decaying to the ground state within picoseconds, emitting a photon in the form of an X-ray. Finding the resonance frequency at which this transition occurs requires a lot of patience, but the reward is an extremely accurate value for the charge radius of the nucleus.
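The search for the resonance can be caricatured in a few lines of code. This is a deliberately simplified sketch: the frequencies, linewidth and noise level below are invented, and the real analysis fits the full resonance line shape rather than picking the largest data point. But it shows the idea of stepping the laser frequency across a window and locating the signal maximum:

```python
import random

def lorentzian(f, f0, gamma):
    """Lorentzian line shape centred at f0 with full width gamma."""
    return 1.0 / (1.0 + ((f - f0) / (gamma / 2.0)) ** 2)

random.seed(42)
f_true = 312.8   # hypothetical resonance frequency (arbitrary THz value)
gamma = 0.3      # hypothetical linewidth

# Step the "laser" across a scan window and record a noisy X-ray count rate.
frequencies = [310.0 + 0.05 * i for i in range(120)]
scan = [(f, lorentzian(f, f_true, gamma) + random.gauss(0.0, 0.02))
        for f in frequencies]

# Crude estimate: take the frequency with the strongest recorded signal.
f_est = max(scan, key=lambda point: point[1])[0]
print(f"estimated resonance: {f_est:.2f} (true value: {f_true})")
```

In the actual experiment each frequency step costs precious muon events, which is why the scan demands so much patience; once the resonance frequency is pinned down, theory relates it to the nuclear charge radius.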

New benchmark for theoretical modelling

The charge radii obtained from muonic helium-3 and helium-4 serve as important reference values for modern ab initio theories — that is, physical models that calculate the properties of complex physical systems directly from the fundamental laws of physics, without resorting to experimental data. In the context of nuclear physics, these models offer detailed insights into the structure of light atomic nuclei and the forces between their building blocks, the protons and neutrons.

Precise knowledge of these nuclear radii is also crucial for comparisons with ongoing experiments on conventional helium ions with one electron and on neutral helium atoms with two electrons. Such comparisons provide stringent tests of quantum electrodynamics (QED) in few-body systems – the fundamental theory that describes how charged particles interact through the exchange of photons. They allow researchers to test the predictive power of our most fundamental understanding of atomic structure. These efforts could lead to new insights into QED for bound systems – that is, systems such as atoms, in which particles are not free but bound to each other by forces – or perhaps even to indications of physical effects beyond the Standard Model of particle physics.

Follow-up experiments are currently being conducted by research teams in Amsterdam, Garching, and China, as well as in Switzerland by the Molecular Physics and Spectroscopy group led by Frédéric Merkt at ETH Zurich.

Antognini also has additional ideas for future experiments aimed at testing the theories of atomic and nuclear physics with even greater precision. One idea is to measure hyperfine splitting in muonic atoms. This refers to energy transitions between split energy levels that reveal deeper details about effects in the atomic nucleus that involve spin and magnetism. An experiment with muonic hydrogen is currently being prepared, and an experiment with muonic helium is planned. “Many people who work in nuclear physics are very interested in it and are eagerly awaiting our results,” Antognini says. But the energy density of the laser must be increased significantly, which will require an enormous advance in laser technology. This development is currently under way at PSI and ETH Zurich.

Text: Bernd Müller

Why Europe’s fisheries management needs a rethink


GEOMAR researchers identify systemic weaknesses in EU fisheries management and are calling for quotas to be set independently of national interests


Helmholtz Centre for Ocean Research Kiel (GEOMAR)




As legally required by the European Union, sustainable fisheries may not extract more fish than can regrow each year. Yet, about 70 per cent of commercially targeted fish stocks in northern EU waters are either overfished, have shrunken population sizes or have collapsed entirely. So why does the EU continue to miss its sustainable fisheries targets, despite a wealth of scientific data and policy instruments? Researchers at GEOMAR Helmholtz Centre for Ocean Research Kiel and Kiel University examined this question using the well-explored seas of northern Europe as a case study, with a particular focus on the western Baltic Sea. Their analysis is published in Science today.

“We analysed the problems and concluded that they are driven by short-sighted national calls for higher, unsustainable catches, compromising all levels of decision making,” says lead author Dr Rainer Froese, a fisheries scientist at GEOMAR. “Environmental factors such as warming waters and oxygen loss also play a role, but overfishing is so strong that it alone suffices to collapse stocks.” He adds: “We propose a new approach to EU fisheries management that would overcome the problems, be doable within existing legislation, and lead to profitable fisheries from healthy fish stocks within a few years.”

The European path to setting annual quotas

The EU’s Common Fisheries Policy (CFP) is based on the United Nations Convention on the Law of the Sea (UNCLOS), which states that fish populations are to be maintained or restored to levels that can support maximum sustainable catches. In northern Europe, this is implemented through legally binding total allowable catches (TACs), which are advised scientifically by the International Council for the Exploration of the Sea (ICES), an intergovernmental organization with working groups consisting mostly of scientists from national fisheries institutions. Based on this advice, the European Commission proposes annual quotas, which are then discussed with member states and stakeholders. Ultimately, the Council of EU Fisheries Ministers decides on the legally binding total allowable catch for the following year. Unfortunately, this process often results in quotas that have been increased at every step – with harmful consequences for fish stocks.

Mismanagement in the western Baltic Sea

The western Baltic Sea is a window into the dynamics between fish and fisheries – a relatively simple ecosystem for which extensive data are available, and which is fished solely under EU control.

“The western Baltic is dominated by three commercially important species: cod, herring and plaice,” explains Prof. Dr Thorsten Reusch, Head of the Marine Ecology Research Division at GEOMAR. “Long-standing overfishing of cod and herring has led to the recent collapse of these fisheries, whereas flatfish such as plaice, flounder and dab – which are in lower demand and fished less intensively – have shown stable or even increasing stock sizes.” Overall, in 2022, less than a tenth of what could have been sustainably caught from healthy stocks was actually landed. Reusch continues: “It’s the small-scale coastal fishers who are suffering the most, often without having done anything wrong, except perhaps relying on fishing associations that lobbied for unsustainable quotas.”

Systematic overestimation and phantom recoveries

In order to manage catches sustainably, the International Council for the Exploration of the Sea (ICES) advises on how much fish of a given species can be extracted annually without threatening the long-term viability of the stock.

However, ICES’s assessments repeatedly overpredicted stock sizes for the upcoming year for which sustainable catches were to be advised. These overly optimistic projections suggested that fish stocks were recovering and could support much higher catches, when, in reality, the stocks were stagnating or declining. “We’re talking about ‘phantom recoveries’,” says Froese, “recoveries that were predicted but never happened.”

The overfishing ratchet: when the system undermines its own goals

Building on the already too-high ICES advice, the European Commission often proposed even higher catch limits, which the ministers in the EU Council usually approved or sometimes increased further. As a result, official quotas permitted the capture of far more fish than the stocks could replenish – in some years, even more than there were fish in the water. The authors call this process the ‘overfishing ratchet’: like a mechanical ratchet, it only turns in one direction. Each step strongly favours higher catches, leading to total allowable catches (TACs) that often exceed what fishers are able to catch.
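The ratchet dynamic can be sketched with a toy surplus-production (Schaefer) model. The code below is purely illustrative – the growth rate, carrying capacity and quotas are invented numbers, not estimates for any real Baltic stock – but it shows how a quota held above the maximum sustainable yield steadily drains a stock that a sustainable quota would leave intact:

```python
# Toy Schaefer surplus-production model (illustrative only; the parameter
# values are hypothetical and not fitted to any real fish stock).
def simulate_stock(b0, r, k, quota, years):
    """Project biomass forward under a fixed annual catch quota.

    b0:    starting biomass
    r:     intrinsic growth rate
    k:     carrying capacity
    quota: annual total allowable catch (actual catch is capped at biomass)
    """
    biomass = b0
    trajectory = [biomass]
    for _ in range(years):
        growth = r * biomass * (1 - biomass / k)  # logistic surplus production
        catch = min(quota, biomass + growth)      # cannot land more than exists
        biomass = max(biomass + growth - catch, 0.0)
        trajectory.append(biomass)
    return trajectory

# In this model the maximum sustainable yield is r*k/4, taken at biomass k/2.
r, k = 0.4, 100.0
msy = r * k / 4

sustainable = simulate_stock(b0=50.0, r=r, k=k, quota=msy, years=30)
ratcheted = simulate_stock(b0=50.0, r=r, k=k, quota=1.5 * msy, years=30)

print(f"biomass after 30 years at the MSY quota: {sustainable[-1]:.1f}")
print(f"biomass after 30 years at 1.5x the MSY quota: {ratcheted[-1]:.1f}")
```

Any fixed quota above the maximum sustainable yield must eventually drive the modelled stock to zero, while a quota at or below it does not – the one-way motion the authors liken to a mechanical ratchet.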

As Froese notes: “Interestingly, actual catches often remained below these inflated quotas – simply because fishers stopped fishing when the cost of chasing the last fish exceeded the value of the catch.”

A new independent authority for ecosystem-based catch advice

The Common Fisheries Policy included an explicit deadline of 2020 to end overfishing – a goal that was clearly missed, as Thorsten Reusch points out. “Europe must play a leading role by making its own fisheries sustainable if it hopes to encourage other regions of the world to adopt sustainable fishing practices.” His appeal: “The EU must take its sustainability goals seriously and implement the CFP according to its stated objectives, urgently.”

To make the process more transparent and ensure accountability, the researchers propose creating a new politically independent institution with a clear mandate to provide robust scientific estimates of the highest sustainable annual catch for every stock, in line with ecosystem-based fisheries management (EBFM) principles. This would enable the EU to finally implement its own laws and effectively end overfishing.

Froese concludes: “To be successful, such an institution would need to operate with the same level of independence as a central bank.” He reiterates: “Implementing sound scientific advice can lead to highly profitable fisheries from large fish stocks in healthy European seas in many cases, and within a few years.”

 

Fully protected marine areas in Brazil are contaminated by microplastics



Researchers from the Federal University of São Paulo used oysters and mussels as sentinel organisms to assess the presence of these pollutants. The results show that even the most restrictive sites for human presence have significant contamination.




Fundação de Amparo à Pesquisa do Estado de São Paulo

Image: Abrolhos, one of the integral protection areas focused on in the study. Credit: Beatriz Zachello Nunes





Despite being considered sanctuaries for biodiversity, Brazil’s marine protected areas (MPAs) are not immune to microplastic contamination. A recent study has shown that even MPAs classified as integral protection areas (APIs), which are the most restrictive to human intervention, are contaminated by this material. The research, which involved Brazilian and Australian scientists, used bivalve mollusks (oysters and mussels) as sentinel organisms to assess contamination. The results were published in the journal Environmental Research.

“Our study showed that microplastic contamination occurs even in the most restrictive environmental protection areas. For example, in Atol das Rocas, where there’s no economic activity and tourists aren’t allowed to visit. Microplastics can reach places like this by being carried by the wind or ocean currents,” Ítalo Braga, coordinator of the research funded by FAPESP and professor at the Institute of Marine Science of the Federal University of São Paulo (IMar-UNIFESP) in Brazil, told Agência FAPESP.

Microplastics are particles ranging in size from 1 micron (1 μm) to 5 millimeters (5 mm) that result from the fragmentation of larger plastics or are directly manufactured in this format for industrial or cosmetic use. Those detected in the study showed consistent patterns along the Brazilian coast: predominantly black, white or transparent, and less than 1 millimeter in size.

The chemical analysis identified 59.4% of them, the main components being: alkyd polymers (28.1%), used in paints and varnishes, possibly from boats and tourist vessels; cellulose (21%), which may be of natural origin (plankton, algae, marine plants and terrestrial vegetation) or of anthropogenic origin (paper, cardboard, food waste, etc.); polyethylene terephthalate (PET) (14%), commonly found in plastic packaging and synthetic fibers, released in laundry and carried to the sea by urban runoff; and polytetrafluoroethylene (PTFE or Teflon) (12.3%), present in non-stick and industrial coatings. The remaining 40.6% could not be identified.

“Along the Brazilian coast, there are several protected areas with different levels of management. National parks, such as Abrolhos and Fernando de Noronha, are highly protected, while others, such as some APAs [environmental protection areas], allow some degree of human intervention. Our study focused on integral protection areas, called ‘no-takes’ in the specialized international literature, which are more restrictive marine protected areas. We selected ten of them: Jericoacoara National Park, Atol das Rocas, Fernando de Noronha, Rio dos Frades, Abrolhos, Tamoios, Alcatrazes, Guaraqueçaba, Carijós and Arvoredo,” says Braga.

Global measurements

The research, conducted by doctoral student Beatriz Zachello Nunes, showed that microplastics are present in all of these APIs, with an average concentration of 0.42 ± 0.34 particles per gram of wet tissue. Among the areas studied, the highest contamination was recorded in the Alcatrazes Archipelago Wildlife Refuge, with 0.90 ± 0.59 particles per gram, while the lowest concentration was found in the Atol das Rocas Biological Reserve, with 0.23 particles per gram.

“The positive thing is that pollution in all these areas is below the international average for marine protected areas [see figure below]. And well below the Brazilian average for non-protected areas. Heavily contaminated areas, such as Santos and some beaches in Rio de Janeiro, are 50 to 60 times more polluted. In fact, Santos has one of the highest concentrations of microplastics in the world,” says the researcher.

Bivalve mollusks (oysters, clams, mussels and others), which get their name from having a shell divided into two parts, or two articulated valves, were chosen for the study because they are considered sentinels of the sea. “They feed by filtering seawater. The food in the water is retained in their gills, which act as sieves. And tiny cilia carry it to their stomachs. If that water contains contaminants, such as microplastics, the bivalves will retain them as well. So instead of taking water samples, which vary all the time, we analyze the bivalves because they accumulate pollutants over time and provide a more reliable history of contamination,” Braga explains.

The results of the study show that plastic pollution is present even in the most restrictive environmental protection areas, with potential risks to marine ecosystems and food chains. “The creation of MPAs alone isn’t enough to stop pollution. It’s essential that these areas have efficient environmental management and strict enforcement. But even this isn’t enough if we consider that the microplastics may not be generated locally, but brought in from afar by the atmosphere and ocean currents. To mitigate this, only global measures, such as the Global Plastics Treaty currently being negotiated and developed under the coordination of the United Nations Environment Program [UNEP], can make a difference,” concludes the researcher.

About FAPESP

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the state of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration.