Saturday, April 27, 2024

 

Making transfusion-transmitted malaria in Europe a thing of the past



The malaria parasite is found in red blood cells and can be transmitted through blood, organ and tissue donations, or the shared use of needles or syringes.



EUROPEAN SOCIETY OF CLINICAL MICROBIOLOGY AND INFECTIOUS DISEASES




Individuals requiring blood transfusion are a vulnerable population, often with debilitating conditions like cancer, and transfusion-transmitted malaria is often fatal.

Current serological tests used to identify “at risk” donors are not sensitive enough to completely eliminate malaria transfusion risk.

The current challenges are to optimise the testing strategy to minimise the loss of blood products (especially those with rare blood phenotypes) and to enhance screening sensitivity for infectious immuno-tolerant donors, who are difficult to identify.

**ECCMID has now changed its name to ESCMID Global, please credit ESCMID Global Congress (formerly ECCMID, Barcelona, Spain, 27-30 April) in all future stories**

The current strategy used in Europe to mitigate malaria transfusion risk is effective, with just 10 reported cases over the past 20 years. However, current serological tests used to identify “at risk” donors are not sensitive enough to completely eliminate the risk. In a presentation to be given at this year’s ESCMID Global Congress (formerly ECCMID) in Barcelona, Spain (27-30 April), Dr Sophie Le Cam of the French blood transfusion service (Etablissement Français du Sang [EFS]) will discuss the ongoing efforts being made to prevent transfusion-associated malaria in Europe and outline the different malaria screening strategies that need to be combined to ensure the safety of blood transfusions going forward.

Transfusion-transmitted malaria (TTM) is an accidental Plasmodium infection caused by transfusion of whole blood or a blood component from a malaria-infected donor to a recipient. Just a few parasites in a unit of blood are enough to cause infection, and all Plasmodium species are able to survive in stored blood, even if frozen, retaining their viability for at least a week. Ten cases of TTM have been reported in Europe over the last 20 years, in France, Spain, the UK, Italy, and the Netherlands.

Dr Le Cam highlights an important difference between natural malaria infection and TTM: “When individuals are infected naturally with malaria from a mosquito bite, they undergo an initial asymptomatic phase which allows immune cells to be activated against malaria parasites. But infected blood transfusions release malaria parasites directly into the bloodstream, triggering high-risk complications that can potentially lead to a fatal outcome, especially in non-endemic countries where the majority of individuals have never been exposed to malaria, and in immuno-compromised patients such as those with cancer and the elderly.”

Europe has a mandatory directive to prevent TTM, which recommends that people who have recently travelled to a country where malaria is present cannot donate blood for at least 6 months after their return, and that former residents of malaria-endemic areas cannot donate for at least 3 years.

In many countries, this deferral period can be reduced to 4 months if a negative malaria test is provided before each donation. However, the existing microscopy and serological tests used to implement these rules are not sensitive enough to reliably mitigate malaria transfusion risk. Microscopic examination is the “gold standard” for diagnosing malaria illness, but it is not suited to blood bank operations in Europe. Serological tests are widely used, but their sensitivity and specificity are not as good as expected.

According to Dr Le Cam, “The risk that is most difficult to mitigate comes from donors born, or who lived their early childhood, in malaria-endemic countries, who can develop an immune tolerance: a host response that protects against high numbers of parasites and illness without eliminating the infection.” In these infectious immune-tolerant individuals, cases of TTM have been linked to blood donations given more than 5 years after the donor’s last potential exposure to P. falciparum, and several decades in the case of P. malariae.

As Dr Le Cam explains, “These asymptomatic infections characterised by low parasite densities require more sensitive methods for detection such as recent molecular methods. But while a fully automated molecular method may be the ideal screening method for malaria infection in the blood donor population, it is an expensive option.”

Le Cam sees the central challenge in optimising the testing strategy in non-endemic countries as enhancing screening sensitivity for immuno-tolerant donors without compromising the availability of blood products. “On one hand, the limited number of potentially infected donors requires a cost-effective strategy for blood donor screening; on the other, the accuracy of screening needs to be optimal given the serious outcomes of TTM in malaria-naïve and immuno-compromised recipients”, she says.

What needs to be done to make TTM a thing of the past in Europe? According to Dr Le Cam, “The key to transfusion security remains the deferral period after donors return from endemic countries. But we really need to develop new testing strategies. Parasite inactivation of blood using new technologies that can selectively inactivate pathogens without damaging cells or plasma could also be a good option, but the technology is not completely reliable for packed red blood cells and is very expensive. Ultimately, different strategies need to be combined to ensure the safety of blood transfusions in Europe, including blood donor screening with appropriate diagnostic tools, which should probably include molecular tests.”

 

 

Vitamin D alters mouse gut bacteria to give better cancer immunity


THE FRANCIS CRICK INSTITUTE




Researchers at the Francis Crick Institute, the National Cancer Institute (NCI) of the U.S. National Institutes of Health (NIH) and Aalborg University in Denmark have found that vitamin D encourages the growth of a type of gut bacteria in mice which improves immunity to cancer.

Reported today in Science, the researchers found that mice given a diet rich in vitamin D had better immune resistance to experimentally transplanted cancers and improved responses to immunotherapy treatment. This effect was also seen when gene editing was used to remove a protein that binds to vitamin D in the blood and keeps it away from tissues.

Surprisingly, the team found that vitamin D acts on epithelial cells in the intestine, which in turn increase the amount of a bacterium called Bacteroides fragilis. This microbe gave mice better immunity to cancer, as the transplanted tumours didn’t grow as much, but the researchers are not yet sure how.

To test if the bacteria alone could give better cancer immunity, mice on a normal diet were given Bacteroides fragilis. These mice were also better able to resist tumour growth but not when the mice were placed on a vitamin D-deficient diet.

Previous studies have proposed a link between vitamin D deficiency and cancer risk in humans, although the evidence hasn’t been conclusive.

To investigate this, the researchers analysed a dataset from 1.5 million people in Denmark [1], which highlighted a link between lower vitamin D levels and a higher risk of cancer. A separate analysis of a cancer patient population also suggested that people with higher vitamin D levels [2] were more likely to respond well to immune-based cancer treatments.

Although Bacteroides fragilis is also found in the microbiome in humans, more research is needed to understand whether vitamin D helps provide some immune resistance to cancer through the same mechanism.

Caetano Reis e Sousa, head of the Immunobiology Laboratory at the Crick, and senior author, said: “What we’ve shown here came as a surprise – vitamin D can regulate the gut microbiome to favour a type of bacteria which gives mice better immunity to cancer.

“This could one day be important for cancer treatment in humans, but we don’t know how and why vitamin D has this effect via the microbiome. More work is needed before we can conclusively say that correcting a vitamin D deficiency has benefits for cancer prevention or treatment.”

Evangelos Giampazolias, former postdoctoral researcher at the Crick, and now Group Leader of the Cancer Immunosurveillance Group at the Cancer Research UK Manchester Institute, said: “Pinpointing the factors that distinguish a ‘good’ from a ‘bad’ microbiome is a major challenge. We found that vitamin D helps gut bacteria to elicit cancer immunity improving the response to immunotherapy in mice.

“A key question we are currently trying to answer is how exactly vitamin D supports a ‘good’ microbiome. If we can answer this, we might uncover new ways in which the microbiome influences the immune system, potentially offering exciting possibilities in preventing or treating cancer.”

Romina Goldszmid, Stadtman Investigator in NCI’s Center For Cancer Research, said: “These findings contribute to the growing body of knowledge on the role of microbiota in cancer immunity and the potential of dietary interventions to fine-tune this relationship for improved patient outcomes. However, further research is warranted to fully understand the underlying mechanisms and how they can be harnessed to develop personalized treatment strategies.”

This research was funded by Cancer Research UK, the UK Medical Research Council, the Wellcome Trust, an ERC Advanced Investigator grant, a Wellcome Investigator Award, a prize from the Louis-Jeantet Foundation, the Intramural Research Program of the NCI, part of the National Institutes of Health, CCR-NCI and the Danish National Research Foundation.

Research Information Manager at Cancer Research UK, Dr Nisharnthi Duggan said: “We know that vitamin D deficiency can cause health problems; however, there isn’t enough evidence to link vitamin D levels to cancer risk. This early-stage research in mice, coupled with an analysis of Danish population data, seeks to address the evidence gap. While the findings suggest a possible link between vitamin D and immune responses to cancer, further research is needed to confirm this.

“A bit of sunlight can help our bodies make vitamin D but you don’t need to sunbathe to boost this process. Most people in the UK can make enough vitamin D by spending short periods of time in the summer sun. We can also get vitamin D from our diet and supplements. We know that staying safe in the sun can reduce the risk of cancer, so make sure to seek shade, cover up and apply sunscreen when the sun is strong.”

-ENDS-

For further information, contact: press@crick.ac.uk or +44 (0)20 3796 5252

Notes to Editors

Reference: Giampazolias, E. et al. (2024). Vitamin D regulates microbiome-dependent cancer immunity. Science. DOI: 10.1126/science.adh7954.

  1. The advantage of using a Danish cohort was largely similar ancestry (approx. 86% are of Danish descent) and a ‘vitamin D winter’ due to the northerly latitude of Denmark, which means a lower rate of vitamin D synthesis through sunlight on the skin.
  2. Higher vitamin D levels were inferred indirectly from a ‘vitamin D gene signature’, a combined measure of vitamin D-related gene activity in the body. This was used because there are multiple forms of vitamin D, and direct measurements can be arbitrary and prone to error. Patients with stronger gene signatures responded better to immune checkpoint therapy.

The Francis Crick Institute is a biomedical discovery institute dedicated to understanding the fundamental biology underlying health and disease. Its work is helping to understand why disease develops and to translate discoveries into new ways to prevent, diagnose and treat illnesses such as cancer, heart disease, stroke, infections, and neurodegenerative diseases.

An independent organisation, its founding partners are the Medical Research Council (MRC), Cancer Research UK, Wellcome, UCL (University College London), Imperial College London and King’s College London.

The Crick was formed in 2015, and in 2016 it moved into a brand new state-of-the-art building in central London which brings together 1500 scientists and support staff working collaboratively across disciplines, making it the biggest biomedical research facility under a single roof in Europe.

http://crick.ac.uk/

 

Negativity about vaccines surged on Twitter after COVID-19 jabs became available


Number of negative tweets rose by 27%



EUROPEAN SOCIETY OF CLINICAL MICROBIOLOGY AND INFECTIOUS DISEASES






It’s time to take a new approach to addressing negative messaging about vaccines, including avoiding the use of the term “anti-vaxxers”, say the researchers.

**ECCMID has now changed its name to ESCMID Global, please credit ESCMID Global Congress in all future stories**

There was a marked increase in negativity about vaccines on Twitter after COVID-19 vaccines became available, the ESCMID Global Congress (formerly ECCMID) in Barcelona, Spain (27-30 April) will hear.

The analysis also found that spikes in the number of negative tweets coincided with announcements from governments and healthcare authorities about vaccination.


“Vaccines are one of humanity's greatest achievements,” explains lead researcher Dr Guillermo Rodriguez-Nava, of Stanford University School of Medicine, Stanford, USA.

“They have the potential to eradicate dangerous diseases such as smallpox, prevent deaths from diseases with 100% mortality rates, like rabies, and prevent cancers such as those caused by HPV.

“Moreover, vaccines can prevent complications from diseases for which we have limited treatment options, such as influenza and COVID-19, but there has been growing opposition to their use in recent years.

“The damage caused by negative voices is already apparent, with clusters of measles re-emerging in countries where it was previously considered eradicated.

“This situation harms children who cannot make decisions for themselves regarding vaccines, as well as immunocompromised patients who are unable to get vaccinated.”

Dr Rodriguez-Nava and colleagues analysed the impact of the introduction of COVID-19 vaccines on the sentiment of vaccine-related posts on Twitter.

Open-source software (the Snscrape library in Python) was used to download tweets with the hashtag “vaccine” from January 1 2018 to December 31 2022.
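The release names the tool but not the exact query, so the following is a rough sketch only: it assumes snscrape’s standard Twitter search interface and a simple date-bounded hashtag query (note, too, that snscrape’s Twitter modules have been unreliable since the platform’s 2023 API changes).

```python
# Hedged sketch: pulling "#vaccine" tweets with snscrape. The study's real
# query, filters and fields are not public, so the details below are assumed.
import itertools
import snscrape.modules.twitter as sntwitter

query = "#vaccine since:2018-01-01 until:2022-12-31"

tweets = []
# islice caps the run for demonstration; the study collected ~568,000 tweets
for tweet in itertools.islice(sntwitter.TwitterSearchScraper(query).get_items(), 1000):
    # .rawContent in recent snscrape releases (.content in older ones)
    tweets.append({"date": tweet.date, "text": tweet.rawContent})

print(f"collected {len(tweets)} tweets")
```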

Cutting-edge AI methods were then used to perform sentiment analysis and classify the tweets as having either positive or negative sentiment. Finally, modelling techniques were used to create a “counterfactual scenario”, showing what the pattern of tweets would have looked like if COVID vaccines hadn’t been introduced in December 2020.
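The modelling technique behind the counterfactual is not specified in the release. Purely as an illustration, one common approach is to fit a seasonal time-series model to the pre-vaccine months and then forecast the post-vaccine period as if the roll-out had never happened; the SARIMAX orders below are assumptions, not the researchers’ settings.

```python
# Illustrative counterfactual: train on pre-vaccine months, then forecast
# the post-vaccine period. SARIMAX is a stand-in for the study's unnamed method.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def counterfactual_excess(monthly_counts: pd.Series) -> float:
    """monthly_counts: negative-tweet counts indexed by month, 2018-2022."""
    pre = monthly_counts.loc[:"2020-11"]       # training window: before vaccines
    post = monthly_counts.loc["2020-12":]      # observed post-vaccine counts
    fit = SARIMAX(pre, order=(1, 1, 1),
                  seasonal_order=(1, 0, 0, 12)).fit(disp=False)
    expected = fit.forecast(steps=len(post))   # counterfactual trajectory
    return post.sum() - expected.sum()         # excess attributable to roll-out
```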

A total of 567,915 tweets were extracted and analysed; the machine learning algorithm classified 458,045 as negative and 109,870 as positive. Tweets that were negative in sentiment predominated both before and after vaccines became available.

Negative tweets included: “The EU Commission should immediately terminate contracts for new doses of fake #vaccines against #COVID19 and demand the return of the 2.5 billion euros paid so far. Everyone who lied that #vaccines prevent the spread of the virus must be held accountable.”

Positive tweets included one that marked a baby receiving some of its childhood vaccinations and read: “Two month shots! #vaccines are always a reason to celebrate in our house. #VaccinesWork.” (For more examples, see link in “notes to editors”.)

After COVID vaccines were introduced, there was a marked increase in the number of tweets about vaccines, with 10,201 more vaccine-related tweets per month, on average, than would be expected if vaccination hadn’t started.

There was also a marked increase in negativity. There were 310,508 tweets (approx. 12,420 a month on average) with negative sentiment after December 11 2020.  This is 27% more than the 244,635 (9,785 a month) that would be expected if COVID vaccination hadn’t started.
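Those headline figures are internally consistent, as a quick check shows (assuming the post-vaccine window spans the 25 months from December 2020 through December 2022):

```python
observed = 310_508   # negative tweets recorded after 11 December 2020
expected = 244_635   # modelled count had COVID vaccination not started
months = 25          # December 2020 through December 2022, inclusive

print(observed / months)                 # ~12,420 negative tweets a month
print(expected / months)                 # ~9,785 a month
print((observed - expected) / expected)  # ~0.269, the reported 27% rise
```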

The proportion of positive tweets fell from 20.3% to 18.8% after the introduction of COVID vaccines and the percentage of negative tweets rose from 79.6% to 81.1%.

Spikes in negative activity coincided with announcements about vaccination. For example, the highest number of negative tweets was in April 2021, the month the White House announced that all people aged 16 and older would be eligible for the COVID-19 vaccine. 

The lowest number of negative tweets after the introduction of COVID-19 vaccines was in April 2022, the month Elon Musk’s deal to acquire Twitter was announced. While it isn’t known why this was, it may have been part of a seasonal pattern (the number of negative tweets tended to be highest in the winter). It’s also possible that Twitter users were focussing on the changes to the platform expected under the new ownership, says Dr Rodriguez-Nava.

The researchers conclude: “Negative sentiments toward vaccines were already prominent on social media prior to the arrival of COVID-19 vaccines. The introduction of these vaccines significantly increased the negative sentiments on X, formerly Twitter, regarding vaccines.”

Dr Rodriguez-Nava says: “Social media has the power to exponentially amplify health messages, both beneficial and harmful, and is an arena in which political figures, actors, singers, personalities and other ‘influencers’ outnumber healthcare voices.

“Unfortunately, in some countries, negative sentiments toward vaccines are not only health-related but also religious and political.

“This is a complex issue, with no easy solution, but we do need to change our approach because it is clearly not working.

“This begins with avoiding derogatory terms such as 'anti-vaxxers,' and perhaps even 'misinformation,' and approaching these individuals in a more respectful and understanding manner.

“Additionally, healthcare leaders should dedicate more effort to collaborating with social media influencers, religious leaders and lawmakers, who may be more trusted by their communities than healthcare professionals and more effective in amplifying a positive message.

“Social media companies also have a role to play.  However, this is also a complex issue because each company may have different values and attitudes to free speech and countries may have different laws for free speech.”

 

 

Anthropologist documents how women and shepherds historically reduced wildfire risk in Central Italy




UNIVERSITY OF CALIFORNIA - SANTA CRUZ




In the last several decades, large forest fires have increasingly threatened communities across the Mediterranean. Climate change is expected to make these fires larger, hotter, and more dangerous in the future. But fire management lessons from the past could help to improve the resilience of local landscapes. 

The latest research paper from environmental anthropologist and University of California, Santa Cruz Professor Andrew Mathews explores these issues in the Monte Pisano region of Central Italy. In particular, Mathews found that peasant women, who historically collected leaf litter in the forests, and shepherds, who grazed their flocks and conducted occasional managed burns, were critical in maintaining fire-resistant landscapes. Yet the social status of these groups meant the importance of their work went unrecognized.

In Monte Pisano and much of the broader Mediterranean, forests and other plant communities have been shaped by thousands of years of intensive human management of the land. But migration to cities since the 1960s has left rural lands increasingly abandoned. And without people to maintain them, local forests have become overgrown with highly flammable brush. 

At the same time, many traditional rural land management practices that may have once reduced fire risk in the region have been systematically ignored and even criminalized over the years, to the point where they have been all but forgotten. 

Luckily, though, there are a few people who still remember. Mathews and his research team sought out elderly people who were born between 1928 and 1956 in the Monte Pisano region and conducted oral history interviews to learn about traditional land management practices. In particular, the researchers asked about activities like collecting leaf litter, livestock grazing, and managed burning, which historical records suggested may have once been common. 

“The people we interviewed were actually kind of excited to tell us these stories,” Mathews said. “Most people don't really ask them detailed questions about their daily lives from when they were younger, so they enjoyed retelling the stories, and they were such brilliant, thoughtful, interesting people. They were a lot of fun to talk to.”

Research participants described how forests were once full of human activity. Leaves were raked for use as stable bedding and fertilizer for olive groves. Logs and brush were collected for firewood and kindling. People gathered herbs, berries, and mushrooms in the forest, and sheep ate the grasses. Every scrap of wood or vegetation had a use, so the forest floor was almost bare in some places, and forests had an open, park-like appearance. 

Meanwhile, in nearby pastures and olive groves, the buildup of grasses, brush, and brambles was kept under control through a combination of livestock grazing, manual brush cutting, and occasional managed burning. And whenever a wildfire sprang up in the forest, someone was always nearby to quickly extinguish it. 

These historical accounts of the landscape were “an extraordinary difference” from what Mathews observed during forest transect walks in Monte Pisano in 2014. He and a botanist assistant recorded dense scrub and thick leaf litter, plus abundant fallen branches and brush that could easily act as “ladder fuels,” enabling flames to spread from the forest floor to the treetops.

Mathews wanted to estimate how much of this difference between modern and historical landscapes could reasonably be attributed to past land management practices. Since sheep were central to many of those practices—like leaf litter raking for stable bedding and grazing herds of sheep in forests—he started by comparing accounts from his oral history interviews with historical agricultural records to estimate the historical number of sheep per hectare of land in the region. 

A prior study had modeled historical biomass removal in the Valais region of Switzerland for similar activities and a roughly comparable sheep-to-land ratio. So, based on the lowest estimates from that prior research, Mathews calculated that leaf litter raking alone could have historically removed about 30-40% of the vegetation produced annually within the forest, with additional vegetation removal resulting from grazing, firewood collecting, and other activities.

These effects would have dramatically altered the landscape, leaving very little fuel for forest fires. Yet Mathews found that most people in the region today have very little awareness of these traditional land management practices that historically reduced fire risk. The research team interviewed local residents, firefighters, and government officials and observed community events to see what people understood about the history of local fire management. 

“There was almost a complete disconnect,” Mathews said. “People have a general idea that landscape abandonment is a problem, but most have no idea that there was a history of controlled burning and care that made the landscape less flammable.”

The causes of this collective forgetting are rooted in historical politics of classism and sexism, Mathews’ research suggests. 

Leaf litter raking and other land management activities were conducted by peasants, and oral history interviews further showed that it was typically women and children who did this work. The state considered peasant practices to be backwards and outdated amidst a push for agricultural modernization. And forestry policies that focused heavily on timber production led to the banning and stigmatization of traditional managed burning. 

“Leaf litter raking was disregarded by the state because no one was earning money from it, and it was ‘women’s work’ being done by ‘unimportant’ people,” Mathews explained. “Similarly, shepherds, who were often the ones doing managed burning, have a long history of being stereotyped and regarded with suspicion across the Mediterranean. So the government never understood what they were doing or thought it was helpful.”

Mathews believes that communities around the world can learn from Monte Pisano’s traditional fire management practices, as well as from the consequences of forgetting them. He says that landscape abandonment similar to what took place in Italy in the mid- to late-20th century is currently happening in parts of Africa and South America. With that, there’s been a decrease in traditional managed burning on a global scale.

“We tend to think of fire as increasing around the world due to climate change, but at the same time, these traditional types of smaller, controlled fires are actually decreasing,” he said. “We should think hard about the impacts of eliminating agropastoral burning, because, in the end, it's likely to come back and bite us in the form of much larger fires.” 

New offshore wind turbines can take away energy from existing ones



UNIVERSITY OF COLORADO AT BOULDER
Image: A wind lidar for collecting data on wind energy, weather and air movements. A CU Boulder team set up the wind measurement device near Rhode Island in December 2023. Credit: Julie Lundquist/CU Boulder




As summer approaches, electricity demand surges in the U.S., as homes and businesses crank up the air conditioning. To meet the rising need, many East Coast cities are banking on offshore wind projects the country is building in the Atlantic Ocean.

For electric grid operators, knowing how much wind power these offshore turbines can harvest is critical, but making accurate predictions can be difficult. A team of scientists at the University of Colorado Boulder and their collaborators are working to tackle the challenge. 

In a new paper published March 14 in the journal Wind Energy Science, a team led by Dave Rosencrans, a doctoral student, and Julie K. Lundquist, a professor in the Department of Atmospheric and Oceanic Sciences, estimates that offshore wind turbines in the Atlantic Ocean region, where the U.S. plans to build large wind farms, could take wind away from other turbines nearby, potentially reducing the farms’ power output by more than 30%.

Accounting for this so-called “wake effect,” the team estimated that the proposed wind farms could still supply approximately 60% of the electricity demand of the New England grid, which covers Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, and Vermont.

“The U.S. is planning to build thousands of offshore wind turbines, so we need to predict when those wakes will be expensive and when they have little effect,” said Lundquist, who is also a fellow at CU Boulder’s Renewable and Sustainable Energy Institute. 

Understanding the wake effect

When wind passes through turbines, the ones at the front, or upstream, extract some energy from the wind. As a result, the wind slows down and becomes more turbulent behind the turbines. This means the turbines downstream get slower wind, sometimes resulting in lower power generation.
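The study itself relies on detailed atmospheric simulations, but the textbook Jensen (Park) wake model captures the basic mechanics. The sketch below is a generic illustration rather than the paper’s method; the rotor size, thrust coefficient and decay constant are all assumed values.

```python
# Generic Jensen (Park) wake model: why downwind turbines see slower air.
# This is an illustration, not the atmospheric model used in the study.
import math

def jensen_deficit(x_m, rotor_radius=110.0, ct=0.8, k=0.04):
    """Fractional wind-speed deficit a distance x_m (metres) downwind.
    ct: turbine thrust coefficient (assumed); k: wake decay constant,
    smaller offshore than onshore, and smaller still in stable marine air,
    which is why wakes can persist for tens of kilometres."""
    return (1 - math.sqrt(1 - ct)) / (1 + k * x_m / rotor_radius) ** 2

v_free = 10.0  # free-stream wind speed in m/s (assumed)
for km in (1, 5, 10, 55):
    v = v_free * (1 - jensen_deficit(km * 1000))
    loss = 1 - (v / v_free) ** 3  # power scales with the cube of wind speed
    print(f"{km:>2} km downwind: wind {v:.2f} m/s, ~{loss:.0%} power loss")
```

Because power scales with the cube of wind speed, even a modest wake deficit translates into a disproportionate loss of generation.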

The wake effect is particularly prominent offshore, because there are no houses or trees that stir up the air, which helps dissipate the wakes, said Rosencrans, the paper’s first author. 

Using computer simulations and observational data of the atmosphere, the team calculated that the wake effect reduces total power generation by 34% to 38% at a proposed wind farm off the East Coast. Most of the reduction comes from wakes formed between turbines within a single farm. 

But under certain weather conditions, wakes could reach turbines as far as 55 kilometers downwind and affect other wind farms. For example, during hot summer days, the airflow over the cool sea surface tends to be relatively stable, causing wakes to persist for longer periods and propagate over longer distances. 

“Unfortunately, summer is when there's a lot of electrical demand,” Rosencrans said. “We showed that wakes are going to have a significant impact on power generation. But if we can predict their effects and anticipate when they are going to happen, then we can manage them on the electrical grid.” 

A balancing act

In early 2024, five looming wind turbines off the coast of Massachusetts from the country’s first large-scale offshore wind project delivered the first batch of wind power to the New England grid. More turbines are under construction off the coasts of Rhode Island, Virginia and New York. The Biden Administration has set a goal to install 30 gigawatts of offshore wind capacity by 2030, which is enough to power more than 10 million homes for a year.

Compared with energy sources derived from fossil fuels, wind and solar power tend to be variable, because the sun doesn’t always shine and the wind doesn’t always blow. 

This variability creates a challenge for grid operators, said Lundquist. The power grid is a complex system that requires a perfect balance of supply and demand in real-time. Any imbalances could lead to devastating blackouts, like what happened in Texas in 2021 when power outages killed nearly 250 people. 

As the country continues to expand renewable energy projects and integrates more clean electricity into the power system, grid operators need to know precisely how much energy from each renewable source they can count on.

To better understand how the wind blows in the proposed wind farm area, Lundquist’s team visited islands off the New England coast and installed a host of instruments last December as part of the Department of Energy's Wind Forecast Improvement Project 3. The project is a collaboration of researchers from CU Boulder, the Woods Hole Oceanographic Institution and several national laboratories.

The instruments, including weather monitors and radar sensors, will collect data for the next year or more. Previously, offshore wind power prediction models usually relied on intermittent data from ships and satellite observations. The hope is that with continuous data directly from the ocean, scientists can improve prediction models and better integrate more offshore wind energy into the grid. 

In addition to the growing demand for air conditioning and heat pumps, electricity consumption in the U.S. has been rising rapidly in recent years because of the increasing prevalence of electric vehicles, data centers and manufacturing facilities. Over the next five years, analyses project that electricity demand in the U.S. will increase by nearly 5%, a substantial increase compared with the estimated annual growth rate of 0.5% over the past decade. 

“We need a diverse mix of clean energy sources to meet the demand and decarbonize the grid,” Lundquist said. “With better predictions of wind energy, we can achieve more reliance on renewable energy.”

 

Curiosity promotes biodiversity




UNIVERSITY OF BASEL
Image: Cyphotilapia gibberosa is one of the particularly curious cichlids in Lake Tanganyika. Credit: University of Basel, Adrian Indermaur




Exploratory behavior is one of the fundamental personality traits of animals – and these traits influence their probability of survival, among other things. For example, curious individuals can inhabit different areas in their habitats compared to more cautious conspecifics. At the same time, however, they expose themselves to a greater risk of being discovered and eaten.

Exploratory behavior as a factor in evolution

The cichlids of Africa’s Lake Tanganyika exhibit extraordinary diversity in terms of shape, diet, habitat and coloration. This allows them to inhabit various ecological niches and therefore to compete less with one another. Researchers have long suspected that curiosity, too, acts as a driver in the formation of new species and therefore of biodiversity. Now, a research team led by Professor Walter Salzburger from the University of Basel has used the example of the extremely diverse cichlid fishes of Lake Tanganyika to investigate the role of behavioral differences in adaptation to different ecological niches.

For a total of nine months, first author Dr. Carolin Sommer-Trembo recorded the “exploratory behavior” of 57 different cichlid species at the southern shore of Lake Tanganyika in Zambia. To this end, the zoologist made video recordings of how the approximately 700 cichlids caught in the lake behaved in a new environment in the form of large experimental ponds. She then released the animals back into the wild.

Back in Basel, Sommer-Trembo used these videos to determine which areas of the experimental pond each fish explored within a 15-minute period. “On the whole, large differences in exploratory behavior were observed between the cichlid species, and these differences were also confirmed under laboratory conditions,” says the evolutionary biologist. Detailed analyses of the data revealed a strong correlation between exploratory behavior and the habitat – and body shape – of the respective cichlid species. For example, species that live near the shore and have a bulky body shape are more curious than elongated species living in open water. “This puts the focus back on animal behavior as a driving force behind key evolutionary processes,” says Sommer-Trembo.

Specific mutations make the fish more curious

In order to investigate the genetic basis of the observed behavioral differences in cichlids, the research team worked together with Dr. Milan Malinsky from the University of Bern to develop a new method for analyzing the existing genomes that allowed them to compare data across different species.

Using their new method, the researchers identified a genetic variant in the genome of cichlids that showed a near perfect correlation with exploratory behavior: species with a “T” at this specific position in the DNA are curious, whereas species with a “C” are less exploratory.

When the researchers used the “genetic scissors” CRISPR-Cas9 to induce targeted mutations in the corresponding region of the genome, the exploratory behavior of the fish changed – they became more curious. Moreover, the team was able to use artificial intelligence and information about the genetic variant, body structure and habitat to predict the exploratory behavior of cichlid species that, initially, had not been examined for their exploratory behavior.
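The release does not describe the AI model behind those predictions. As a loose sketch of the idea only, a regressor trained on genotype, body shape and habitat might be set up along these lines; every feature encoding and number below is hypothetical.

```python
# Hypothetical stand-in for the prediction step; the actual model, features
# and data are not described in the release.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# One row per species: [variant (1 = "T" allele, 0 = "C"),
#                       body elongation, shore habitat (1 = yes, 0 = open water)]
X = np.array([[1, 0.2, 1], [0, 0.8, 0], [1, 0.3, 1], [0, 0.7, 0]])
y = np.array([0.9, 0.2, 0.8, 0.3])  # measured exploratory scores (made up)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([[1, 0.4, 1]]))  # predicted score for an unexamined species
```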

Implications for human behavior?

The genetic variant identified by the researchers is located in the immediate vicinity of the gene cacng5b, which shows activity in the brain. This is the “fishy” version of a gene that is also found in other vertebrates. For example, the human variant is associated with psychiatric diseases such as schizophrenia and bipolar disorder, which may in turn be correlated with personality disorders.

“We’re interested in how personality traits can affect mechanisms of biodiversity in the animal kingdom,” says Sommer-Trembo. “But who knows: ultimately, we might also learn something about the foundations of our own personality.”

Image: Individuals of the species Cyprichromis coloratus show a medium level of curiosity.

 

Climate change could become the main driver of biodiversity decline by mid-century



Largest modelling study of its kind, published in Science



GERMAN CENTRE FOR INTEGRATIVE BIODIVERSITY RESEARCH (IDIV) HALLE-JENA-LEIPZIG

Image: Bird before the sun. Climate change could become the main driver of biodiversity decline by mid-century, according to a new study published in Science. Credit: Oliver Thier




Global biodiversity has declined between 2% and 11% during the 20th century due to land-use change alone, according to a large multi-model study published in Science. Projections show climate change could become the main driver of biodiversity decline by the mid-21st century. 

The analysis was led by the German Centre for Integrative Biodiversity Research (iDiv) and the Martin Luther University Halle-Wittenberg (MLU) and is the largest modelling study of its kind to date. The researchers compared thirteen models for assessing the impact of land-use change and climate change on four distinct biodiversity metrics, as well as on nine ecosystem services.

GLOBAL BIODIVERSITY MAY HAVE DECLINED BY 2% TO 11% DUE TO LAND-USE CHANGE ALONE

Land-use change is considered the largest driver of biodiversity change, according to the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES). However, scientists are divided over how much biodiversity has changed in past decades. To better answer this question, the researchers modelled the impacts of land-use change on biodiversity over the 20th century. They found global biodiversity may have declined by 2% to 11% due to land-use change alone. This span covers a range of four biodiversity metrics [1] calculated by seven different models.

“By including all world regions in our model, we were able to fill many blind spots and address criticism of other approaches working with fragmented and potentially biased data”, says first author Prof Henrique Pereira, research group head at iDiv and MLU. “Every approach has its upsides and downsides. We believe our modelling approach provides the most comprehensive estimate of biodiversity trends worldwide.”

MIXED TRENDS FOR ECOSYSTEM SERVICES

Using another set of five models, the researchers also calculated the simultaneous impact of land-use change on so-called ecosystem services, i.e., the benefits nature provides to humans. In the past century, they found a massive increase in provisioning ecosystem services, like food and timber production. By contrast, regulating ecosystem services, like pollination, nitrogen retention, or carbon sequestration, moderately declined.

CLIMATE AND LAND-USE CHANGE COMBINED MIGHT LEAD TO BIODIVERSITY LOSS IN ALL WORLD REGIONS

The researchers also examined how biodiversity and ecosystem services might evolve in the future. For these projections, they added climate change as a growing driver of biodiversity change to their calculations.

Climate change stands to put additional strain on biodiversity and ecosystem services, according to the findings. While land-use change remains relevant, climate change could become the most important driver of biodiversity loss by mid-century. The researchers assessed three widely used scenarios, ranging from a sustainable development scenario to a high-emissions scenario. For all scenarios, the combined impacts of land-use change and climate change result in biodiversity loss in all world regions.

While the overall downward trend is consistent, there are considerable variations across world regions, models, and scenarios.

PROJECTIONS ARE NOT PREDICTIONS

“The purpose of long-term scenarios is not to predict what will happen,” says co-author Dr Inês Martins from the University of York. “Rather, it is to understand alternatives, and therefore avoid these trajectories, which might be least desirable, and select those that have positive outcomes. Trajectories depend on the policies we choose, and these decisions are made day by day.” Martins co-led the model analyses and is an alumna of iDiv and MLU.

The authors also note that even the most sustainable scenario assessed does not deploy all the policies that could be put in place to protect biodiversity in the coming decades. For instance, bioenergy deployment, a key component of the sustainability scenario, can help mitigate climate change but can simultaneously reduce species habitats. By contrast, measures to increase the effectiveness and coverage of protected areas, or large-scale rewilding, were not explored in any of the scenarios.

MODELS HELP IDENTIFY EFFECTIVE POLICIES

Assessing the impacts of concrete policies on biodiversity helps identify those policies most effective for safeguarding and promoting biodiversity and ecosystem services, according to the researchers. “There are modelling uncertainties, for sure”, Pereira adds. “Still, our findings clearly show that current policies are insufficient to meet international biodiversity goals. We need renewed efforts to make progress against one of the world’s largest problems, which is human-caused biodiversity change.”

[1] Global species richness, local species richness, mean species habitat extent, biodiversity intactness