Thursday, April 17, 2025

 

The eukaryotic cell emerged as an evolutionary algorithmic phase transition




Eukaryogenesis happened abruptly, driven by the tension between increasing gene lengths and the constraints on producing longer proteins




Johannes Gutenberg Universitaet Mainz

[Image: Evolution of protein-coding gene length distributions across evolutionary time. Credit: ill./©: Fernando J. Ballesteros]




An international collaboration of four senior scientists from Mainz, Valencia, Madrid, and Zurich has published groundbreaking research in the journal PNAS, shedding light on the most significant increase in complexity in the history of life's evolution on Earth: the origin of the eukaryotic cell. While the endosymbiotic theory is widely accepted, the billions of years that have passed since the fusion of an archaeon and a bacterium have left no evolutionary intermediates in the phylogenetic tree before the emergence of the eukaryotic cell. This gap in our knowledge has been called the black hole at the heart of biology. "The new study is a blend of theoretical and observational approaches that lets us understand quantitatively how the genetic architecture of life was transformed to allow such an increase in complexity", stated Dr. Enrique M. Muro, representative of Johannes Gutenberg University Mainz (JGU) in this project.

Proteins and protein-coding genes increase in length

The article in PNAS demonstrates that the distributions of protein lengths and of their corresponding genes follow log-normal distributions across the whole tree of life. To show this, 9,913 proteomes and 33,627 genomes were analyzed. Log-normal distributions typically arise from multiplicative processes, so, following Ockham's razor, the researchers modeled the evolution of gene length distributions as multiplicative stochastic processes; in effect, they modeled the combined action of all genetic operators on sequence length. Starting from LUCA, i.e., the hypothesized last universal common ancestor from which the three domains of life – the Bacteria, the Archaea, and the Eukarya – originated, the researchers found both theoretically and observationally that average gene lengths have grown exponentially over evolutionary time across different species. Furthermore, they discovered a scale-invariant mechanism of gene growth across the entire tree of life, in which the variance depends directly on the mean protein length. By representing all the species captured in the 33,627 genomes, the team was able to verify these predictions observationally and, moreover, show that the average gene length is a very good surrogate for organismal complexity. In a pure exercise of quantitative biology, Dr. Bartolo Luque from the Polytechnic University of Madrid added: "From knowing the average length of protein-coding genes in a species, we can calculate the whole distribution of gene lengths within that species."
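
To make the mechanism concrete, here is a minimal toy simulation, our own illustration rather than the authors' code: repeated multiplicative changes to gene length produce a log-normal length distribution whose mean grows exponentially over the generations, and once the variance is pinned to the mean by the reported scaling, the whole distribution follows from the mean alone.

```python
# Toy model (not the authors' code): multiplicative stochastic growth of
# gene lengths yields a log-normal distribution with an exponentially
# growing mean.
import numpy as np

rng = np.random.default_rng(42)
n_genes, n_generations = 10_000, 500

lengths = np.full(n_genes, 1_000.0)  # ancestral gene lengths, in nucleotides
for _ in range(n_generations):
    # every generation multiplies each length by a random factor near 1
    lengths *= rng.lognormal(mean=0.0005, sigma=0.01, size=n_genes)

# The sum of many small log-factors makes log(length) approximately normal,
# i.e., the lengths themselves are log-normally distributed.
log_l = np.log(lengths)
skew = ((log_l - log_l.mean()) ** 3).mean() / log_l.std() ** 3
print(f"mean gene length after {n_generations} steps: {lengths.mean():.0f} nt")
print(f"skew of log-lengths (near 0 for a log-normal): {skew:.3f}")
```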

When the evolution of average protein length is plotted against the corresponding average gene length across species, the two are seen to evolve in lockstep in prokaryotes, because prokaryotic genes contain almost no non-coding sequence. However, once the average gene length reaches 1,500 nucleotides, proteins decouple from the multiplicative process of gene growth: the average protein length stabilizes at a clear threshold of about 500 amino acids, marking the appearance of the eukaryotic cell. From that point onward, and unlike proteins, the average gene length continues to increase as it did in prokaryotes, owing to the presence of non-coding sequences.
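
The arithmetic behind the two thresholds is direct, and a toy calculation (our simplest reading of the result, not code from the paper) makes the decoupling explicit: 1,500 coding nucleotides at three nucleotides per codon encode 500 amino acids, and any gene growth beyond that point is absorbed by non-coding sequence.

```python
# Toy illustration of the reported decoupling (not from the paper):
# below the ~1,500 nt threshold a gene is essentially all coding, so
# protein length tracks gene length at one amino acid per 3-nt codon;
# beyond it, added length is non-coding and protein length plateaus.
CRITICAL_NT = 1_500
CODON_NT = 3

def mean_protein_length(gene_length_nt: float) -> float:
    coding_nt = min(gene_length_nt, CRITICAL_NT)
    return coding_nt / CODON_NT

for gene_len in (300, 900, 1_500, 5_000, 50_000):
    print(f"{gene_len:>6} nt gene -> ~{mean_protein_length(gene_len):.0f} aa protein")
```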

Algorithmic phase transition

An analysis of critical phenomena then concluded that a phase transition, of the kind well studied in the physics of magnetic materials, occurred at the critical gene length of 1,500 nucleotides. This transition marks eukaryogenesis and divides the evolution of life into two distinct phases: a coding phase (Prokarya) and a non-coding phase (Eukarya). Characteristic phenomena of such transitions are also observed, such as critical slowing down, in which the system's dynamics become trapped in many metastable states around the critical point. "This is corroborated in early protists and fungi", said Dr. Fernando Ballesteros from the University of Valencia.

Moreover, "the phase transition was algorithmic," added Professor Jordi Bascompte from the University of Zurich. In the coding phase, in a scenario close to LUCA, with short proteins, increasing the length of proteins and their corresponding genes was computationally simple. However, as the protein lengths grew, the search for longer proteins became unfeasible. This tension caused by genes that grew at the same rate as before while proteins could not was resolved continuously but abruptly with the incorporation of non-coding sequences into the genes. With this innovation, the algorithm for searching for new proteins rapidly reduced its computational complexity, becoming non-linear through the spliceosome and the nucleus, which separated transcription and splicing from translation. This happened at the critical point of phase transition, which this study dates to 2.6 billion years ago.

The study recently published in PNAS not only answers essential questions but is also interdisciplinary, combining computational biology, evolutionary biology, and physics. "It has the potential to interest a wide audience across many disciplines and to serve as a foundation for other groups to explore different research avenues, such as energy or information theory", emphasized Dr. Enrique Muro of the Institute of Organismic and Molecular Evolution at Mainz University. The eukaryotic cell, the most significant increase in complexity in the history of life's evolution on Earth, emerged as a phase transition and unlocked the path toward other major transitions – such as multicellularity, sexuality, and sociability – that shaped life on our planet as we know it today.

 

[Photo: Dr. Enrique Muro of Mainz University showing one of the pieces of evidence of the phase transition using the research team's work repository on GitHub. Credit: photo/©: Juan Mac Donagh]


 

Nutrients strengthen link between precipitation and plant growth, study finds



International team of researchers analysed 71 grasslands on six continents.



German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig

[Image: NutNet site at the Jena Experiment in Germany. Credit: Christiane Roscher]





A new study published in PNAS, led by the United States Department of Agriculture and involving several researchers from the German Centre for Integrative Biodiversity Research (iDiv), the Helmholtz Centre for Environmental Research (UFZ), Martin Luther University Halle-Wittenberg (MLU) and Leipzig University, investigated how the relationship between mean annual precipitation (MAP) and grassland biomass changes when one or more nutrients are added. The authors show that precipitation and nutrient availability are the key drivers of plant biomass, while the effects of plant diversity are minimal.

Global changes in precipitation are putting ecosystems around the world to the test: while MAP is increasing in some regions, it is decreasing in others. At the same time, many ecosystems are being exposed – both intentionally and unintentionally – to increasing amounts of nutrients such as nitrogen, phosphorus and potassium. This may result from agricultural practices like land use and fertilisation, or from urban runoff, including wastewater, entering nearby waterways. Both factors – precipitation and nutrients – directly affect plant biomass production.

The study found that plant biomass is closely related to mean annual precipitation in grasslands. On average, plant biomass increases with higher MAP, but other factors come into play as well, such as nutrient availability.

Until now, little has been known about how the relationship between MAP and plant biomass changes when one or more nutrients are added.

To answer this question, the researchers measured aboveground plant biomass and species diversity in 71 grasslands on six continents, including native and planted grasslands with varying soil textures and nutrient contents, as well as management regimes. All sites were part of the Nutrient Network, including the NutNet sites at the Jena Experiment and at the UFZ Research Station in Bad Lauchstädt, Germany. This experimental network follows a standardised methodology applied across all 130 sites globally, ensuring comparability of data.

Multiple nutrients, more biomass

To look at the effect of nutrients in combination with MAP, the researchers fertilised all sites with nitrogen, phosphorus and potassium – in all possible combinations. Fertilisation generally boosted plant growth and biomass. Notably, when multiple nutrients – particularly nitrogen and phosphorus – were added, plants became more responsive to rainfall, which led to a stronger relationship between biomass and precipitation.
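
A hedged sketch of the kind of analysis such a full-factorial design supports (our illustration, not the authors' code; the column names and input file are hypothetical): a linear model whose precipitation-by-nitrogen-by-phosphorus interaction tests whether adding those nutrients together steepens the biomass-precipitation slope.

```python
# Illustrative sketch (not the authors' code): testing whether nutrient
# addition changes the slope of the biomass-precipitation relationship.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nutnet_plots.csv")  # hypothetical per-plot data:
                                      # log_biomass, map_mm, N/P/K coded 0/1

# map_mm * N * P expands to all main effects and interactions; a positive
# map_mm:N:P coefficient would mean that adding nitrogen and phosphorus
# together strengthens the biomass-MAP relationship, as the study reports.
model = smf.ols("log_biomass ~ map_mm * N * P + K", data=df).fit()
print(model.summary())
```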

While this outcome did not come as a surprise, the study also revealed something unexpected: factors like species diversity had little influence on the biomass-MAP relationship within the plant community. However, in sites that were not limited by nitrogen or phosphorus, the link between precipitation and biomass became clearer – a pattern overlooked in earlier analyses that did not account for biodiversity. “Adding nutrients decreases plant diversity, but the indirect effect of diversity change on biomass is weaker than the direct effects of MAP and nutrients,” says co-author Stan Harpole, head of Physiological Diversity at UFZ, iDiv and MLU. “Even though the effects on diversity don’t carry over strongly to biomass, considering plant diversity is still important for accurately understanding how rainfall influences the biomass in ecosystems where nutrient availability (like nitrogen and phosphorus) is not a limiting factor.”

The results highlight that factors like precipitation and nutrient availability have a stronger influence on grassland biomass than plant diversity does.

Nutrient interactions shape the MAP-biomass relationship

The new findings support the idea that ecosystems are often co-limited by multiple nutrients. Accordingly, fertilisation with more than one nutrient can strengthen the relationship between biomass and precipitation – with interactions between nitrogen and phosphorus playing a particularly important role. To fully understand how grassland ecosystems respond to global changes in precipitation and nutrient enrichment, both nutrient interactions and changes in plant communities should be considered. Understanding nutrient limitations and their impact on biomass production can help inform land management and conservation strategies for grassland ecosystems worldwide.

 

Using AI to predict athletes' risk of lower-extremity injury after concussion



Machine learning model analyzes more than 100 variables, including sports and medical histories



University of Delaware

[Image: University of Delaware field hockey player Izzy Rothwell, a junior majoring in neuroscience, demonstrates baseline concussion testing with Kinesiology and Applied Physiology Professor Thomas Buckley in the Concussion Research Laboratory. Credit: Ashley Barnas Larrimore/University of Delaware]




Athletes who suffer a concussion have a serious risk of reinjury after returning to play, but identifying which athletes are most vulnerable has long been a mystery – until now.

Using artificial intelligence (AI), University of Delaware researchers have developed a novel machine learning model that predicts an athlete’s risk of lower-extremity musculoskeletal (MSK) injury after concussion with 95% accuracy. A recent study published in Sports Medicine details the development of the AI model, which builds on previously published research showing that the risk of post-concussion injury doubles, regardless of the sport. The most common post-concussive injuries include sprains, strains, or even broken bones or torn ACLs.

“This is due to brain changes we see post-concussion,” said Thomas Buckley, professor of kinesiology and applied physiology at the College of Health Sciences.

These brain changes affect athletes’ balance, cognition, and reaction times and can be difficult to detect in standard clinical testing. 

“Even a minuscule difference in balance, reaction time, or cognitive processing of what’s happening around you can make the difference between getting hurt and not,” Buckley said.

How AI is changing injury risk assessment

Recognizing the need for better injury risk reduction tools, Buckley collaborated with several colleagues: in UD’s College of Engineering, Austin Brockmeier, assistant professor of electrical and computer engineering, and César Claros, a fourth-year doctoral student; Wei Qian, associate professor of statistics in the College of Agriculture and Natural Resources; and former KAAP postdoctoral fellow Melissa Anderson, who is now an assistant professor at Ohio University. To assess injury risk, Brockmeier and Claros developed a comprehensive AI model that analyzes more than 100 variables, including sports and medical histories, concussion type, and pre- and post-concussion cognitive data.

“Every athlete is unique, especially across various sports,” said Brockmeier. “Tracking an athlete’s performance over time, rather than relying on absolute values, helps identify disturbances, deviations, or deficits that, when compared to their baseline, may signal an increased risk of injury.” 
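
A minimal sketch of that general approach, built on entirely synthetic data since the study's actual features, labels, and model are not described in this release: per-athlete deviations from baseline feed a standard classifier.

```python
# Illustrative sketch only (synthetic data, not the UD model): deviations
# from each athlete's own baseline, rather than absolute values, are used
# as features for an injury-risk classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_athletes, n_features = 200, 100  # ~100 variables, as in the release

baseline = rng.normal(size=(n_athletes, n_features))  # pre-season testing
post = baseline + rng.normal(scale=0.3, size=(n_athletes, n_features))

deltas = post - baseline                       # change relative to baseline
injured = rng.integers(0, 2, size=n_athletes)  # placeholder outcome labels

clf = GradientBoostingClassifier()
print(cross_val_score(clf, deltas, injured, cv=5).mean())
```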

While some sports, such as football, carry higher injury risk, the model revealed that individual factors are just as important as the sport played.

“We tested a version of the model that doesn’t have access to the athlete’s sport, and it still accurately predicted injury risk,” Brockmeier said. “This highlights how unique characteristics, not just the inherent risks of a sport, play a critical role in determining the likelihood of future injury.”

The research, which tracked athletes over two years, also found that the risk of MSK injury post-concussion extends well into the athlete’s return to play. 

“Common sense would suggest that injuries would occur early in an athlete’s return to play, but that’s simply not true,” said Buckley. “Our research shows that the risk of future injury increases over time as athletes compensate and adapt to small deficits they may not even be aware of.” 

From research to real-world injury reduction

The next step for Buckley’s Concussion Research Lab is to collaborate further with UD Athletics’ strength and conditioning staff to design real-time interventions that could reduce injury risk.

Dan Watson, deputy athletic director of competitive excellence and campus recreation, said the AI model can help them target high-risk athletes and incorporate strategies to reduce injury risk. 

“In sport performance, we have two goals: to improve the athlete's abilities in their sport and to keep them on the field,” said Watson.

UD Athletics already uses force plates to analyze movement and detect muscle imbalances or weaknesses, the leading cause of soft tissue injuries. Watson says that the same proactive approach applies to concussion-related deficits. 

“We’re very open to anything that keeps our athletes healthy and on the field,” said Watson. “When this predictive learning model identifies a deficit, we can proactively implement corrective measures. We can’t prevent injuries, but we can reduce and mitigate the risks, and that’s what this model does for athletics.”

Beyond sports: AI’s potential in aging research

The implications of the UD-developed machine-learning model extend far beyond sports. Brockmeier believes the algorithm could be used to predict fall risk in patients with Parkinson’s disease. 

Claros is also exploring how the injury risk reduction model can be applied to aging research with the Delaware Center for Cognitive Aging. 

“We want to use brain measurements to investigate whether baseline lifestyle measurements such as weight, BMI, and smoking history are predictive of future mild cognitive impairment or Alzheimer’s disease,” said Claros.

 

 

Bluetooth-based technology could help older adults stay independent



A new “indoor positioning system” works better than GPS for tracking precise indoor locations and could be a boon to healthcare and other fields



PLOS

[Image: The McMaster Smart Home for Aging-in-Place (SHAPE) facility, a unique research facility converted from a 100-year-old house in a residential community; it was used in this project to validate the indoor positioning system for home use. Credit: Qiyin Fang and Guha Ganesh, CC-BY 4.0 (https://creativecommons.org/licenses/by/4.0/)]





A new Bluetooth-based positioning system could offer healthcare systems a low-energy, low-cost method of tracking older adults’ mobility, according to a new study published this week in the open-access journal PLOS Digital Health by Qiyin Fang of McMaster University, Canada, and colleagues.

Global Positioning System (GPS) is the dominant positioning technology today, but its use indoors is limited due to the difficulty in communicating with GPS satellites. However, the knowledge of a person’s position is critical for many real-time healthcare applications, including monitoring older adults in both their homes and long-term care facilities.

In the new study, researchers designed, developed and validated a Bluetooth Low Energy (BLE)-based Indoor Positioning System (IPS) that measures and tracks specific users’ locations continuously throughout the day. The IPS relies on small, wireless beacons plugged into wall outlets and on wearable Bluetooth tags. Importantly, it does not require professional setup or detailed floorplans, and it costs about $200 for a typical setup involving five beacons.

To test the BLE-based IPS, scientists placed the system in two different suburban homes. The system was able to correctly identify a person’s location 96% of the time when it combined Bluetooth signals with motion sensors. The research team suggested that the technology could be used to monitor people’s locations as well as for tracking of medical equipment in dynamic environments such as hospitals.
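
As a rough illustration of how room-level positioning from beacons can work (a sketch under our own assumptions, not the paper's algorithm, which also fuses motion-sensor data), live received-signal-strength (RSSI) readings can be matched against fingerprints recorded during a short calibration pass through the home:

```python
# Hypothetical sketch of RSSI fingerprinting for room-level positioning;
# the beacon count, rooms, and dBm values are illustrative only.
import numpy as np

# Calibration fingerprints: mean RSSI (dBm) from five wall-plug beacons,
# recorded while the wearable tag sits in each room.
fingerprints = {
    "kitchen":  np.array([-45, -70, -80, -75, -90]),
    "bedroom":  np.array([-85, -50, -72, -88, -60]),
    "bathroom": np.array([-78, -64, -48, -70, -82]),
}

def locate(reading: np.ndarray) -> str:
    """Return the room whose fingerprint is closest to the live reading."""
    return min(fingerprints,
               key=lambda room: np.linalg.norm(fingerprints[room] - reading))

print(locate(np.array([-47, -72, -79, -77, -88])))  # -> kitchen
```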

“The potential for real-time, context-aware healthcare monitoring can significantly improve patient safety and caregiver efficiency, driving further research in this area,” the authors say.

The authors add: “The McMaster team developed an indoor positioning system specifically for tracking older adults' mobility in their own homes. The technology was designed for low-cost implementation and operation, which is critical for use in existing homes.”

 

####

 

In your coverage please use this URL to provide access to the freely available article in PLOS Digital Health: https://plos.io/4cru1Tl

Citation: Wang H, Ganesh G, Zon M, Ghosh O, Siu H, Fang Q (2025) A BLE based turnkey indoor positioning system for mobility assessment in aging-in-place settings. PLOS Digit Health 4(4): e0000774. https://doi.org/10.1371/journal.pdig.0000774

Author countries: Canada

Funding: This work was supported in part by funding from AGE-WELL NCE (Networks of Centres of Excellence) and McMaster Institute of Research in Aging (MIRA) (AW-MIRA-SIPA-Apr19-055 to QF and HS; AW-MIRA-HQP2022-02 to HW); National Research Council (NRC) of Canada (AiP426-1 to QF), Natural Science and Engineering Research Council (NSERC) of Canada (RGPIN-2019-07127 to QF), and Canadian Foundation of Innovation (CFI)/Ontario Research Fund-Research Infrastructure (ORF-RI) (to QF). HW and GG were supported by the NSERC sMAP CREATE program (to QF). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

 

Breaking the American climate silence


Worry, social norms, exposure to media coverage may predict whether Americans discuss climate change




PLOS

[Image: A climate conversation. Credit: Photo by Aarón Blanco Tejedor on Unsplash; free to use under the Unsplash License]





Americans are more likely to discuss climate change with family and friends if they feel worried or at risk, perceive society as supportive of pro-climate behaviors or see global warming depicted in the media, according to a study published April 17, 2025 in the open-access journal PLOS Climate by Margaret Orr from George Mason University College of Humanities and Social Sciences in Fairfax, Virginia, and colleagues.

Curbing global warming’s effects requires systemic changes and government policies to reduce carbon emissions. Among individual climate actions (e.g., avoiding meat and limiting air travel), discussing climate change with others can foster wider collective support for climate policies. However, only one-third of Americans report discussing climate change with family and friends even occasionally, a trend dubbed “climate silence.”

To learn more about what prompts climate-focused discussions, Orr and her colleagues analyzed a nationally representative, cross-sectional survey of 3,079 American adults conducted between 2020 and 2021. The survey included questions like “How worried are you about global warming?” and “About how often do you hear about global warming in the media?”, and all participants indicated how often they discuss global warming with family and friends.

Results identified four factors that may predict climate change discussions: worry, perceived risk, perceived social norms and exposure to media coverage about global warming. Perceived social norms — perceptions about how members of society think, feel and behave — emerged as the strongest predictor.
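
A hedged sketch of this kind of analysis, with hypothetical variable names and data (the authors' actual modeling may differ): regress discussion frequency on the four predictors after standardizing, so coefficient magnitudes can be compared and the strongest predictor identified.

```python
# Illustrative sketch (not the authors' code): a linear model of discussion
# frequency on the four predictors; column names and the file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("climate_survey.csv")  # hypothetical survey export
z = (df - df.mean()) / df.std()         # standardize for comparable betas

model = smf.ols(
    "discussion_freq ~ worry + perceived_risk + social_norms + media_exposure",
    data=z,
).fit()
# The largest standardized coefficient marks the strongest predictor
# (perceived social norms, per the study).
print(model.params.drop("Intercept").sort_values(ascending=False))
```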

With these findings, the researchers aim to support climate conversations through several methods, including using news, entertainment and social media to popularize climate action; highlighting risk factors or causes of worry (and pairing those issues with solutions to prevent mental distress); and encouraging climate journalism.

The researchers note that this study measured the frequency of climate conversations, not their content; future research might explore how and why content may differ. Additionally, this study focused on familial and friendship relationships among American adults; future studies might expand into other relationship types — like neighbors or coworkers — and into vulnerable communities.

 

The authors add: “We found that the more worried people are about climate change, the more they hear about it in the media, and the more they perceive themselves to be at risk from the impacts of climate change, they are in turn more likely to talk about climate change with others.”

####

In your coverage please use this URL to provide access to the freely available article in PLOS Climate: https://plos.io/428RLrV

Citation: Orr M, Borth A, Kotcher J, Campbell E, Myers T, Maibach E, et al. (2025) Breaking the climate silence: Predictors of discussing global warming with family and friends. PLOS Clim 4(4): e0000538. https://doi.org/10.1371/journal.pclm.0000538

Author countries: United States

Funding: This research is part of the Climate Change in the American Mind project jointly conducted by George Mason University and Yale University. This research was funded by the Schmidt Family Foundation (AL), the U.S. Energy Foundation (AL, EM), the MacArthur Foundation (AL), and the Grantham Foundation (AL). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of this manuscript. AL & SR received salary support from the above funders.