Saturday, April 20, 2024

SPACE

Mapping plant functional diversity from space: HKU ecologists revolutionize ecosystem monitoring with novel field-satellite integration



THE UNIVERSITY OF HONG KONG

IMAGE: High-resolution satellite images captured multispectral data recording the reflections of light from plant leaves. These data are not only of great research importance, providing valuable insights into the physical and biochemical properties of vegetation, but also showcase stunning patterns.

CREDIT: Images adapted from Remote Sensing of Environment, 2024, doi.org/10.1016/j.rse.2024.114082.




An international team of researchers, led by Professor Jin WU from the School of Biological Sciences at The University of Hong Kong (HKU), has made a promising advance in mapping plant functional traits from space using time-series satellite data. The study, published in Remote Sensing of Environment, showcases an innovative use of the Sentinel-2 satellite mission and its dense time-series coverage. The approach not only unlocks a deeper understanding of essential foliar traits, providing crucial insights into the functional diversity and functioning of terrestrial ecosystems, but also equips researchers with powerful tools to address pressing environmental challenges.

Leveraging the Satellites for In-depth Observations
Plant traits are vital in regulating key ecosystem processes such as carbon sequestration, air temperature regulation, and large-scale hydrological regulation. They also determine how ecosystems respond to various environmental stressors, ultimately shaping their health, resilience, and vulnerability to climate change. However, large-scale mapping of these traits has been challenging due to limitations in existing methodologies, including the difficulty of capturing traits across vast areas, limited data availability, trait complexity, and constraints of measurement techniques.

To overcome these challenges, Professor Wu’s team harnessed the power of satellite technology and introduced a pioneering approach that combines vegetation spectroscopy and phenology. Their approach utilised high-resolution imagery from the Sentinel-2 satellite, which captures multispectral data at roughly weekly intervals with a 10-metre resolution. By analysing these satellite images, the team observed and recorded the reflections of light from plant leaves, providing valuable insights into the physical and biochemical properties of the vegetation. These observations were then compared with the timing of plant life cycle events, known as phenology. By integrating satellite imagery with phenological observations, the team obtained comprehensive information about plant functional traits across many trait dimensions. This integration holds great potential for extension to other dimensions of plant characteristics, such as plant health, functioning, and resilience.
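The core idea of combining spectral and temporal information can be sketched schematically. Everything below is synthetic and hypothetical, standing in for the study's actual Sentinel-2 data and regression model: a year of multi-band reflectance per field plot is flattened into one feature vector, so a model can exploit both spectral and phenological variation when predicting a foliar trait.

```python
import numpy as np

# Hypothetical toy data: for each field plot, a year of weekly multi-band
# reflectance is flattened into one feature vector, so the model can use
# both spectral and phenological (time-series) variation.
rng = np.random.default_rng(42)
n_plots, n_weeks, n_bands = 300, 52, 4
reflectance = rng.random((n_plots, n_weeks, n_bands))
X = reflectance.reshape(n_plots, n_weeks * n_bands)  # spectra-phenology features

# Synthetic "foliar trait" (e.g. leaf mass per area), for demonstration only.
true_w = rng.standard_normal(n_weeks * n_bands)
y = X @ true_w

# Ordinary least squares as a simple stand-in for the study's regression model.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(float(np.corrcoef(y, X @ w)[0, 1]))  # ~1.0 on this noiseless toy data
```

In practice the trait values come from field campaigns (here, NEON sites), the features from real Sentinel-2 time series, and the regression model is more sophisticated, but the flattened "spectra-phenology" feature design is the essential ingredient.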

This method underwent thorough and rigorous testing to evaluate its efficacy, applicability across different scales, and potential for high-throughput monitoring. The test utilised benchmark data of 12 foliar traits collected from 14 geographically distant sites within the National Ecological Observatory Network (NEON) in the eastern United States.

Shuwen LIU, the first author and a PhD candidate from Professor Wu’s lab, stated: "Our approach effectively captures the diversity of plant traits at fine spatial scales while maintaining accuracy over large areas." Liu further explained that their method overcomes the limitations of other methods that rely solely on plant functional types or single image acquisitions.

The proposed approach outperformed traditional methods that rely on environmental variables or single Sentinel-2 images as predictors, without needing environmental variables to boost its predictive power. This finding underscores the significance of phenological information in trait prediction and suggests that the ‘leaf economics spectrum’ theory may be the underlying mechanism driving the technique's success. Given the model's proven effectiveness across 14 diverse ecosystem sites in the United States, it shows great promise for expansion to national and global scales, enabling the monitoring of plant functional traits from the ecosystem to the regional and national level.

Reflecting on the future potential of this research, Professor Wu said: "Future studies will focus on broader validation to fully exploit this technology’s potential in frontier basic science, such as understanding terrestrial ecosystems’ sensitivity response to climate change and identifying their respective tipping points. Additionally, there is great potential for applied science, particularly in exploring nature-based climate solutions."

About the research team
The Global Ecology and Remote Sensing (GEARS) lab at HKU aims to uncover the fundamental mechanisms that regulate vegetation-climate interactions across various scales, ranging from leaves to the global level. It employs a diverse range of tools, including cutting-edge geospatial techniques, field observations, eco-evolutionary and ecophysiological theories, earth system models, and high-performance computing. Its research goals are twofold: firstly, to advance fundamental science by exploring the mechanisms that link climate, species (functional) composition, and ecosystem processes, and secondly, to bridge the gap between scientific and technological advancements in order to address pressing environmental issues related to climate change, such as forest health monitoring, food security, climate change impact assessments, and nature-based climate change mitigation. About GEARS: https://wu-jin.weebly.com/

About Professor Jin Wu
Jin Wu is an Assistant Professor at HKU School of Biological Sciences and a recipient of the NSFC-Excellent Young Scholar (Hong Kong & Macau) award in 2019. Prior to this, he held a Goldhaber Distinguished Fellow position at Brookhaven National Laboratory and earned his PhD from the University of Arizona. With a wide range of interests in biodiversity, conservation, global change, and sustainability sciences, he utilises an integrated approach (combining remote sensing, AI, and domain knowledge) to study these topics and aims to enhance how people experience, understand, and appreciate our living habitats and inspire actions to sustain our natural ecosystems. He has published over 100 peer-reviewed papers, including in prestigious journals such as Science, Nature, Global Change Biology, and Remote Sensing of Environment. Currently, he serves as an Associate Editor for Remote Sensing in Ecology and Conservation.

Link to the paper and key figure:
The journal paper, entitled ‘Spectra-phenology integration for high-resolution, accurate, and scalable mapping of foliar functional traits using time-series Sentinel-2 data’, can be found at the following link: https://doi.org/10.1016/j.rse.2024.114082

For media enquiries, please contact Ms Casey To, External Relations Officer (tel: 3917 4948; email: caseyto@hku.hk), or Ms Cindy Chan, Assistant Director of Communications of HKU Faculty of Science (tel: 3917 5286; email: cindycst@hku.hk).


Land cover (a) and functional trait maps produced from satellite images. The team used four traits - LMA (b), nitrogen (c), potassium (d) and chlorophyll a+b (e) - as examples for demonstration.

CREDIT: Figures adapted from Remote Sensing of Environment, 2024, doi.org/10.1016/j.rse.2024.114082.


Technical Trials for Easing the (Cosmological) Tension



A new study sorts through models attempting to solve one of the major challenges of contemporary cosmology: the measurement of the universe's expansion


SISSA MEDIALAB

The CMB at different resolutions

IMAGE: Comparison between the resolution of CMB data collected by Planck and by SPT-3G.

CREDIT: The South Pole Telescope: https://pole.uchicago.edu/public/home.html




Thanks to the dizzying growth of cosmic observations and measurement tools, and to some new advances (primarily the “discovery” of what we call dark matter and dark energy), all against the backdrop of General Relativity, the early 2000s were a time when nothing seemed capable of halting the advance of our knowledge about the cosmos, its origins, and its future evolution.
Even though we were aware there was still much to uncover, the apparent agreement between observations, calculations, and theory suggested that our knowledge of the universe was set to grow significantly and without interruption.

However, with increasingly sophisticated observations and calculations, an apparently small “glitch” in our understanding of the Universe emerged and proved capable of jamming these seemingly well-oiled gears. At first it was thought it could be resolved with even more precise calculations and measurements, but that was not the case. The "cosmological tension" (or Hubble tension) is a discrepancy between the two ways in which we calculate the so-called Hubble parameter, H0, which describes the universe's expansion rate.

The Hubble parameter can be calculated following two paths: 

  • Astrophysical observations of “local” celestial bodies, i.e., those not very far from us: by measuring the speed at which bodies at different distances are moving away, the expansion rate and H0 are obtained by comparing speeds and distances.
  • Calculations based on data from the cosmic microwave background (CMB), a faint and extremely distant radiation dating back to the very early Universe. The information gathered at that distance allows us to calculate the Universe's expansion rate and hence the Hubble parameter.
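The first, "local" route can be sketched numerically: under Hubble's law, v = H0 × d, the Hubble parameter is the slope of a velocity-distance relation. A toy example with made-up data points (not real measurements):

```python
import numpy as np

# Illustrative sketch of the "local" route (hypothetical data points):
# Hubble's law v = H0 * d links recession velocity to distance, so H0
# is the slope of a velocity-distance fit.
distances = np.array([20.0, 50.0, 90.0, 140.0])           # Mpc (hypothetical)
velocities = np.array([1450.0, 3600.0, 6500.0, 10200.0])  # km/s (hypothetical)

# Least-squares slope through the origin: H0 = sum(v*d) / sum(d*d)
h0 = float(velocities @ distances / (distances @ distances))
print(round(h0, 1))  # km/s/Mpc; about 72.6 for these made-up points
```

Real analyses must first calibrate the distances themselves (the "distance ladder" of Cepheids and supernovae), which is where most of the observational effort goes.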

At first, these two routes provided not exactly equal, but very close and consistent values of H0, and the two methods seemed to be in good agreement. Bingo.

It was around 2013 that we realised the "numbers didn't add up". “The discrepancy that emerged might seem small, but since the error bars on both sides have become much smaller, the separation between the two measurements has become large”, explains Khalife, author of the new study. The initial two values of H0 were, in fact, not very precise, and since their “error bars” were large enough to overlap, there was hope that finer future measurements would finally coincide. “Then the Planck experiment came along, giving very small error bars compared to the previous experiments” while still maintaining the discrepancy, dashing hopes for an easy resolution.
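To see why shrinking error bars matter, the size of the tension is usually quoted in units of the combined uncertainty. A minimal sketch, using roughly representative (not exact) published values for the two routes:

```python
import math

# Roughly representative values (for illustration only): a local,
# SH0ES-like estimate and a CMB, Planck-like estimate of H0 in km/s/Mpc.
h0_local, sigma_local = 73.0, 1.0
h0_cmb, sigma_cmb = 67.4, 0.5

# A common way to quantify the tension: the gap in units of combined error.
tension = abs(h0_local - h0_cmb) / math.sqrt(sigma_local**2 + sigma_cmb**2)
print(round(tension, 1))  # about 5 sigma with these numbers
```

With the large early error bars the same gap was only a sigma or two, which is why the discrepancy initially looked like something more data would wash away.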

Planck was a satellite launched in 2009 to capture an image of the CMB in unprecedented detail. Its results, released a few years later, confirmed that the discrepancy was real, and what had been a moderate concern turned into a significant crisis. In short: the most recent and nearby parts of the universe we observe tell a different story, or rather seem to obey a different physics, than the oldest and most distant ones, a very unlikely possibility.

If it's not a problem of measurements then it could be a flaw in the theory, many thought. The current accepted theoretical model is called ΛCDM. ΛCDM is largely based on General Relativity - the most extraordinary, elegant, and repeatedly observationally confirmed theory about the universe formulated by Albert Einstein more than a century ago - and takes into account dark matter (interpreted as cold and slow-moving) and dark energy as a cosmological constant.

Over recent years, various alternative models or extensions of ΛCDM have been proposed, but so far none has proven convincing (or, in some cases, even practically testable) in significantly reducing the "tension". “It is important to test these various models, see what works and what can be excluded, so that we can narrow the path or find new directions to turn to”, explains Khalife. In their new paper, he and his colleagues, building on previous research, lined up 11 of these models, bringing some order to the theoretical jungle. The models were tested with analytical and statistical methods on different sets of data from both the near and the distant universe, including the most recent results from the SH0ES (Supernova H0 for the Equation of State) collaboration and from SPT-3G (the upgraded camera of the South Pole Telescope, which observes the CMB).

Three of the selected models that previous works had indicated as viable solutions were ultimately excluded by the new data considered in this research. Three other models, on the other hand, still seem capable of reducing the tension, but this does not solve the problem. “We found that those could reduce the tension in a statistically significant way, but only because they have very large error bars, and the predictions they make are too uncertain for the standards of cosmology research”, says Khalife. “There is a difference between solving and reducing: these models are reducing the tension from a statistical point of view, but they're not solving it”, meaning that none of them predicts a large value of H0 from CMB data alone. More generally, none of the models tested proved superior to the others studied in this work in reducing the tension.

“From our test we now know which models we should not look at to solve the tension,” concludes Khalife, “and we also know the models that we might be looking at in the future”. This work could serve as a basis for the models that will be developed in the future; by constraining them with increasingly precise data, we may move closer to a new model of our Universe.

 

Scientists discover new way to extract cosmological information from galaxy surveys




CHINESE ACADEMY OF SCIENCES HEADQUARTERS
IMAGE: An illustration of the main idea and results from Wang et al., Commun. Phys. 7, 130 (2024)

CREDIT: NAOC




Scientists at the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC) and their international collaborators have recently developed a new method for efficiently extracting information from galaxy surveys.

Their research results were published online in the latest issue of Communications Physics.

Massive galaxy redshift surveys are powerful tools for probing the Universe in this era of precision cosmology. By observing a great number of spectra from distant galaxies, astronomers are able to create density fields of galaxies at different epochs of the Universe. These density fields carry crucial information about the clustering of galaxies, which is quantified by two-point and N-point (N>2) correlation functions.
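For intuition, the two-point correlation function of a toy density field can be estimated in a few lines via its power spectrum. This is only a schematic illustration on synthetic data, not the estimators actually used on survey catalogues:

```python
import numpy as np

# Toy periodic 1D "density field" (hypothetical values, for illustration only).
rng = np.random.default_rng(1)
n = 256
delta = rng.standard_normal(n)
delta -= delta.mean()  # zero-mean overdensity field

# Two-point correlation function xi(r) = <delta(x) delta(x + r)>, estimated
# via the Wiener-Khinchin theorem: the autocorrelation is the inverse FFT
# of the power spectrum |FFT(delta)|^2.
power = np.abs(np.fft.fft(delta)) ** 2
xi = np.fft.ifft(power).real / n  # average over all positions x

# At zero separation, xi equals the variance of the field.
print(bool(np.isclose(xi[0], delta.var())))  # True
```

The N-point (N>2) functions generalise this to triplets and quadruplets of positions, which is exactly what makes them expensive to measure and model directly.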

“The information content in the N-point functions is highly complementary to that in the two-point functions,” said ZHAO Gongbo, lead author of the study and a researcher at NAOC. “The N-point functions play an important role in studies of the nature of dark energy, dark matter and gravity.”

However, it is difficult to make use of the N-point functions in practice due to various complexities, including the measurement and modeling of these quantities.

After working on this challenging task for a few years, ZHAO and his collaborators have developed a new method for extracting information in the N-point functions from the two-point functions.

This new method, based on a technique called density reconstruction, makes it possible to extract the primary information in the three-point and four-point functions through a joint analysis of the two-point functions measured from the pre- and post-reconstruction density fields.

“This opens a new window for using the high-order information in galaxy surveys in an efficient way,” said ZHAO, “and that’s important for the cosmological implications of forthcoming galaxy surveys, including the Dark Energy Spectroscopic Instrument (DESI), the Prime Focus Spectrograph (PFS) and the China Space Station Telescope (CSST).”

This work was funded by the Natural Science Foundation of China (NSFC), China’s Ministry of Science and Technology (MOST), and the Chinese Academy of Sciences (CAS).

 

Which countries are more at risk in the global supply chain?



Developing and poor nations are more vulnerable to supply chain disruptions than wealthy countries, according to a study out of the Complexity Science Hub



COMPLEXITY SCIENCE HUB





[Vienna, April 19 2024] — Using firm-level data from the global supply network, researchers from the Complexity Science Hub (CSH) quantified countries' exposure to production losses caused by firm defaults in other countries. According to their findings, wealthy nations are only exposed to supply chain disruptions from other high-income countries, while poor and developing nations are exposed to shocks from all countries.

“Our data comes from Standard & Poor's Capital IQ platform, which contains information on most of the world's largest and most important companies. Around 230,000 companies in 206 countries are represented in this data, which provides a good picture of the global supply chain network,” explains CSH scientist Tobias Reisch.

“Data on almost 1 million corporate relationships is included, detailing the flow of goods and services between countries,” adds Reisch, one of the lead authors of the study published in Nature Communications.

Simulation of economic shocks

The researchers wanted to know what would happen in the event of a supply chain disruption, whether a transportation infrastructure failure, like the collapse of the Baltimore Bridge, or a natural disaster, such as an earthquake in Taiwan. They simulated economic shocks (interruptions in the flow of goods and services) in the network and observed how they propagated.
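The idea of shock propagation can be caricatured in a few lines of code. The firms, input shares, and update rule below are entirely hypothetical stand-ins; the study's actual model, run on roughly 230,000 firms, is far more detailed:

```python
# A toy sketch of shock propagation (hypothetical firms and input shares,
# not the study's data or algorithm): a defaulted supplier's production
# loss cascades to its customers in proportion to the inputs they lose.

# supply[(s, c)] = share of customer c's inputs sourced from supplier s
supply = {
    ("A", "C"): 0.6, ("B", "C"): 0.4,  # C buys from A and B
    ("C", "D"): 1.0,                   # D buys everything from C
}

def propagate(defaulted, supply, rounds=10):
    """Iterate losses downstream; loss[f] = fraction of f's output lost."""
    loss = {f: 0.0 for pair in supply for f in pair}
    loss[defaulted] = 1.0
    for _ in range(rounds):
        for (s, c), share in supply.items():
            loss[c] = max(loss[c], share * loss[s])
    return loss

print(propagate("A", supply))  # A's default costs C 60% and, via C, D 60%
```

Summing such cascades over every possible firm default, country by country, is what yields the exposure values the researchers compare across income groups.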

“By studying how a complete interruption of a firm would spread across the global supply network, we discovered that high-income countries create significant exposures beyond their regions and thus export systemic risk,” says Stefan Thurner, senior author of the study and CSH president. In contrast, “low-income countries are disproportionately strongly affected by high exposure values”.

Not as expected 

“We initially thought the economic shocks would affect rich and industrialized countries more, since they are more involved in global value chains. However, this was not the case. They receive fewer economic shocks, but create more shocks,” highlights Reisch. “In some ways, these countries seem more diversified, or sit at different positions in the supply network. In fact, they expose other countries more than they are exposed.”

The study’s results also reveal that exposure to other nations is strongly structured at the regional level: companies within a country are most vulnerable to shocks originating within their own borders. "This indicates the typically strong embedding of firms within their local or national supply chains. The same applies to regions as well: African companies are closer to others located in Africa, and European firms have closer ties to those located on the Old Continent,” explains Reisch.

Structural inequality

The findings point to significant structural inequality in supply networks between countries, according to the CSH researchers. “Since exposure inequality arises from the structure of the global supply network at the firm level, it is important to understand the processes that lead firms from countries of different income levels to enter into production and trade relations, and how these relations could be formed while creating less risk exposure for poorer countries,” propose the authors.

“A possible strategy to make supply chains more resilient, fair and sustainable at the same time could be the introduction of a ‘systemic risk tax’ for international supply networks. There one could follow ideas we developed earlier for making financial markets more resilient. However, the situation for supply chains is more complicated, and more research will be needed to get the details of how such a taxing scheme could look right,” says Thurner.

The authors of the study urge a global effort to collect and monitor granular economic data far better than what was available for this study. Only with better data will researchers and policymakers be able to track the spread of supply chain threats around the world in a way that is actually usable and helpful for individual companies. This would allow firms and governments alike to anticipate and prepare for globally spreading supply shocks, according to the CSH researchers.


About the study

The study "Inequality in economic shock exposures across the global firm-level supply network," by Abhijit Chakraborty, Tobias Reisch, Christian Diem, Pablo Astudillo-Estévez, and Stefan Thurner has been published in Nature Communications (doi: https://doi.org/10.1038/s41467-024-46126-w)

About CSH

The Complexity Science Hub (CSH) is Europe’s research center for the study of complex systems. We derive meaning from data from a range of disciplines – economics, medicine, ecology, and the social sciences – as a basis for actionable solutions for a better world. Established in 2015, we have grown to over 70 researchers, driven by the increasing demand to gain a genuine understanding of the networks that underlie society, from healthcare to supply chains. Through our complexity science approaches linking physics, mathematics, and computational modeling with data and network science, we develop the capacity to address today's and future challenges.

 

Scientists trigger mini-earthquakes in the lab




UNIVERSITEIT VAN AMSTERDAM





Earthquakes and landslides are famously difficult to predict and prepare for. By studying a miniature version of the ground in the lab, scientists at the UvA Institute of Physics have demonstrated how these events can be triggered by a small external shock wave. Bring a flotation device: it involves the ground briefly turning into a liquid!

Unlike a true solid, the ground we stand on is generally made of granules such as sand grains or pieces of rock. Deeper down in Earth’s crust, the same holds for the fault lines where two tectonic plates meet. These types of disordered granular materials are never fully stable. And when they fail, it can have catastrophic effects for us, living on Earth’s surface.

The trouble is: it is not easy to predict or control when exactly the friction forces resisting a landslide or earthquake will stop being enough to keep the ground in place. Thankfully, the physics works exactly the same in smaller systems that can be studied in the lab. To reproduce an earthquake, physicists Kasra Farain and Daniel Bonn of the University of Amsterdam used a 1-mm-thick layer of tiny spheres, each the width of a human hair.

Their experimental setup allowed them to keep precise track of the granules’ response to external forces. To simulate the forces that would be present on a steep mountain slope or at a tectonic fault, they pressed a disc on the surface and slowly rotated it with a constant speed. By subsequently bouncing a ball next to the experimental setup, triggering a small seismic wave, they saw how all the granules rapidly shifted in response: they had triggered a miniature earthquake!

“We found that a very small perturbation, a small seismic wave, is capable of causing a granular material to completely restructure itself,” explains Farain. Further examination revealed that for a brief moment, the granules behave like a liquid rather than a solid. After the triggering wave has passed, friction takes over once more and the granules get jammed again, in a new configuration.

The same happens in real seismic events. “Earthquakes and tectonic phenomena follow scale-invariant laws, so findings from our laboratory-scale frictional setup are relevant for understanding remote earthquake triggering by seismic waves in much larger-scale faults in the Earth’s crust,” says Farain.

The researchers show that the mathematical model they deduced from their experiments quantitatively explains how the 1992 Landers earthquake in Southern California remotely triggered a second seismic event, 415 km to the north. In addition, they show that their model accurately describes the rise in fluid pressure observed in the Nankai subduction zone near Japan after a series of small earthquakes in 2003.

Inspired by a shaky table
Interestingly, this entire research project might not have come to fruition if it weren’t for Farain’s colleagues: “Initially, my experimental setup was just on a regular table, lacking all the fancy vibration isolation needed for precise measurements. Soon enough, I realised that simple things like someone walking by or the door closing could affect the experiment. I must have been a bit of a bother to my colleagues, always asking for quieter footsteps or gentler door closures.”

Inspired by how his colleagues’ movements disrupted his setup, Farain began to investigate the physics at work: “After some time, I upgraded to a proper optical table for the setup, and people could jump, or do whatever they wanted without disrupting my work. But, true to my troublemaking tendencies, that wasn't the end of it. A little while later, I returned to the lab with a loudspeaker to generate noise and see the effects of controlled perturbations!”

 

Some plant-based steaks and cold cuts are lacking in protein


JUST LIKE THE REAL THING!


Peer-Reviewed Publication

AMERICAN CHEMICAL SOCIETY

IMAGE: Meat products (top left, veal; bottom left, bresaola) tend to contain more proteins and amino acids than their plant-based alternatives (right).

CREDIT: Adapted from Journal of Agricultural and Food Chemistry 2024, DOI: 10.1021/acs.jafc.3c08956





Many plant-based meats have seemingly done the impossible by recreating animal products ranging from beef to seafood. But beyond just the taste and texture, how do these products compare to the real thing in nutritional value? A small-scale study published in ACS’ Journal of Agricultural and Food Chemistry shows that while some “plant steaks” and “plant cold cuts” might be comparable to meats on some fronts, their amino acid content and protein digestibility fall short.

Meat-free burgers or ground beef mimics might come to mind first, but the options for plant-based alternatives have expanded to include whole cuts of meat resembling steaks and chicken breasts, as well as sliced cold cuts like salami or bresaola — a type of cured beef. While these newer products haven’t been studied as extensively as burger-style products, they are becoming more widespread and popular among consumers. As a result, it’s important to understand how they differ nutritionally from the meats they aim to replicate and replace. In other words, how well do our bodies digest and gain nutrition from these foods? Tullia Tedeschi and colleagues wanted to answer that question by comparing the protein quality, integrity and digestibility of a set of plant-based steaks and cold cuts to their meat counterparts.

The team, based in Italy, collected three different plant-based steaks and three different plant-based cold cuts. Veal steaks were used as a comparison point for the plant steaks, whereas ham and beef cold cuts were compared to their respective plant-based substitutes. The fat, salt and protein content of each was measured, then the samples underwent a simulated digestion in the lab to understand how well the proteins break down in a human’s digestive tract.

  • The plant-based products contained more carbohydrates, less protein and lower amino acid content than their meat-based counterparts.
  • Plant steaks and the veal samples were comparable in terms of essential amino acid content and digestibility.
  • Plant cold cuts generally had less salt than the meats and contained fewer essential amino acids. Different products also showed differing levels of digestibility due to the variety of ingredients they contain.  

Overall, the nutritional value of the plant-based products depended greatly on the plants used to create them, causing wide variation in their amino acid content and the digestibility of their proteins. In contrast, all the samples within a particular meat type showed comparable nutritional profiles. The researchers say that this work helps demonstrate that careful consideration should be taken when replacing meat products with plant-based alternatives, and that these differences in nutritional profile should be communicated to consumers to allow for informed decisions.

The authors acknowledge funding from the Emilia Romagna Region of Italy for this work.

###

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS’ mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.



Toxic chemicals from microplastics can be absorbed through skin



UNIVERSITY OF BIRMINGHAM





Toxic chemicals used to flame-proof plastic materials can be absorbed into the body through skin, via contact with microplastics, new research shows. 

The study offers the first experimental evidence that chemicals present as additives in microplastics can leach into human sweat, and then be absorbed through the skin, into the bloodstream. 

Many chemicals used as flame retardants and plasticisers have already been banned, due to evidence of adverse health effects including damage to the liver or nervous system, cancer, and risks to reproductive health. However, these chemicals are still present in the environment in older electronics, furniture, carpets, and building materials.  

While the harm caused by microplastics is not fully understood, there is increasing concern over their role as conduits of human exposure to toxic chemicals.  

The research team demonstrated in a study published last year, that chemicals were leached from microplastics into human sweat. The current study now shows that those chemicals can also be absorbed from sweat across the skin barrier into the body. 

In their experiments, the team used innovative 3D human skin models as alternatives to laboratory animals and excised human tissues. The models were exposed over a 24-hour period to two common forms of microplastics containing polybrominated diphenyl ethers (PBDEs), a group of chemicals commonly used as flame retardants in plastics.

The results, published in Environment International, showed that as much as 8% of the chemical to which the skin was exposed could be taken up, with more hydrated -- or ‘sweatier’ -- skin absorbing higher levels of chemical. The study provides the first experimental evidence of how this process contributes to levels of toxic chemicals found in the body.

Dr Ovokeroye Abafe, now at Brunel University, carried out the research while at the University of Birmingham. He said: “Microplastics are everywhere in the environment and yet we still know relatively little about the health problems that they can cause. Our research shows that they play a role as ‘carriers’ of harmful chemicals, which can get into our bloodstream through the skin. These chemicals are persistent, so with continuous or regular exposure to them, there will be a gradual accumulation to the point where they start to cause harm.” 

Dr Mohamed Abdallah, Associate Professor of Environmental Sciences at the University of Birmingham, and principal investigator for the project, said: “These findings provide important evidence for regulators and policymakers to improve legislation around microplastics and safeguard public health against harmful exposure.” 

Professor Stuart Harrad, co-author of the paper, added: “The study provides an important step forward in understanding the risks that exposure to microplastics poses to our health. Building on our results, more research is required to fully understand the different pathways of human exposure to microplastics and how to mitigate the risk from such exposure.”   

In future research, the team plan to investigate other routes through which microplastics could be responsible for toxic chemicals entering the body, including inhalation and ingestion. The work is funded by a Marie Curie Research Fellowship, within the European Union’s Horizon 2020 Research and Innovation Programme. 


New research defines specific genomic changes associated with the transmissibility of the monkeypox virus

THE MOUNT SINAI HOSPITAL / MOUNT SINAI SCHOOL OF MEDICINE

Mount Sinai scientists, in collaboration with researchers from the Carlos III Health Institute (ISCIII) in Madrid, Spain, have located and identified alterations in the monkeypox virus genome that potentially correlate with changes in the virus’s transmissibility observed in the 2022 outbreak. The findings were published April 18 in Nature Communications.

Monkeypox virus (MPXV) is a double-stranded DNA virus that can infect animals and humans. MPXV causes a disease known as mpox, with symptoms that include fever, swollen lymph nodes, and a rash. Most cases are mild and tend to resolve on their own; however, mpox can be very painful and may lead to permanent scarring. First identified in 1958 in crab-eating macaque monkeys imported to Denmark, MPXV has caused sporadic human disease outbreaks in Central and Western Africa since the 1970s. In May 2022, multiple countries, including the United States, reported an increasing number of MPXV infections and associated disease, including clusters of cases potentially linked to super-spreading events in Belgium, Spain, and the United Kingdom. While the number of new cases associated with the 2022 spillover has decreased over time, the disease still occurs among unvaccinated individuals, including a current increase in Central Africa driven by a new spillover. As the virus circulates more widely in humans, the risk grows that a more transmissible variant could emerge and become endemic in the human population.

“Biopreparedness and virological surveillance involve studying the causes that favor zoonotic spillover and facilitate human-to-human transmission. When we observe significant changes in the basic epidemiological features of a viral agent like monkeypox, it should reinvigorate our interest in understanding those transmission conditions. The increasing number of cases currently occurring in Africa, and the 2022 epidemic, should be clear alert signals,” says Gustavo Palacios, PhD, Professor of Microbiology at the Icahn School of Medicine at Mount Sinai and a senior author of the study.

To carry out the study, researchers analyzed samples from 46 patients infected with MPXV whose diagnosis and sequencing were carried out at the ISCIII at the beginning of the 2022 mpox outbreak. The team performed high-quality sequencing of each study participant’s complete monkeypox virus genome to determine possible correlations between genomic variations in the different groups of sequences and epidemiological links associated with the virus’s ability to evolve, transmit, and infect.

According to the research team, the recurrent genomic changes they observed were located in areas of the genome that could be related to viral adaptation. These specific locations would contribute to modulating the viral replication cycle, adaptability, and paths of entry and egress. The alterations appear in areas known as low-complexity genomic regions, which are particularly difficult to sequence and analyze, explaining why they were overlooked before. This highly sophisticated complete genome sequencing was made possible through the use of two advanced sequencing technologies: single-molecule long-read sequencing (to cover highly repetitive regions) and deep short-read sequencing (to provide accuracy and depth).

By detailing the genomic alterations within these repetitive genomic sequences and linking them to critical viral functions, researchers provide a plausible explanation for the heightened transmissibility observed during the 2022 mpox outbreak.

“These findings might offer the first hints to help us understand the unique features of the strains associated with sustained human-to-human transmission, which had never before been observed in these agents,” says Dr. Palacios. “A better understanding of the doors that facilitate transmission of viral agents and shape their clinical presentations will enable us to develop more effective prevention and treatment strategies.”

About the Mount Sinai Health System
Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 43,000 employees working across eight hospitals, more than 400 outpatient practices, more than 600 labs, a school of nursing, and a leading school of medicine and graduate education. Mount Sinai advances health for all people, everywhere, by taking on the most complex healthcare challenges of our time: discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it. Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive healthcare solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients’ medical and emotional needs at the center of all treatment. The Health System includes approximately 9,000 primary and specialty care physicians and 11 free-standing joint-venture centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida. Hospitals within the System are consistently ranked by Newsweek’s® “The World’s Best Smart Hospitals, Best in State Hospitals, World Best Hospitals and Best Specialty Hospitals” and by U.S. News & World Report's® “Best Hospitals” and “Best Children’s Hospitals.” The Mount Sinai Hospital is on the U.S. News & World Report® “Best Hospitals” Honor Roll for 2023-2024. For more information, visit https://www.mountsinai.org or find Mount Sinai on Facebook, Twitter, and YouTube.

###