Friday, June 25, 2021

Are zebra mussels eating or helping toxic algae?

MICHIGAN STATE UNIVERSITY

Research News

While invasive zebra mussels consume small plant-like organisms called phytoplankton, Michigan State University researchers discovered during a long-term study that zebra mussels can actually increase Microcystis, a type of phytoplankton known as "blue-green algae" or cyanobacteria that forms harmful floating blooms.

"Microcystis literally means small cell, but numerous cells cluster together in colonies that can float to the surface to form scums," said Orlando Sarnelle, a professor emeritus with the Department of Fisheries and Wildlife within the College of Agriculture and Natural Resources. "It is one of the most common causes of nuisance algal blooms in nutrient-enriched waters, including Lake Erie where it is a concern for municipal water supplies."

In the 1990s, researchers observed the appearance of dime-sized zebra mussels in Gull Lake, Michigan. Shortly after the mussels arrived, the researchers noticed an increase in Microcystis, which was surprising because the lake has low levels of phosphorus and Microcystis has a well-documented need for high-nutrient waters.

"Lakes colonized by zebra mussels tend to have about three times more Microcystis," said Stephen Hamilton, a professor at the W.K. Kellogg Biological Station and the Department of Integrative Biology within the College of Natural Science, who was also curious to see if there was a relationship between the Microcystis and zebra mussels.

"We observed that zebra mussels can filter out the Microcystis with other particles, but then they spit out the Microcystis because evidently it is unpalatable to them," Hamilton said.

Sarnelle collaborated with Hamilton on a multiyear study that was part of the National Science Foundation's Long-Term Ecological Research Network. Forty years ago, the NSF recognized the need for research studies that lasted more than a few years and launched the LTER Network. This study is one of five projects highlighted in a recent issue of the Ecological Society of America's journal, Ecosphere.

"Long-term measurements are essential to our understanding of many ecological phenomena," Sarnelle said. "There are many things you can't answer in the typical two- to four-year grant cycle."

The researchers suspected the zebra mussels were consuming competitors of Microcystis, which paved the way for the cyanobacteria to flourish under lower nutrient availability than it usually needs. In 2010, an unexpected summer die-off of zebra mussels in Gull Lake during prolonged warm temperatures provided a whole-lake test of the relationship, an opportunity that scientists sometimes call a "natural experiment."

"Normally, Microcystis thrives in warmer water," said Jeffrey White, who was a graduate student advised by Sarnelle at the time and is now a faculty member at Framingham State University in Framingham, Massachusetts. "Instead, we saw an 80% decrease in the Gull Lake Microcystis population when the zebra mussels died despite optimal temperatures for its growth."

The researchers were able to use the long-term study data to confirm their hypothesis.

"This fortuitous observation following years of sampling strengthens the argument that there is a cause-and-effect relationship, and not just a correlation, between zebra mussels and increased Microcystis," Hamilton said. "Multiyear studies can catch slow, unusual or extreme events that could be making important changes resulting in long-term lasting effects in the ecosystems."

###

 

Spreading of infections = need for collaboration between biology and physics

UNIVERSITY OF COPENHAGEN - FACULTY OF SCIENCE

Research News

IMAGE

IMAGE: THE FIGURE SHOWS A MODEL OF A SOCIAL NETWORK. THERE ARE 150 INDIVIDUALS (THE DOTS), WHOSE SOCIAL CONNECTIONS ARE MARKED BY THE LINES BETWEEN THEM. THERE ARE THREE CATEGORIES: 1....

CREDIT: BJARKE FROST NIELSEN/KIM SNEPPEN

Researchers at the Niels Bohr Institute, University of Copenhagen, together with epidemiologist Lone Simonsen from Roskilde University, form part of the panel advising the Danish government on how to tackle the different infection-spreading situations we have all seen unfold over the past year. The researchers have modelled the spread of infections under a variety of scenarios, and the coronavirus has proven not to follow the older models of disease spreading. An increasingly varied picture of its behaviour, and thus its impact on society, has emerged. In several scientific articles, the researchers have described the knowledge accrued to date, most recently around the concept of "superspreaders". It turns out that only approximately 10% of those infected account for roughly 80% of the spread of the infection. The results have been published in the scientific journal Proceedings of the National Academy of Sciences (PNAS).

Where does our knowledge of infections spreading stem from?

The data researchers use to "feed" and develop computer models comes from a wide range of different sources. The Danish municipalities have kept inventories of the spread of the infection, and this data has the advantage that it stems from units that are not overly large. There is a high degree of detail, and this means that one can trace local development more clearly and thus construct parameters for superspreading, which Postdoc Julius Kirkegaard has contributed to. Contact tracing is another source of information. In that case, the focus is on localising and limiting the individual's transmission of the virus. The third source is slightly more complicated as it seeks to follow the chain of infections via the gene sequence of the virus.

Who are the superspreaders?

Regardless of which source researchers examine, the results are roughly the same: 10% of all those infected account for as much as 80% of the spread of the infection. It is therefore crucial, in relation to the spread of the virus, to locate the so-called superspreaders and uncover how superspreading occurs. The researchers stress that, at the moment, we are not quite sure what makes a person a superspreader. It may be down purely to personal, physiological characteristics. In addition, there are varying degrees of superspreading in the population, so it is not necessarily just one or the other. Some people simply spread the virus more than others, and the variation from persons with almost no transmission to superspreaders is great.

How do researchers model a population of just under 6 million individuals?

Three basic categories are considered important when modelling the population's behaviour and calculating a scenario for the spread of infection: 1. the family context, 2. the work context, and 3. the random contexts people find themselves in - in other words, people in proximity on public transport, at leisure activities and so on. The time factor in all three is crucial, as it takes time to infect other people. In terms of time, the three categories behave much the same for common diseases, but not for a superspreading virus like the coronavirus.

But this is where the individual characteristics of the virus come into play: superspreaders make a real difference when handled in a computer model. Methods known from physics become important here, as it is necessary to model individuals and their contacts. The researchers have set up computer models for scenarios both with and without superspreaders, and it transpires that shutting down workplaces, sporting events and public transport all have much the same effect when the model does not take superspreaders into account. But when superspreaders are included, there is a pronounced difference: the shutdown of public events has a much greater effect.
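To make that contrast concrete, here is a minimal sketch (in Python) of how superspreading is often represented in such models: each infected person's expected number of secondary cases is drawn from a gamma distribution, so a small fraction of individuals accounts for most transmission. The values R0 = 2.5 and dispersion k = 0.1 are illustrative assumptions, not parameters from the Copenhagen model.

```python
# Illustrative sketch only -- not the Niels Bohr Institute model. It contrasts a
# branching process in which everyone transmits equally with one in which
# individual infectiousness is overdispersed (a few "superspreaders" do most of
# the transmitting), under the assumed values R0 = 2.5 and dispersion k = 0.1.
import numpy as np

rng = np.random.default_rng(1)
R0 = 2.5            # assumed mean number of secondary cases
K_DISPERSION = 0.1  # assumed overdispersion; small k => strong superspreading

def secondary_cases(n_infected, superspreading):
    """Draw the number of people each infected individual goes on to infect."""
    if superspreading:
        # Gamma-Poisson mixture: individual infectiousness varies widely.
        nu = rng.gamma(shape=K_DISPERSION, scale=R0 / K_DISPERSION, size=n_infected)
    else:
        # Homogeneous model: everyone has the same expected infectiousness.
        nu = np.full(n_infected, R0)
    return rng.poisson(nu)

def share_from_top_10pct(offspring):
    """Fraction of all transmission caused by the most infectious 10% of cases."""
    ordered = np.sort(offspring)[::-1]
    top = ordered[: max(1, len(ordered) // 10)]
    return top.sum() / max(1, ordered.sum())

for superspreading in (False, True):
    offspring = secondary_cases(100_000, superspreading)
    print(f"superspreading={superspreading}: "
          f"top 10% of cases cause {share_from_top_10pct(offspring):.0%} of infections")
```

With strong overdispersion of this kind, measures that cap the largest gatherings remove a disproportionate share of transmission, which is the qualitative difference the researchers describe between models with and without superspreaders.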

Disease modelling faces new challenges and strong interdisciplinary collaboration

Diseases can behave very differently, and it is therefore incredibly important to be both ready and capable of rapid change when developing new models that reflect the characteristics of different diseases as accurately as possible, if we hope to contain them. Professor Kim Sneppen explains: "The biological variation of different viruses is enormous. SARS-CoV-2 has a special feature in that it is at its most contagious just before one develops symptoms. This is the exact opposite of an earlier disease that threatened to become a pandemic, namely SARS, which is mostly contagious after one displays symptoms. Viruses are extremely advanced machines that each find specific weak points to exploit. A new field of research is rapidly developing, which examines how viruses attack the cells in our body. COVID-19 has proven to lead to very different sickness progressions for different patients. In that sense, it behaves chaotically, as we say in physics."

Ph.D. student Bjarke Frost Nielsen and Professor Kim Sneppen see a large open field of research in the collaboration between physics and biology. Gathering as much information as possible about different viruses is crucial, enabling physicists to deploy this knowledge in mapping scenarios to respond to them.

The potential for research into the spread of infections is great

Bjarke Frost Nielsen says: "We need to create a toolbox in our computer programs that contains a wide variety of ways to tackle the spread of infection. This is the immediate perspective we can see in front of us at the moment. Mathematical disease modelling has been around for almost 100 years, but unfortunately not a lot of headway has been made over that period. To put it bluntly, the same equations from the 1930s are still in use today. For some diseases they can be correct, but for others they can be way off. This is where, as physicists, we have a completely different approach. There are numerous parameters, such as social dynamics and much more varied interactions between individuals, that we can build our scenarios upon. This is badly needed when we see the enormous variation between different diseases."

###

New findings unveil a missing piece of human prehistory

CHINESE ACADEMY OF SCIENCES HEADQUARTERS

Research News

IMAGE

IMAGE: GEOGRAPHICAL AND TEMPORAL DISTRIBUTION OF NEWLY SAMPLED INDIVIDUALS

CREDIT: IVPP

A joint research team led by Prof. FU Qiaomei from the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP) of the Chinese Academy of Sciences sequenced the ancient genomes of 31 individuals from southern East Asia, thus unveiling a missing piece of human prehistory.

The study was published in Cell on June 24.

Prof. FU's team used DNA capture techniques to retrieve ancient DNA from Guangxi and Fujian, two provincial-level regions in southern China. They sequenced genome-wide DNA from 31 individuals dating back 11,747 to 194 years ago. Of these, two date back to more than 10,000 years ago, making them the oldest genomes sampled from southern East Asia and Southeast Asia to date.

Previous ancient DNA studies showed that ~8,000-4,000-year-old Southeast Asian Hòabìnhian hunter-gatherers possessed deeply divergent Asian ancestry, whereas the first Southeast Asian farmers beginning ~4,000 years ago show a mixture of ancestry associated with Hòabìnhian hunter-gatherers and present-day southern Chinese populations. In coastal southern China, ~9,000-4,000-year-old individuals from Fujian province show ancestry not as deeply divergent as the Hòabìnhian.

In Guangxi, FU and her team's sampling showed that the ancestry present was unlike that sampled previously in Fujian and Southeast Asia. Instead, they found a unique East Asian ancestral population (represented by the 11,000-year-old Longlin individual from Guangxi). Their findings highlight that 11,000 years ago, at least three genetically distinct ancestries composed the human landscape in southern East Asia and Southeast Asia: Fujian ancestry, Hòabìnhian ancestry, and Guangxi ancestry.

In addition to sharing Longlin ancestry, the Dushan and Baojianshan individuals in Guangxi also show strong evidence for admixture in southern China ~9,000 to 6,000 years ago. Dushan and Baojianshan were a mixture of local Guangxi ancestry, southern ancestry previously sampled in Fujian, and Deep Asian ancestry related to Southeast Asian Hòabìnhian hunter-gatherers.

Previously, it was shown that southern Chinese populations expanded into Southeast Asia, mixing with and eventually replacing Hòabìnhians there. FU's team showed that the dynamics were more complex: either populations carrying Hòabìnhian ancestry co-existed with populations carrying Guangxi ancestry in southern China, or gene flow northward from Southeast Asia into southern China occurred as early as ~8,000-6,000 years ago.

The study fills a research gap in the region connecting East and Southeast Asia, revealing a new genetic ancestry different from that found in coastal areas of southern China and in Southeast Asia.

Furthermore, it shows the impact of migration and admixture of populations at the crossroads of East and Southeast Asia in the last 11,000 years, revealing a long history of intermingling between these two regions.

"While we now have a better understanding of the population history in the last 11,000 years at the crossroads of East and Southeast Asia, future sampling in regions near the Yangtze River and southwest China are needed for a comprehensive understanding of the genetic history of humans in southern China," said Prof. FU.

Genetic samples from ancient humans in these regions will likely further clarify the remarkably diverse genetic prehistory of humans in southeastern Asia, and inform the genetic shifts that occurred between 6,000 and 1,500 years ago and contributed to the genetic composition observed today in southern China.


CAPTION

Overview of population dynamical history at the crossroads of East and Southeast Asia since 11,000 years ago

CREDIT

IVPP



CAPTION

Phylogenetic tree of early Asian populations

CREDIT


 

People with fibromyalgia are substituting CBD for opioids to manage pain

The cannabis-derived substance causes fewer side effects and has less potential for abuse

MICHIGAN MEDICINE - UNIVERSITY OF MICHIGAN

Research News

Fibromyalgia is one of many chronic pain conditions that remains stubbornly difficult to treat.

As the ravages of the opioid epidemic lead many to avoid these powerful painkillers, a significant number of people with fibromyalgia are finding an effective replacement in CBD-containing products, finds a new Michigan Medicine study.

CBD, short for cannabidiol, is the second most common cannabinoid in the cannabis plant, and has been marketed for everything from mood stabilization to pain relief, without the intoxicating effects produced by the most common cannabinoid, THC. THC, which stands for delta-9-tetrahydrocannabinol, is the ingredient in marijuana that causes people to feel high.

The cannabis industry has exploded, aided by the legalization of medical and recreational marijuana in states around the United States and the removal of hemp-derived CBD from Schedule 1 status--reserved for drugs with no currently accepted medical use and a high potential for abuse--at the federal level.

Previous research shows that some people substitute medical cannabis (often with high concentrations of THC) for opioids and other pain medications, reporting that cannabis provides better pain relief and fewer side effects. However, there is far less data on CBD use.

"CBD is less harmful than THC, as it is non-intoxicating and has less potential for abuse," said Kevin Boehnke, Ph.D., a research investigator in the Department of Anesthesiology and the Chronic Pain and Fatigue Research Center. "If people can find the same relief without THC's side effects, CBD may represent a useful as a harm reduction strategy."

Boehnke and his team surveyed people with fibromyalgia about their use of CBD for treatment of chronic pain.

"Fibromyalgia is not easy to treat, often involving several medications with significant side effects and modest benefits," Boehnke explained. "Further, many alternative therapies, like acupuncture and massage, are not covered by insurance."

For this study, the team focused on 878 people with fibromyalgia who said they used CBD, in order to get more insight into how they used CBD products.

The U-M team found that more than 70% of people with fibromyalgia who used CBD substituted CBD for opioids or other pain medications. Of these participants, many reported that they either decreased use or stopped taking opioids and other pain medications as a result.

"I was not expecting that level of substitution," said Boehnke, noting that the rate is quite similar to the substitution rate reported in the medical cannabis literature. People who said they used CBD products that also contained THC had higher odds of substitution and reported greater symptom relief.

Yet the finding that products containing only CBD also provided pain relief and were substituted for pain medications is promising and merits future study, noted Boehnke.

The team noted that much of the widespread use of CBD is occurring without physician guidance and in the absence of relevant clinical trials. "Even with that lack of evidence, people are using CBD, substituting it for medication and doing so saying it's less harmful and more effective," he said.

Boehnke stressed the need for more controlled research into how CBD may provide these benefits, as well as whether these benefits may be due to the placebo effect.

Clinically, opening up lines of discussion around CBD use for chronic pain is imperative, said Boehnke, for medication safety reasons as well as for "enhancing the therapeutic alliance and improving patient care."

###

Additional authors include Joel J. Gagnier, Lynne Matallana and David A. Williams.

Paper cited: "Substituting Cannabidiol for Opioids and Pain Medications Among Individuals with Fibromyalgia: A Large Online Survey," The Journal of Pain. DOI: 10.1016/j.jpain.2021.04.011

Race, ethnicity not a factor in recent weapon-carrying behaviors at US schools

The University of Minnesota Medical School study says a school's social climate plays the strongest role when weapon-carrying behaviors increase

UNIVERSITY OF MINNESOTA MEDICAL SCHOOL

Research News

MINNEAPOLIS/ST. PAUL (06/24/2021) -- A study led by researchers at the University of Minnesota Medical School sheds new light on boys' weapon-carrying behaviors at U.S. high schools. The results indicate that weapon-carrying is not tied to students' race or ethnicity but rather their schools' social climates.

The study was published in the journal Pediatrics and led by Patricia Jewett, PhD, a researcher in the Department of Medicine at the U of M Medical School.

"Narratives of violence in the U.S. have been distorted by racist stereotyping, portraying male individuals of color as more dangerous than white males," Jewett said. "Instead, our study suggests that school climates may be linked to an increase in weapon-carrying at schools."

The study analyzed self-reported weapon-carrying behaviors among 88,000 young males at U.S. high schools between 1993 and 2019, based on data from the Youth Risk Behavior Surveillance System. From that data, the researchers identified four key findings:

- Since 1993, weapon-carrying in schools has declined among all males.

- Over the last 20 years, in schools perceived as safer, Non-Hispanic white males have been more likely to bring weapons into schools than Non-Hispanic Black/African American or Hispanic males.

- Between 2017 and 2019, while comparing all schools, no significant differences in weapon-carrying behaviors existed by race or ethnicity.

- More frequent weapon-carrying is associated with experiences of unsafety or violence at school. Males who experienced violence or felt unsafe at school brought weapons at least twice as often, and such negative school experiences were more common among males of color (8-12%) than among Non-Hispanic white males (4-5%).

"Our work underscores the association of experiences of unsafety at school with weapon-carrying at school and highlights large knowledge gaps in the field of gun violence research in the U.S.," Jewett said. "This is an important foundation for much needed research to disentangle the intertwined phenomena of racism, toxic environments of violence and gun- and weapon-culture in the U.S. We are currently reaching out to other researchers who work in the field to collaborate on this urgent public health topic."

###

Co-authors of the study include Iris W. Borowsky, MD, PhD, with the U of M Medical School, Eunice M. Areba, PhD, with the U of M School of Nursing; Ronald E. Gangnon, PhD, and Kristen Malecki, PhD, with the University of Wisconsin, Madison; and Judith Kafka, PhD, with the City University of New York.

About the University of Minnesota Medical School

The University of Minnesota Medical School is at the forefront of learning and discovery, transforming medical care and educating the next generation of physicians. Our graduates and faculty produce high-impact biomedical research and advance the practice of medicine. We acknowledge that the U of M Medical School, both the Twin Cities campus and Duluth campus, is located on traditional, ancestral and contemporary lands of the Dakota and the Ojibwe, and scores of other Indigenous people, and we affirm our commitment to tribal communities and their sovereignty as we seek to improve and strengthen our relations with tribal nations. For more information about the U of M Medical School, please visit med.umn.edu.


 

Ethane proxies for methane in oil and gas emissions

PENN STATE

Research News

Measuring ethane in the atmosphere shows that the amount of methane going into the atmosphere from oil and gas wells and contributing to greenhouse warming is higher than suggested by the U.S. Environmental Protection Agency, according to an international team of scientists who spent three years flying over three areas of the U.S. during all four seasons.

"Ethane is a gas that is related only to certain sources of methane," said Zachary R. Barkley, researcher in meteorology and atmospheric science, Penn State. "Methane, however, is produced by oil, gas and coal fields, but also by cow's digestive systems, wetlands, landfills and manure management. It is difficult to separate out fossil fuel produced and natural methane."

The Atmospheric Carbon and Transport (ACT) America data made it possible to quantify methane emissions from oil, gas and coal sources because the project measured not only methane but also ethane. The researchers note that methane identified with ethane can be reliably connected to fossil fuel sources; however, the ratio of ethane to methane does vary among individual sources.

"ACT America was conceived as an effort to improve our ability to diagnose the sources and sinks of global greenhouse gases, to improve the diagnosis," said Kenneth J. Davis, professor of atmospheric and climate science, Penn State. "We wanted to understand how greenhouse gases are moved around by weather systems in the atmosphere. Prior to ACT, there was no data to map out the distribution of gases in weather systems."

From 2017 through 2019, researchers flew data collection missions over three portions of the U.S. -- the central Atlantic states including Pennsylvania, New York, Virginia, West Virginia and Maryland; the central southern states including Arkansas, Louisiana, Texas, Alabama, Oklahoma and Mississippi; and the central midwestern states including Nebraska, South Dakota, Kansas, Minnesota, Iowa, Missouri, Wisconsin, Michigan, Ohio, Indiana and Illinois. The researchers covered all four seasons and tracked how weather systems moved carbon dioxide, methane, ethane and other gases around in the atmosphere.

The researchers are not the first to suggest that estimates of methane are too low, but according to Barkley, they are the first to use ethane solely as a proxy. Ethane, although it will act as a greenhouse gas, only stays in the atmosphere for a few months before breaking down into other compounds, rather than the 10 years that methane remains in the atmosphere. Ethane is more of a problem for air pollution than greenhouse warming.

"We didn't look at any of the methane data at all and we still see the same results as everyone else," he said.

Another difference is that most previous airborne studies looked at small areas, emissions from single sites or fields. ACT-America looked at multistate regions and encompassed over two-thirds of U.S. natural gas production.

"Ethane data consistently exceeds values that would be expected based on (U.S.) EPA Oil and gas leak rate estimates by more than 50%," the researchers report in a recent issue of the Journal of Geophysical Research: Atmospheres. The researchers add that comparing the combined fall, winter and spring ethane emission estimates to an inventory of oil and gas methane emissions, they estimate that the oil and gas methane emissions are larger than EPA inventory values by 48% to 76%.

The researchers used ethane-to-methane ratios from oil and gas production basins for this study.
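The arithmetic behind the proxy is straightforward. The sketch below, with made-up numbers rather than the ACT-America observations, shows how an observed ethane flux and an assumed basin ethane-to-methane emission ratio translate into an implied oil-and-gas methane flux that can be compared with an inventory value.

```python
# Toy illustration of the ethane-proxy idea, with made-up numbers -- not the
# ACT-America inversion. An observed ethane flux is converted to an implied
# oil-and-gas methane flux using an assumed basin ethane:methane emission
# ratio, then compared with a hypothetical inventory value.

observed_ethane_flux = 4.5          # hypothetical, kt C2H6 per year for a region
ethane_to_methane_ratio = 0.03      # assumed molar emission ratio for the basin

# Convert mass of ethane to moles, scale by the molar ratio to get moles of
# methane, then back to mass (molar masses: CH4 ~16 g/mol, C2H6 ~30 g/mol).
implied_methane_flux = observed_ethane_flux / ethane_to_methane_ratio * (16.0 / 30.0)

inventory_methane_flux = 55.0       # hypothetical EPA-style inventory value, kt CH4/yr

excess = implied_methane_flux / inventory_methane_flux - 1.0
print(f"implied oil & gas CH4: {implied_methane_flux:.0f} kt/yr "
      f"({excess:+.0%} relative to inventory)")
```

In practice the study combined such ratios with atmospheric transport modeling rather than a single division, but the basic logic of scaling observed ethane up to methane by a basin-specific ratio is the same.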

While carbon dioxide sources and sinks can be found across the Earth's surface, ethane and methane emissions come from specific, known locations on the ground. Deserts, oceans and upland ecosystems emit little ethane or methane; active oil and gas fields have high emissions. When estimating trace gas emissions, researchers usually take a first best guess and then run multiple iterations to minimize the difference between observed and simulated atmospheric concentrations of these gases.

Barkley notes that signals are sometimes hard to interpret, but that is not the case with this study.

"The data are there," said Barkley. "The smaller plume in the model when increased by a factor of two suddenly matches the real time data."

###

Other researchers at Penn State on the project are Sha Feng and Yuyan Chui, assistant research professors of meteorology and atmospheric science.

Additional project members include Alan Fried, senior research associate, Petter Weibring, research associate, Dirk Richter, research associate, and James D. Walega, senior professional research assistant, Institute of Arctic and Alpine Research, University of Colorado, Boulder; Scot M. Miller, assistant professor of environmental health and engineering, Johns Hopkins University; and Anke Roiger, M. Eckl, A. Fiehn and Julian Kostinek, Deutsches Zentrum für Luft- und Raumfahrt, Institut für Physik der Atmosphäre, Oberpfaffenhofen, Germany.

NASA supported this project.


 

Russian forests are crucial to global climate mitigation

INTERNATIONAL INSTITUTE FOR APPLIED SYSTEMS ANALYSIS

Research News

Russia is the world's largest forest country. Home to more than a fifth of the world's forests, the country's forests and forestry have enormous potential to make a global impact in terms of climate mitigation. A new study by IIASA researchers, Russian experts, and other international colleagues has produced new estimates of the biomass contained in Russian forests, confirming a substantial increase over the last few decades.

Since the dissolution of the USSR, Russia has been reporting almost no changes in its forests, while data obtained from remote sensing products indicate that Russian forests have in fact experienced an increase in vegetation productivity, tree cover, and above-ground biomass over the last few decades. This has led to inconsistencies in available data and a general decline in the reliability of information on Russian forests since 1988, which can be attributed to an information gap that appeared when Russia moved from the Soviet Forest Inventory and Planning system to its current National Forest Inventory (NFI) for the collection of forest information at the national scale. The first cycle of the Russian NFI was finalized in 2020. The authors of a new IIASA-led study published in Scientific Reports used this data in combination with research forest plots on the ground and remote sensing data in an advanced analysis to produce a new estimate of the biomass of Russian forests, confirming these forests' relevance to climate change and their importance for climate change mitigation.

"We set out to determine the live biomass stock and sequestration rate of Russian forests. The joint efforts of our diverse team consisting of representatives from the Russian state forestry agency, forest survey, academic research institutes, and other educational institutions, made it possible for us to produce an important, reproducible scientific result. Even more importantly, our work contributed to building mutual trust, a policy of data sharing, and hopefully, the potential for fruitful future collaboration," says study lead author Dmitry Schepaschenko, a researcher with the IIASA Agriculture, Forestry, and Ecosystem Services Research Group in the Biodiversity and Natural Resources Program.

The team were the first to be given access to a portion of primary NFI plot data with precise location information, which, as in many other countries, is normally restricted for sharing and use, under the condition that the initial data processing was physically undertaken on site at the authorized division ("Roslesinforg") of the Federal Forestry. The researchers used this data in combination with remote sensing data to estimate the growing stock of Russian forests and to assess the relative changes in post-Soviet Russia. They calibrated models relating to two global remote sensing biomass data products and additional remote sensing data layers with around 10,000 ground plots from the NFI and the Forest Observation System to reduce uncertainties and produce an unbiased estimation at jurisdictional level. By combining these two sources of information, the team were able to utilize the advantages of both sources in terms of highly accurate ground measurements and the spatially comprehensive coverage of remote sensing products and methods.

"Quite often, practitioners simply use linear regression by default without checking the underlying statistical assumptions or worrying about the difference between the ability of a model to explain the observed data and the ability to predict the future or unobserved data. Because the aim of this study was to estimate the unobserved biomass, we have used modern computationally intensive methods to focus on the goodness-of-prediction of a range of plausible models," explains study coauthor and long-time IIASA collaborator, Elena Moltchanova from the School of Mathematics and Statistics at the University of Canterbury, New Zealand.

The findings indicate that Russian forests have in fact accumulated a large amount of biomass - in the range of 40% more than the value recorded in the country's State Forest Register and reported to the statistics of the Food and Agriculture Organization of the United Nations (FAO). Using the last Soviet Union report as a reference, the results show that the growing stock accumulation rate in Russian forests between 1988 and 2014 is of the same amplitude as the net forest stock losses in tropical countries. The study's estimate of carbon sequestration in live biomass of managed forests between 1988 and 2014 is 47% higher than reported in the National Greenhouse Gases Inventory.

The authors note that while Russian forests and forestry have great potential in terms of global climate mitigation as well as numerous potential co-benefits relating to the green economy and sustainable development, it is important to highlight that as the climate becomes more severe, as in recent years, resulting forest disturbances might nullify these gains. Close collaboration of science and policy would therefore be critical to elaborate and implement adaptive forest management.

"We are talking here about the largest country in the world hosting the largest share of the largest land biome globally - the circumboreal belt of forest - which is highly climate relevant. Imagine what just a few percent up or down with regard to the amount of forest biomass available and its consequent carbon sequestration potential can make globally," says Agriculture, Forestry, and Ecosystem Services Research Group Leader and study coauthor, Florian Kraxner. "This study once again highlights the important work done by researchers of the International Boreal Forest Research Association (IBFRA), which we would like to acknowledge particularly," he concludes.

###

Reference

Shchepashchenko, D., Moltchanova, E., Fedorov, S., Karminov, V., Ontikov, P., Santoro, M., See, L., Kositsyn, V., et al. (2021). Russian forest sequesters substantially more carbon than previously reported. Scientific Reports 11 (1) e12825. DOI: 10.1038/s41598-021-92152-9 [pure.iiasa.ac.at/1720]

About IIASA:

The International Institute for Applied Systems Analysis (IIASA) is an international scientific institute that conducts research into the critical issues of global environmental, economic, technological, and social change that we face in the twenty-first century. Our findings provide valuable options to policymakers to shape the future of our changing world. IIASA is independent and funded by prestigious research funding agencies in Africa, the Americas, Asia, and Europe. http://www.iiasa.ac.at


NIST method uses radio signals to image hidden and speeding objects


NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)



VIDEO: THIS DEMONSTRATION OF THE M-WIDAR (MICRO-WAVE IMAGE DETECTION, ANALYSIS AND RANGING) SYSTEM SHOWS, IN THE VIDEO ON THE LEFT, A PERSON WALKING AND LATER CROUCHING AND LYING DOWN IN AN...

CREDIT: NIST

Researchers at the National Institute of Standards and Technology (NIST) and Wavsens LLC have developed a method for using radio signals to create real-time images and videos of hidden and moving objects, which could help firefighters find escape routes or victims inside buildings filled with fire and smoke. The technique could also help track hypersonic objects such as missiles and space debris.

The new method, described in Nature Communications, could provide critical information to help reduce deaths and injuries. Locating and tracking first responders indoors is a prime goal for the public safety community. Hundreds of thousands of pieces of orbiting space junk are considered dangerous to humans and spacecraft.

"Our system allows real-time imaging around corners and through walls and tracking of fast-moving objects such as millimeter-sized space debris flying at 10 kilometers per second, more than 20,000 miles per hour, all from standoff distances," said physicist Fabio da Silva, who led the development of the system while working at NIST.

"Because we use radio signals, they go through almost everything, like concrete, drywall, wood and glass," da Silva added. "It's pretty cool because not only can we look behind walls, but it takes only a few microseconds of data to make an image frame. The sampling happens at the speed of light, as fast as physically possible."

The NIST imaging method is a variation on radar, which sends an electromagnetic pulse, waits for the reflections, and measures the round-trip time to determine distance to a target. Multisite radar usually has one transmitter and several receivers that receive echoes and triangulate them to locate an object.

"We exploited the multisite radar concept but in our case use lots of transmitters and one receiver," da Silva said. "That way, anything that reflects anywhere in space, we are able to locate and image."


CAPTION

Illustration of the lab setup for m-Widar, with transmitters and receiver at left and person behind wallboard at right. Inset at lower right shows the corresponding image produced by the instrument.

CREDIT

NIST


Da Silva explains the imaging process like this:



To image a building, the actual volume of interest is much smaller than the volume of the building itself because it's mostly empty space with sparse stuff in it. To locate a person, you would divide the building into a matrix of cubes. Ordinarily, you would transmit radio signals to each cube individually and analyze the reflections, which is very time consuming. By contrast, the NIST method probes all cubes at the same time and uses the return echo from, say, 10 out of 100 cubes to calculate where the person is. All transmissions will return an image, with the signals forming a pattern and the empty cubes dropping out.
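The sparse-recovery idea can be illustrated with a toy example: a handful of occupied "cubes" are reconstructed from far fewer simultaneous measurements than there are cubes, using known random transmit patterns. This generic L1-penalized inversion is only a stand-in for the reconstruction formula da Silva actually developed.

```python
# A toy, one-snapshot illustration of probing many "cubes" at once and
# recovering a sparse scene from a single receiver's measurements. This is
# generic sparse recovery (least squares with an L1 penalty), not the actual
# m-Widar reconstruction described in the paper.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)
n_cubes = 100        # the building divided into 100 volume cells
n_measurements = 30  # far fewer measurements than cubes

# Each row is one measurement: the known random transmit pattern as "seen" by
# every cube. The receiver records the sum of reflections weighted by occupancy.
A = rng.normal(size=(n_measurements, n_cubes))

# Ground truth: only a handful of cubes actually reflect (a person, furniture).
scene = np.zeros(n_cubes)
scene[[12, 47, 83]] = [1.0, 0.7, 0.5]

measurements = A @ scene + rng.normal(scale=0.01, size=n_measurements)

# Recover the scene by exploiting its sparsity.
recovered = Lasso(alpha=0.01).fit(A, measurements).coef_
print("occupied cubes found:", np.flatnonzero(np.abs(recovered) > 0.1))
```

Because most cubes are empty, far fewer measurements than cubes suffice, which is why probing the whole volume at once can still yield an image in microseconds.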

Da Silva has applied for a patent, and he recently left NIST to commercialize the system under the name m-Widar (microwave image detection, analysis and ranging) through a startup company, Wavsens LLC (Westminster, Colorado).

The NIST team demonstrated the technique in an anechoic (non-echoing) chamber, making images of a 3D scene involving a person moving behind drywall. The transmitter power was equivalent to 12 cellphones sending signals simultaneously to create images of the target from a distance of about 10 meters (30 feet) through the wallboard.

Da Silva said the current system has a potential range of up to several kilometers. With some improvements the range could be much farther, limited only by transmitter power and receiver sensitivity, he said.

The basic technique is a form of computational imaging known as transient rendering, which has been around as an image reconstruction tool since 2008. The idea is to use a small sample of signal measurements to reconstruct images based on random patterns and correlations. The technique has previously been used in communications coding and network management, machine learning and some advanced forms of imaging.

Da Silva combined signal processing and modeling techniques from other fields to create a new mathematical formula to reconstruct images. Each transmitter emits different pulse patterns simultaneously, in a specific type of random sequence, which interfere in space and time with the pulses from the other transmitters and produce enough information to build an image.

The transmitting antennas operated at frequencies from 200 megahertz to 10 gigahertz, roughly the upper half of the radio spectrum, which includes microwaves. The receiver consisted of two antennas connected to a signal digitizer. The digitized data were transferred to a laptop computer and uploaded to the graphics processing unit to reconstruct the images.

The NIST team used the method to reconstruct a scene with 1.5 billion samples per second, a corresponding image frame rate of 366 kilohertz (frames per second). By comparison, this is about 100 to 1,000 times more frames per second than a cellphone video camera.

With 12 antennas, the NIST system generated 4096-pixel images, with a resolution of about 10 centimeters across a 10-meter scene. This image resolution can be useful when sensitivity or privacy is a concern. However, the resolution could be improved by upgrading the system using existing technology, including more transmitting antennas and faster random signal generators and digitizers.
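As a rough consistency check on the reported figures (not a description of the actual sampling scheme, which may differ), dividing the sample rate by the frame rate gives approximately one digitized sample per reconstructed pixel:

```python
# Quick arithmetic check of the reported numbers; illustrative only.
samples_per_second = 1.5e9
frames_per_second = 366e3
pixels_per_image = 4096

samples_per_frame = samples_per_second / frames_per_second
print(f"samples per frame: {samples_per_frame:.0f}")                  # ~4098
print(f"ratio to pixels:   {samples_per_frame / pixels_per_image:.3f}")  # ~1.0
```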

In the future, the images could be improved by using quantum entanglement, in which the properties of individual radio signals would become interlinked. Entanglement can improve sensitivity; radio-frequency quantum illumination schemes, for example, could increase reception sensitivity.

The new imaging technique could also be adapted to transmit visible light instead of radio signals -- ultrafast lasers could boost image resolution but would lose the capability to penetrate walls -- or sound waves used for sonar and ultrasound imaging applications.

In addition to imaging of emergency conditions and space debris, the new method might also be used to measure the velocity of shock waves, a key metric for evaluating explosives, and to monitor vital signs such as heart rate and respiration, da Silva said.

###

This work was funded in part by the Public Safety Trust Fund, which provides funding to organizations across NIST leveraging NIST expertise in communications, cybersecurity, manufacturing and sensors for research on critical, lifesaving technologies for first responders.

Paper: F.C.S. da Silva, A.B. Kos, G.E. Antonucci, J.B. Coder, C.W. Nelson and A. Hati. 2020. Continuous Capture Microwave Imaging. Nature Communications. June 25.



COVID-linked multisystem inflammatory syndrome in children diagnosed more often in Black and Latino children

New study identifies key demographic, clinical and biomarker features of MIS-C patients

CHILDREN'S NATIONAL HOSPITAL

Research News

Multisystem Inflammatory Syndrome in Children (MIS-C) significantly affected more Black and Latino children than white children, with Black children at the highest risk, according to a new observational study of 124 pediatric patients treated at Children's National Hospital in Washington, D.C. Researchers also found cardiac complications, including systolic myocardial dysfunction and valvular regurgitation, were more common in MIS-C patients who were critically ill. Of the 124 patients, 63 were ultimately diagnosed with MIS-C and were compared with 61 patients deemed controls who presented with similar symptoms but ultimately had an alternative diagnosis.

In the study, published in The Journal of Pediatrics, researchers provide insight into key features distinguishing MIS-C patients to provide a more realistic picture of the burden of disease in the pediatric population and aid with the early detection of disease and treatment for optimal outcomes. The COVID-linked syndrome has affected nearly 4,000 children in the United States in the past year. Early reports showed severe illness, substantial variation in treatment and mortality associated with MIS-C. However, this study demonstrated that with early recognition and standardized treatment, short-term mortality can be nearly eliminated.

"Data like this will be critical for the development of clinical trials around the long-term implications of MIS-C," says Dr. Roberta DeBiasi, M.D., lead author and chief of the Division of Pediatric Infectious Diseases at Children's National. "Our study sheds light on the demographic, clinical and biomarker features of this disease, as well as viral load and viral sequencing."

Of the 63 children with MIS-C, 52% were critically ill, and additional subtypes of MIS-C were identified including those with and without still detectable virus, those with and without features meeting criteria for Kawasaki Disease, and those with and without detectable cardiac abnormalities. While median age (7.25 years) and sex were similar between the MIS-C cohort and control group, Black (46%) and Latino (35%) children were overrepresented in the MIS-C group, especially those who required critical care. Heart complications were also more frequent in children who became critically ill with MIS-C (55% vs. 28%). Findings also showed MIS-C patients demonstrated a distinct cytokine signature, with significantly higher levels of certain cytokines than those of controls. This may help in the understanding of what drives the disease and which potential treatments may be most effective.

In reviewing viral load and antibody biomarkers, researchers found that MIS-C cases with detectable virus had a lower viral load than primary SARS-CoV-2 infection cases, but one similar to that of controls who had alternative diagnoses yet also had detectable virus. A larger proportion of patients with MIS-C had detectable SARS-CoV-2 antibodies than controls. This is consistent with current thinking that MIS-C occurs a few weeks after a primary COVID-19 infection as part of an overzealous immune response.

Viral sequencing was also performed in the MIS-C cohort and compared to cases of primary COVID-19 infection in the Children's National geographic population. Of the samples analyzed, 88% fell into the GH clade, consistent with the high frequency of the GH clade, first observed in France, circulating earlier in the pandemic in the U.S. and Canada.

"The fact that there were no notable sequencing differences between our MIS-C and primary COVID cohorts suggests that variations in host genetics and/or immune response are more likely primary determinants of how MIS-C presents itself, rather than virus-specific factors," says Dr. DeBiasi. "As we've seen new variants continue to emerge, it will be important to study their effect on the frequency and severity of MIS-C."

Researchers are still looking for consensus on the most efficacious treatments for MIS-C. In a recent editorial in the New England Journal of Medicine, Dr. DeBiasi calls for well-characterized large prospective cohort studies at single centers, and systematic and long-term follow-up for cardiac and non-cardiac outcomes in children with MIS-C. Data from these studies will be a crucial determinant of the best set of treatment guidelines for immunotherapies to treat MIS-C.

###

Media contact: Beth Riggs | briggs@childrensnational.org | 301-233-4038

 

AI used to predict unknown links between viruses and mammals

UNIVERSITY OF LIVERPOOL

Research News

IMAGE

IMAGE: NETWORKS OF OBSERVED AND PREDICTED ASSOCIATIONS BETWEEN WILD AND SEMI-DOMESTICATED MAMMALIAN HOSTS AND KNOWN VIRUS SPECIES.

CREDIT: DR MAYA WARDEH

A new University of Liverpool study could help scientists mitigate the future spread of zoonotic and livestock diseases caused by existing viruses.

Researchers have used a form of artificial intelligence (AI) called machine learning to predict more than 20,000 unknown associations between known viruses and susceptible mammalian species. The findings, which are published in Nature Communications, could be used to help target disease surveillance programmes.

Thousands of viruses are known to affect mammals, with recent estimates indicating that less than 1% of mammalian viral diversity has been discovered to date. Some of these viruses such as human and feline immunodeficiency viruses have a very narrow host range, whereas others such as rabies and West Nile viruses have very wide host ranges.

"Host range is an important predictor of whether a virus is zoonotic and therefore poses a risk to humans. Most recently, SARS-CoV-2 has been found to have a relatively broad host range which may have facilitated its spill-over to humans. However, our knowledge of the host range of most viruses remains limited," explains lead researcher Dr Maya Wardeh from the University's Institute of Infection, Veterinary and Ecological Sciences.

To address this knowledge gap, the researchers developed a novel machine learning framework to predict unknown associations between known viruses and susceptible mammalian species by consolidating three distinct perspectives - that of each virus, that of each mammal, and that of the network connecting them.
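As a schematic of this kind of link prediction (not the published framework, which combines the three perspectives in a more elaborate ensemble), the sketch below assembles placeholder virus, host and network features for each virus-mammal pair and trains a single classifier on known associations to score unobserved pairs.

```python
# A schematic sketch of virus-host link prediction from three kinds of features
# (virus traits, mammal traits, and network-derived similarity). The data are
# random placeholders, and the single gradient-boosted classifier stands in for
# the paper's multi-perspective approach.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(7)
n_pairs = 5000

# Placeholder feature blocks for each virus-mammal pair.
virus_features = rng.normal(size=(n_pairs, 4))    # e.g. genome composition traits
host_features = rng.normal(size=(n_pairs, 4))     # e.g. phylogeny, ecology, range
network_features = rng.normal(size=(n_pairs, 2))  # e.g. shared-host similarity
X = np.hstack([virus_features, host_features, network_features])

# Labels: known associations (1) vs. pairs with no recorded association (0).
y = rng.integers(0, 2, size=n_pairs)

model = GradientBoostingClassifier().fit(X[:4000], y[:4000])

# Score "unknown" pairs: high probabilities flag candidate new associations
# worth prioritising for surveillance or laboratory confirmation.
scores = model.predict_proba(X[4000:])[:, 1]
print("top candidate pairs:", np.argsort(scores)[::-1][:5] + 4000)
```

The practical value comes from the ranking: pairs the model scores highly but that have never been observed are the ones flagged for targeted surveillance.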

Their results suggest that there are more than five times as many associations between known zoonotic viruses and wild and semi-domesticated mammals as previously thought. In particular, bats and rodents, which have been associated with recent outbreaks of emerging viruses such as coronaviruses and hantaviruses, were linked with an increased risk of zoonotic viruses.

The model also predicts a five-fold increase in associations between wild and semi-domesticated mammals and viruses of economically important domestic species such as livestock and pets.

Dr Wardeh said: "As viruses continue to move across the globe, our model provides a powerful way to assess potential hosts they have yet to encounter. Having this foresight could help to identify and mitigate zoonotic and animal-disease risks, such as spill-over from animal reservoirs into human populations."

Dr Wardeh is currently expanding the approach to predict the ability of ticks and insects to transmit viruses to birds and mammals, which will enable prioritisation of laboratory-based vector-competence studies worldwide to help mitigate future outbreaks of vector-borne diseases.

###