Thursday, May 06, 2021

THE GLOBAL MANUFACTURING CAPITAL
China's greenhouse gas emissions exceed those of U.S. and developed countries combined, report says

Emma Newburger, CNBC, May 6, 2021

Image (provided by CNBC): A person walks past a coal-fired power plant in Jiayuguan, Gansu province, China, on Thursday, April 1, 2021.

China's greenhouse gas emissions in 2019 exceeded those of the U.S. and the developed world combined, according to a report published Thursday by research and consulting firm Rhodium Group.

The country's emissions more than tripled during the past three decades, the report added.

China is now responsible for more than 27% of total global emissions. The U.S., which is the world's second-highest emitter, accounts for 11% of the global total. India is responsible for 6.6% of global emissions, edging out the 27 nations in the EU, which account for 6.4%, the report said.

The findings come after a climate summit President Joe Biden hosted last month, during which Chinese President Xi Jinping reiterated his pledge to make sure the nation's emissions peak by 2030. He also repeated China's commitment to reach net-zero emissions by midcentury and urged countries to work together to combat the climate crisis.

"We must be committed to multilateralism," Xi said during brief remarks at the summit. "China looks forward to working with the international community, including the United States, to jointly advance global environmental governance."

Xi said China would control coal-fired generation projects and limit increases in coal consumption over the next five years, with reductions taking place in the five years following that.

However, Chinese officials have also emphasized that economic growth, which is still largely dependent on coal power, remains a priority. And the nation is still increasing construction of coal-fired power plants.

For instance, the China Development Bank and the Export-Import Bank of China together funded $474 million worth of coal projects outside China in 2020 alone. And coal accounted for more than half of China's domestic energy generation last year, according to Li Gao, director general of the Department of Climate Change at China's Ecology Ministry.

China, which is home to more than 1.4 billion people, saw its emissions surpass 14 gigatons of carbon dioxide equivalents in 2019, more than triple 1990 levels and a 25% increase over the past decade, the Rhodium report found. China's per capita emissions in 2019 also reached 10.1 tons, nearly tripling over the past two decades.
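As a rough consistency check, the figures quoted above fit together; the sketch below is a back-of-the-envelope calculation only, and the 14.1 Gt value is an assumption standing in for "surpass 14 gigatons."

```python
# Back-of-the-envelope check of the Rhodium figures quoted above.
# Assumptions (not from the report): 14.1 Gt CO2e for "surpass 14 gigatons",
# a population of 1.4 billion, and that the 27% share and the 2019 total
# refer to the same accounting basis.

china_emissions_gt = 14.1          # gigatons CO2-equivalent, 2019
china_share = 0.27                 # share of global total
population = 1.4e9                 # people

implied_global_total = china_emissions_gt / china_share      # ~52 Gt CO2e
per_capita_tons = china_emissions_gt * 1e9 / population       # ~10 t per person

print(f"Implied global total: {implied_global_total:.0f} Gt CO2e")
print(f"Implied per capita:   {per_capita_tons:.1f} t CO2e")  # report cites 10.1 t
```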

China's net emissions last year also increased by roughly 1.7% even while emissions from almost all other countries declined during the coronavirus pandemic, according to Rhodium estimates.

The Rhodium Group is a U.S. think tank that provides global emissions estimates and forecasts through the ClimateDeck, a partnership with Breakthrough Energy, an initiative founded by Bill Gates.

Slashing carbon emissions is one of the few areas on which the U.S. and China have agreed to cooperate.

Days before the summit, U.S. special envoy for climate John Kerry traveled to Shanghai to meet with officials on climate change, after which the two countries released a joint statement vowing to tackle the climate crisis together with "seriousness and urgency."

Biden has vowed to reduce U.S. emissions by 50% to 52% by 2030, more than doubling the country's prior commitment under the 2015 Paris climate agreement.

A goal of the accord is to keep the global temperature rise well below 2 degrees Celsius, or 3.6 degrees Fahrenheit, compared with preindustrial levels. So far, the world is on track to warm by 1.5 C, or 2.7 F, within the next two decades alone.

— CNBC's Evelyn Cheng contributed reporting

 

Review: Most human origins stories are not compatible with known fossils

Fossil apes can inform us about essential aspects of ape and human evolution, including the nature of our last common ancestor

AMERICAN MUSEUM OF NATURAL HISTORY

Research News

Image: The last common ancestor of chimpanzees and humans represents the starting point of human and chimpanzee evolution. Fossil apes play an essential role when it comes to reconstructing the nature... (Credit: printed with permission from © Christopher M. Smith)

In the 150 years since Charles Darwin speculated that humans originated in Africa, the number of species in the human family tree has exploded, but so has the level of dispute concerning early human evolution. Fossil apes are often at the center of the debate, with some scientists dismissing their importance to the origins of the human lineage (the "hominins"), and others conferring them starring evolutionary roles. A new review out on May 7 in the journal Science looks at the major discoveries in hominin origins since Darwin's works and argues that fossil apes can inform us about essential aspects of ape and human evolution, including the nature of our last common ancestor.

Humans diverged from apes--specifically, the chimpanzee lineage--at some point between about 9.3 million and 6.5 million years ago, towards the end of the Miocene epoch. To understand hominin origins, paleoanthropologists aim to reconstruct the physical characteristics, behavior, and environment of the last common ancestor of humans and chimps.

"When you look at the narrative for hominin origins, it's just a big mess--there's no consensus whatsoever," said Sergio Almécija, a senior research scientist in the American Museum of Natural History's Division of Anthropology and the lead author of the review. "People are working under completely different paradigms, and that's something that I don't see happening in other fields of science."

There are two major approaches to resolving the human origins problem: "Top-down," which relies on analysis of living apes, especially chimpanzees; and "bottom-up," which puts importance on the larger tree of mostly extinct apes. For example, some scientists assume that hominins originated from a chimp-like knuckle-walking ancestor. Others argue that the human lineage originated from an ancestor more closely resembling, in some features, some of the strange Miocene apes.

In reviewing the studies surrounding these diverging approaches, Almécija and colleagues with expertise ranging from paleontology to functional morphology and phylogenetics discuss the limitations of relying exclusively on one of these opposing approaches to the hominin origins problem. "Top-down" studies sometimes ignore the reality that living apes (humans, chimpanzees, gorillas, orangutans, and hylobatids) are just the survivors of a much larger, and now mostly extinct, group. On the other hand, studies based on the "bottom-up" approach are prone to giving individual fossil apes an important evolutionary role that fits a preexisting narrative.

"In The Descent of Man in 1871, Darwin speculated that humans originated in Africa from an ancestor different from any living species. However, he remained cautious given the scarcity of fossils at the time," Almécija said. "One hundred fifty years later, possible hominins--approaching the time of the human-chimpanzee divergence--have been found in eastern and central Africa, and some claim even in Europe. In addition, more than 50 fossil ape genera are now documented across Africa and Eurasia. However, many of these fossils show mosaic combinations of features that do not match expectations for ancient representatives of the modern ape and human lineages. As a consequence, there is no scientific consensus on the evolutionary role played by these fossil apes."

Overall, the researchers found that most stories of human origins are not compatible with the fossils that we have today.

"Living ape species are specialized species, relicts of a much larger group of now extinct apes. When we consider all evidence--that is, both living and fossil apes and hominins--it is clear that a human evolutionary story based on the few ape species currently alive is missing much of the bigger picture," said study co-author Ashley Hammond, an assistant curator in the Museum's Division of Anthropology.

Kelsey Pugh, a Museum postdoctoral fellow and study co-author adds, "The unique and sometimes unexpected features and combinations of features observed among fossil apes, which often differ from those of living apes, are necessary to untangle which features hominins inherited from our ape ancestors and which are unique to our lineage."

Living apes alone, the authors conclude, offer insufficient evidence. "Current disparate theories regarding ape and human evolution would be much more informed if, together with early hominins and living apes, Miocene apes were also included in the equation," says Almécija. "In other words, fossil apes are essential to reconstruct the 'starting point' from which humans and chimpanzees evolved."

###

This study was part of a collaborative effort with colleagues from the New York Institute of Technology (Nathan Thompson) and the Catalan Institute of Paleontology Miquel Crusafont (David Alba and Salvador Moyà-Solà).

Study DOI: https://science.sciencemag.org/cgi/doi/10.1126/science.abb4363

ABOUT THE AMERICAN MUSEUM OF NATURAL HISTORY (AMNH)

The American Museum of Natural History, founded in 1869 and currently celebrating its 150th anniversary, is one of the world's preeminent scientific, educational, and cultural institutions. The Museum encompasses more than 40 permanent exhibition halls, including those in the Rose Center for Earth and Space, as well as galleries for temporary exhibitions. The Museum's approximately 175 scientists draw on a world-class research collection of more than 34 million artifacts and specimens, some of which are billions of years old, and on one of the largest natural history libraries in the world. Through its Richard Gilder Graduate School, the Museum grants the Ph.D. degree in Comparative Biology and the Master of Arts in Teaching (MAT) degree, the only such free-standing, degree-granting programs at any museum in the United States. The Museum's website, digital videos, and apps for mobile devices bring its collections, exhibitions, and educational programs to millions around the world. Visit amnh.org for more information.

Forest fires drive expansion of savannas in the heart of the Amazon

Researchers analyzed the effects of wildfires on plant cover and soil quality in the last 40 years. The findings of the study show that the forest is highly vulnerable even in well-conserved areas far from the 'deforestation arc'.

FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Research News

Image: Forest destroyed by fire in the middle Negro region. The effects of wildfires on plant cover and soil quality. (Credit: Bernardo Monteiro Flores)

Agência FAPESP – White-sand savannas are expanding in the heart of the Amazon as a result of recurring forest fires, according to a study published in the journal Ecosystems.

The study was supported by FAPESP, and conducted by Bernardo Monteiro Flores, currently a postdoctoral fellow in ecology at the Federal University of Santa Catarina (UFSC) in Brazil, and Milena Holmgren, a professor in the Department of Environmental Sciences at Wageningen University in the Netherlands.

“The edges of the Amazon Rainforest have long been considered the most vulnerable parts owing to expansion of the agricultural frontier. This degradation of the forest along the so-called ‘deforestation arc’ [a curve that hugs the southeastern edge of the forest] continues to occur and is extremely troubling. However, our study detected the appearance of savannas in the heart of the Amazon a long way away from the agricultural frontier,” Flores told Agência FAPESP.

The authors studied an area of floodplains on the middle Negro River near Barcelos, a town about 400 km upstream of Manaus, the capital of Amazonas state, where areas of white-sand savanna are expanding, although forest ecosystems still predominate. They blame the increasing frequency and severity of wildfires in the wider context of global climate change.

“We mapped 40 years of forest fires using satellite images, and collected detailed information in the field to see whether the burned forest areas were changing,” Flores said. “When we analyzed tree species richness and soil properties at different times in the past, we found that forest fires had killed practically all trees so that the clayey topsoil could be eroded by annual flooding and become increasingly sandy.”

They also found that as burnt floodplain forest naturally recovers, there is a major shift in the type of vegetation, with native herbaceous cover expanding, forest tree species disappearing, and white-sand savanna tree species becoming dominant.

Less resilient

Where do the savanna tree species come from? According to Flores, white-sand savannas are part of the Amazon ecosystem, covering about 11% of the biome. They are ancient savannas and very different from the Cerrado with its outstanding biodiversity, yet even so they are home to many endemic plant species. They are called campinas by the local population. Seen from above, the Amazon is an ocean of forest punctuated by small islands of savanna. The seeds of savanna plants are distributed by water, fish and birds, and are more likely than forest species to germinate when they reach a burnt area with degraded soil, repopulating the area concerned.

“Our research shows native savanna cover is expanding and may continue expanding in the Amazon. Not along the ‘deforestation arc’, where exotic grasses are spreading, but in remote areas throughout the basin where white-sand savannas already exist,” Flores said.

It is important to stress that in the Amazon, floodplain forest is far less resilient than upland (terra firme) forest. It burns more easily, after which its topsoil is washed away and degrades much more rapidly. "Floodplain forest is the 'Achilles heel' of the Amazon," Holmgren said. "We have field evidence that if the climate becomes drier in the Amazon and wildfires become more severe and frequent, floodplain forest will be the first to collapse."

These two factors – a drier climate, and more severe and frequent fires – are already in play as part of the ongoing climate change crisis. The study shows that wildfires in the middle Negro area during the severe 2015-16 El Niño burned down an area seven times larger than the total area destroyed by fire in the preceding 40 years.

“The additional loss of floodplain forest could result in huge emissions of carbon stored in trees, soil and peatlands, as well as reducing supplies of resources used by local people, such as fish and forest products. The new discoveries reinforce the urgency of defending remote forest areas. For example, a fire management program should be implemented to reduce the spread of wildfires during the dry season,” Flores said.

The article “White-sand savannas expand at the core of the Amazon after forest wildfires” is at: link.springer.com/article/10.1007%2Fs10021-021-00607-x.

 

Johns Hopkins scientists model Saturn's interior

Researchers simulate conditions necessary for planet's unique magnetic field

JOHNS HOPKINS UNIVERSITY

Research News

Image: The magnetic field of Saturn seen at the surface. (Credit: Ankit Barik/Johns Hopkins University)

New Johns Hopkins University simulations offer an intriguing look into Saturn's interior, suggesting that a thick layer of helium rain influences the planet's magnetic field.

The models, published this week in AGU Advances, also indicate that Saturn's interior may feature higher temperatures at the equatorial region, with lower temperatures at the high latitudes at the top of the helium rain layer.

It is notoriously difficult to study the interior structures of large gaseous planets, and the findings advance the effort to map Saturn's hidden regions.

"By studying how Saturn formed and how it evolved over time, we can learn a lot about the formation of other planets similar to Saturn within our own solar system, as well as beyond it," said co-author Sabine Stanley, a Johns Hopkins planetary physicist.

Saturn stands out among the planets in our solar system because its magnetic field appears to be almost perfectly symmetrical around the rotation axis. Detailed measurements of the magnetic field gleaned from the last orbits of NASA's Cassini mission provide an opportunity to better understand the planet's deep interior, where the magnetic field is generated, said lead author Chi Yan, a Johns Hopkins PhD candidate.

By feeding data gathered by the Cassini mission into powerful computer simulations similar to those used to study weather and climate, Yan and Stanley explored what ingredients are necessary to produce the dynamo--the electromagnetic conversion mechanism--that could account for Saturn's magnetic field.

"One thing we discovered was how sensitive the model was to very specific things like temperature," said Stanley, who is also a Bloomberg Distinguished Professor at Johns Hopkins in the Department of Earth & Planetary Sciences and the Space Exploration Sector of the Applied Physics Lab. "And that means we have a really interesting probe of Saturn's deep interior as far as 20,000 kilometers down. It's a kind of X-ray vision."

Strikingly, Yan and Stanley's simulations suggest that a slight degree of non-axisymmetry could actually exist near Saturn's north and south poles.

"Even though the observations we have from Saturn look perfectly symmetrical, in our computer simulations we can fully interrogate the field," said Stanley.

Direct observation at the poles would be necessary to confirm it, but the finding could have implications for understanding another problem that has vexed scientists for decades: how to measure the rate at which Saturn rotates, or, in other words, the length of a day on the planet.

This project was conducted using computational resources at the Maryland Advanced Research Computing Center (MARCC).


Image: Saturn's interior with a stably stratified helium-insoluble layer. (Credit: Yi Zheng, HEMI/MICA Extreme Arts Program)


 

New application of AI just removed one of the biggest roadblocks in astrophysics

Using neural networks, Flatiron Institute research fellow Yin Li and his colleagues simulated vast, complex universes in a fraction of the time it takes with conventional methods

SIMONS FOUNDATION

Research News

Image: Simulations of a region of space 100 million light-years square. The leftmost simulation ran at low resolution. Using machine learning, researchers upscaled the low-res model to create a high-resolution simulation... (Credit: Y. Li et al./Proceedings of the National Academy of Sciences 2021)

Using a bit of machine learning magic, astrophysicists can now simulate vast, complex universes in a thousandth of the time it takes with conventional methods. The new approach will help usher in a new era in high-resolution cosmological simulations, its creators report in a study published online May 4 in Proceedings of the National Academy of Sciences.

"At the moment, constraints on computation time usually mean we cannot simulate the universe at both high resolution and large volume," says study lead author Yin Li, an astrophysicist at the Flatiron Institute in New York City. "With our new technique, it's possible to have both efficiently. In the future, these AI-based methods will become the norm for certain applications."

The new method developed by Li and his colleagues feeds a machine learning algorithm with models of a small region of space at both low and high resolutions. The algorithm learns how to upscale the low-res models to match the detail found in the high-res versions. Once trained, the code can take full-scale low-res models and generate 'super-resolution' simulations containing up to 512 times as many particles.

The process is akin to taking a blurry photograph and adding the missing details back in, making it sharp and clear.

This upscaling brings significant time savings. For a region in the universe roughly 500 million light-years across containing 134 million particles, existing methods would require 560 hours to churn out a high-res simulation using a single processing core. With the new approach, the researchers need only 36 minutes.

The results were even more dramatic when more particles were added to the simulation. For a universe 1,000 times as large with 134 billion particles, the researchers' new method took 16 hours on a single graphics processing unit. Existing methods would take so long that they wouldn't even be worth running without dedicated supercomputing resources, Li says.
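For orientation, the speedup implied by those timings works out to roughly 900x, though the two runs used different hardware; the snippet below is a minimal sketch of the arithmetic, not a controlled benchmark.

```python
# Rough speedup implied by the timings quoted above (illustrative only:
# the conventional run used a single CPU core, the new method a GPU,
# so this is not an apples-to-apples comparison).

conventional_hours = 560           # high-res run of 134 million particles, one core
new_method_minutes = 36            # same volume with the trained network

speedup = conventional_hours * 60 / new_method_minutes
print(f"~{speedup:.0f}x faster for the 134-million-particle volume")  # ~930x

# For the 134-billion-particle volume, the new method took 16 hours on one GPU;
# the authors say a conventional run at that scale is impractical to time.
```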

Li is a joint research fellow at the Flatiron Institute's Center for Computational Astrophysics and the Center for Computational Mathematics. He co-authored the study with Yueying Ni, Rupert Croft and Tiziana Di Matteo of Carnegie Mellon University; Simeon Bird of the University of California, Riverside; and Yu Feng of the University of California, Berkeley.

Cosmological simulations are indispensable for astrophysics. Scientists use the simulations to predict how the universe would look in various scenarios, such as if the dark energy pulling the universe apart varied over time. Telescope observations may then confirm whether the simulations' predictions match reality. Creating testable predictions requires running simulations thousands of times, so faster modeling would be a big boon for the field.

Reducing the time it takes to run cosmological simulations "holds the potential of providing major advances in numerical cosmology and astrophysics," says Di Matteo. "Cosmological simulations follow the history and fate of the universe, all the way to the formation of all galaxies and their black holes."

So far, the new simulations only consider dark matter and the force of gravity. While this may seem like an oversimplification, gravity is by far the universe's dominant force at large scales, and dark matter makes up 85 percent of all the 'stuff' in the cosmos. The particles in the simulation aren't literal dark matter particles but are instead used as trackers to show how bits of dark matter move through the universe.

The team's code used neural networks to predict how gravity would move dark matter around over time. Such networks ingest training data and run calculations using the information. The results are then compared to the expected outcome. With further training, the networks adapt and become more accurate.

The specific approach used by the researchers, called a generative adversarial network, pits two neural networks against each other. One network takes low-resolution simulations of the universe and uses them to generate high-resolution models. The other network tries to tell those simulations apart from ones made by conventional methods. Over time, both neural networks get better and better until, ultimately, the simulation generator wins out and creates fast simulations that look just like the slow conventional ones.
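To make the generative-adversarial idea concrete, here is a minimal PyTorch sketch of the training loop described above, using toy 3D density grids and a 2x-per-dimension upscale. The published model's architecture, loss terms, training data, and scale factor (up to 512 times as many particles, roughly 8x per dimension) differ; this is only an illustration of the two-network setup.

```python
# Minimal sketch of a super-resolution GAN on toy 3D grids (not the published model).
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Upscales a low-res 3D field by 2x per dimension (the real model goes further)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.ConvTranspose3d(32, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(32, 1, kernel_size=3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores whether a high-res field looks like output of a full simulation."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 32, kernel_size=4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv3d(32, 64, kernel_size=4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv3d(64, 1, kernel_size=1),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),   # one score per sample
        )
    def forward(self, x):
        return self.net(x)

gen, disc = Generator(), Discriminator()
g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(100):                        # toy loop; real training runs far longer
    low_res  = torch.rand(4, 1, 16, 16, 16)    # stand-in for low-res simulation patches
    high_res = torch.rand(4, 1, 32, 32, 32)    # stand-in for matching high-res patches

    # Discriminator step: real high-res patches vs. generator output.
    fake = gen(low_res).detach()
    d_loss = bce(disc(high_res), torch.ones(4, 1)) + bce(disc(fake), torch.zeros(4, 1))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator step: try to fool the discriminator.
    g_loss = bce(disc(gen(low_res)), torch.ones(4, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```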

"We couldn't get it to work for two years," Li says, "and suddenly it started working. We got beautiful results that matched what we expected. We even did some blind tests ourselves, and most of us couldn't tell which one was 'real' and which one was 'fake.'"

Despite only being trained using small areas of space, the neural networks accurately replicated the large-scale structures that only appear in enormous simulations.

The simulations don't capture everything, though. Because they focus only on dark matter and gravity, smaller-scale phenomena -- such as star formation, supernovae and the effects of black holes -- are left out. The researchers plan to extend their methods to include the forces responsible for such phenomena, and to run their neural networks 'on the fly' alongside conventional simulations to improve accuracy. "We don't know exactly how to do that yet, but we're making progress," Li says.

###

Why robots need reflexes - interview

Robots could safeguard people from pain

TECHNICAL UNIVERSITY OF MUNICH (TUM)

Research News

Reflexes protect our bodies - for example when we pull our hand back from a hot stove. These protective mechanisms could also be useful for robots. In this interview, Prof. Sami Haddadin and Johannes Kühn of the Munich School of Robotics and Machine Intelligence (MSRM) of the Technical University of Munich (TUM) explain why giving test subjects a "slap on the hand" could lay the foundations for the robots of the future.

In your paper, published in Scientific Reports, you describe an experimental setup where people were actually slapped on the hand - to study their reflexes....

Kühn: Yes, you can put it that way. For our study, in cooperation with Imperial College London, the test subjects needed their reflexes to protect them against two different pain sources: first, a slap on the hand. And, while pulling their hand and arm out of harm's way, they also had to avoid an elbow obstacle. We studied the hand retraction and discovered that it is a highly coordinated motion.

We also observed that the pain anticipated by a person shapes the reflex: If I know that the object behind me will cause similar pain to the slap on my fingers, I will withdraw my hand differently than when I know that the object will cause no pain.

How can such a seemingly simple experiment contribute to the development of intelligent high-tech machines like robots?

Haddadin: Humans have fascinating abilities. One could speak of a built-in intelligence in the human body that is indispensable for survival. The protective reflex is a central part of this. Imagine the classical "hand on the hot stove" situation. Without thinking, we pull back our hand as soon as the skin senses heat. So far, robots do not have reflexes of this kind. Their reactions to impending collisions tend to be rather mindless: They just stop and don't move until a person takes action.

In some situations this might make sense. But if a robot simply stopped moving when touching a hot stove, this would obviously have fatal consequences. At the MSRM we are therefore interested in developing autonomous and intelligent reflex mechanisms as part of a central nervous system for robots, so to speak. Humans are serving as our role models. How do their reflexes work and what can we learn from them for the development of intelligent robots?

What conclusions can you draw from your experiment for the development of robots?

Kühn: We gained an insight into how the reflex motion works in detail: The way humans coordinate the reflex can be seen as throwing the shoulder forward, in a sense, in order to accelerate the withdrawal of the hand. This principle could be applied in the development of reflex motions in humanoid robots, with a signal sent to one part of a robot in order to influence the motion of another one.

This knowledge will also be helpful in the design of robot-enabled prosthetics that are expected to perform in "human-like" ways.

You mentioned that "anticipated pain" played a role in your experiment. Should robots be able to anticipate pain, too?

Kühn: That would be a big advantage. It could help to classify potential collisions based on danger levels - and to plan evasive actions if appropriate. This would not only ensure the safety of the robot.

If the robot were capable of anticipating human pain, it could intervene in a dangerous situation to save a person from experiencing this pain.

Would robots then need to learn how to feel pain in the same way as humans?

Haddadin: No. Our pain perception is highly complex and linked to emotions. So we can't compare this to a human's "pain sensation". Robots are tools and not living creatures. Artificial pain is nothing more than a technical signal based on data from various sensors. At the MSRM we have already developed an initial reflex mechanism for robots based on "artificial pain". When touching hot or sharp objects, our robot withdrew its arm in a reflexive movement.
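As Haddadin describes it, "artificial pain" is just a technical signal derived from sensor data. The sketch below is a minimal illustration of such a thresholded reflex; the sensor names, thresholds, and retract action are invented for the example and do not describe the MSRM robot.

```python
# Illustrative reflex triggered by an "artificial pain" signal.
# All sensors, thresholds, and actions here are invented for the example.
from dataclasses import dataclass

@dataclass
class SensorReading:
    temperature_c: float    # contact temperature
    contact_force_n: float  # measured contact force

def artificial_pain(reading: SensorReading) -> float:
    """Map raw sensor data to a scalar 'pain' signal in [0, 1]."""
    heat_pain = max(0.0, (reading.temperature_c - 45.0) / 30.0)    # hot surfaces
    force_pain = max(0.0, (reading.contact_force_n - 20.0) / 80.0)  # hard impacts
    return min(1.0, max(heat_pain, force_pain))

def reflex_step(reading: SensorReading, pain_threshold: float = 0.2) -> str:
    """One control cycle: withdraw reflexively instead of merely stopping."""
    pain = artificial_pain(reading)
    if pain >= pain_threshold:
        return f"RETRACT arm (pain={pain:.2f})"
    return "CONTINUE task"

print(reflex_step(SensorReading(temperature_c=70.0, contact_force_n=5.0)))  # RETRACT
print(reflex_step(SensorReading(temperature_c=25.0, contact_force_n=3.0)))  # CONTINUE
```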

What are your next steps on the way to a robot with a fully developed protective reflex?

Haddadin: The big challenge in our research field between humans and machines is that we still have only a rudimentary understanding of our role model: the human reflex system and the sensorimotor learning mechanisms of a complex neuromechanical motion apparatus. And that is where the exciting scientific challenge lies: with all of the unknowns, to continually improve the human-inspired abilities of our intelligent machines, while using what we learn to arrive at a better understanding of how humans function. Basically, we can say that this work has continued since the days of Leonardo da Vinci and will carry on for many years to come.

###

An uncrackable combination of invisible ink and artificial intelligence

AMERICAN CHEMICAL SOCIETY

Research News

Image: With regular ink, a computer trained with the codebook decodes "STOP" (top); when UV light is shone on the paper, the invisible ink is exposed, and the real message... (Credit: adapted from ACS Applied Materials & Interfaces 2021, DOI: 10.1021/acsami.1c01179)

Coded messages in invisible ink sound like something only found in espionage books, but in real life, they can have important security purposes. Yet, they can be cracked if their encryption is predictable. Now, researchers reporting in ACS Applied Materials & Interfaces have printed complexly encoded data with normal ink and a carbon nanoparticle-based invisible ink, requiring both UV light and a computer that has been taught the code to reveal the correct messages.

Even as electronic records advance, paper is still a common way to preserve data. Invisible ink can hide classified economic, commercial or military information from prying eyes, but many popular inks contain toxic compounds or can be seen with predictable methods, such as light, heat or chemicals. Carbon nanoparticles, which have low toxicity, can be essentially invisible under ambient lighting but can create vibrant images when exposed to ultraviolet (UV) light - a modern take on invisible ink. In addition, advances in artificial intelligence (AI) models -- made by networks of processing algorithms that learn how to handle complex information -- can ensure that messages are only decipherable on properly trained computers. So, Weiwei Zhao, Kang Li, Jie Xu and colleagues wanted to train an AI model to identify and decrypt symbols printed in a fluorescent carbon nanoparticle ink, revealing hidden messages when exposed to UV light.

The researchers made carbon nanoparticles from citric acid and cysteine, which they diluted with water to create an invisible ink that appeared blue when exposed to UV light. The team loaded the solution into an ink cartridge and printed a series of simple symbols onto paper with an inkjet printer. Then, they taught an AI model, composed of multiple algorithms, to recognize symbols illuminated by UV light and decode them using a special codebook. Finally, they tested the AI model's ability to decode messages printed using a combination of both regular red ink and the UV fluorescent ink. With 100% accuracy, the AI model read the regular ink symbols as "STOP", but when UV light was shone on the writing, the invisible ink revealed the desired message "BEGIN". Because these algorithms can notice minute modifications in symbols, this approach has the potential to encrypt messages securely using hundreds of different unpredictable symbols, the researchers say.
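The division of labor between the printed symbols, the UV channel, and the codebook can be pictured with a small sketch. The symbols, codebook, and stand-in "classifier" below are invented for illustration; the published work trains a neural network on images of the actual printed symbols.

```python
# Illustrative two-channel decoding: the same page decodes to different messages
# depending on whether it is read under ambient light (regular ink only) or UV
# light (fluorescent ink visible). Symbols and codebook are made up.

CODEBOOK = {
    "circle": "S", "square": "T", "triangle": "O", "cross": "P",
    "star": "B", "ring": "E", "dot": "G", "bar": "I", "wave": "N",
}

def visible_symbols(printed, uv_light):
    """Stand-in for the trained classifier: which symbol is read at each position.
    Under UV light the fluorescent-ink symbol (if any) is visible and read instead."""
    return [uv if (uv_light and uv) else plain for plain, uv in printed]

def decode(printed, uv_light):
    return "".join(CODEBOOK[s] for s in visible_symbols(printed, uv_light) if s)

# Each position on the page: (symbol in regular ink, symbol in invisible UV ink)
page = [("circle", "star"), ("square", "ring"), ("triangle", "dot"),
        ("cross", "bar"), (None, "wave")]

print(decode(page, uv_light=False))  # STOP   (regular ink only)
print(decode(page, uv_light=True))   # BEGIN  (hidden message under UV light)
```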

###

The authors acknowledge funding from the Shenzhen Peacock Team Plan and the Bureau of Industry and Information Technology of Shenzhen through the Graphene Manufacturing Innovation Center (201901161514).

The abstract that accompanies this paper can be viewed here.

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS' mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world's scientific knowledge. ACS' main offices are in Washington, D.C., and Columbus, Ohio.

HEPA filter effectively reduces airborne respiratory particles generated during vigorous exercise

MAYO CLINIC

Research News

ROCHESTER, Minn. -- A pair of Mayo Clinic studies shed light on something that is typically difficult to see with the eye: respiratory aerosols. Such aerosol particles of varying sizes are a common component of breath, and they are a typical mode of transmission for respiratory viruses such as SARS-CoV-2, the virus that causes COVID-19, to spread to other people and surfaces.

Researchers who conduct exercise stress tests for heart patients at Mayo Clinic found that exercising at increasing levels of exertion increased the aerosol concentration in the surrounding room. They also found that a high-efficiency particulate air (HEPA) device effectively filtered out the aerosols and decreased the time needed to clear the air between patients.

"Our work was conducted with the support of Mayo Cardiovascular Medicine leadership who recognized right at the start of the pandemic that special measures would be required to protect patients and staff from COVID-19 while continuing to provide quality cardiovascular care to all who needed it," says Thomas Allison, Ph.D., director of Cardiopulmonary Exercise Testing at Mayo Clinic in Rochester. "Since there was no reliable guidance on how to do this, we put a research team together to find answers through scientific testing and data. We are happy to now share our findings with everyone around the world." Dr. Allison is senior author of both studies.

To characterize the aerosols generated during various intensities of exercise in the first study, Dr. Allison's team set up a special aerosol laboratory in a plastic tent with controlled airflow. Two types of laser beam particle counters were used to measure aerosol concentration at the front, back and sides of a person riding an exercise bike. Eight exercise volunteers wore equipment to measure their oxygen consumption, ventilation and heart rate.

During testing, a volunteer first had five minutes of resting breathing, followed by four three-minute bouts of exercise, staged with monitoring and coaching to work at 25%, 50%, 75% and 100% of their age-predicted heart rate. This effort was followed by three minutes of cooldown. The findings were published online in CHEST.

The aerosol concentrations increased exponentially throughout the test. Specifically, exercise at or above 50% of age-predicted heart rate showed significant increases in aerosol concentration.

"In a real sense, I think we have proven dramatically what many suspected ? that is why gyms were shut down and most exercise testing laboratories closed their practices. Exercise testing was not listed as an aerosol-generating procedure prior to our studies because no one had specifically studied it before. Exercise generates millions of respiratory aerosols during a test, many of a size reported to have virus-carrying potential. The higher the exercise intensity, the more aerosols are produced," says Dr. Allison.

The follow-up study led by Dr. Allison focused on how to mitigate the aerosols generated during exercise testing by filtering them out of the air immediately after they came out of the subject's mouth. Researchers used a similar setup with the controlled airflow exercise tent, particle counter and stationary bike, but added a portable HEPA filter with a flume hood.

Six healthy volunteers completed the same 20-minute exercise test as the previous study, first without the mitigation and then with the portable HEPA filter running.

Also, a separate experiment tested aerosol clearance time in the clinical exercise testing laboratories by using artificially generated aerosols to test how long it took for 99.9% of aerosols to be removed. Researchers performed the test first with only existing heating, ventilation and air conditioning, and then with the addition of the portable HEPA filter running.

"Studying clearance time informed us of how soon we could safely bring a new patient into the laboratory after finishing the test on the previous patient. HEPA filters cut this time by 50%, allowing the higher volume of testing necessary to meet the clinical demands of our Cardiovascular Medicine practice," says Dr. Allison.

"We translated CDC (Centers for Disease Control and Prevention) guidelines for aerosol mitigation with enhanced airflow through HEPA filters and showed that it worked amazingly well for exercise testing. We found that 96% plus or minus 2% of aerosols of all sizes generated during heavy exercise were removed from the air by the HEPA filter. As a result, we have been able to return to our practice of performing up to 100 stress tests per day without any recorded transmission of COVID in our exercise testing laboratories," says Dr. Allison.

###

About Mayo Clinic

Mayo Clinic is a nonprofit organization committed to innovation in clinical practice, education and research, and providing compassion, expertise and answers to everyone who needs healing. Visit the Mayo Clinic News Network for additional Mayo Clinic news. For information on COVID-19, including Mayo Clinic's Coronavirus Map tracking tool, which has 14-day forecasting on COVID-19 trends, visit the Mayo Clinic COVID-19 Resource Center.

UNC Charlotte researchers analyzed the host origins of SARS-CoV-2 and other coronaviruses

UNIVERSITY OF NORTH CAROLINA AT CHARLOTTE

Research News

Image: This tree is a summary of the selected host transformations in the clade of Betacoronavirus associated with SARS-CoV, MERS-CoV, and SARS-CoV-2. Bats have been fundamental hosts of these human coronaviruses... (Credit: Denis Jacob Machado)

Coronavirus (CoV) infection in animals and humans is not new. The earliest papers in the scientific literature on coronavirus infection date to 1966. However, prior to SARS-CoV, MERS-CoV, and SARS-CoV-2, very little attention had been paid to coronaviruses.

Suddenly, coronaviruses changed everything we know about personal and public health, and societal and economic well-being. The change led to rushed analyses to understand the origins of coronaviruses in humans. This rush has led to a thus far fruitless search for intermediate hosts (e.g., civet in SARS-CoV and pangolin in SARS-CoV-2) rather than focusing on the important work, which has always been surveillance of SARS-like viruses in bats.

To clarify the origins of coronavirus' infections in humans, researchers from the Bioinformatics Research Center (BRC) at the University of North Carolina at Charlotte (UNC Charlotte) performed the largest and most comprehensive evolutionary analyses to date. The UNC Charlotte team analyzed over 2,000 genomes of diverse coronaviruses that infect humans or other animals.

"We wanted to conduct evolutionary analyses based on the most rigorous standards of the field," said Denis Jacob Machado, the first author of the paper. "We've seen rushed analyses that had different problems. For example, many analyses had poor sampling of viral diversity or placed excessive emphasis on overall similarity rather than on the characteristics shared due to common evolutionary history. It was very important to us to avoid those mistakes to produce a sound evolutionary hypothesis that could offer reliable information for future research."

The study's major conclusions are:

    1) Bats have been ancestral hosts of human coronaviruses in the case of SARS-CoV and SARS-CoV-2. Bats also were the ancestral hosts of MERS-CoV infections in dromedary camels that spread rapidly to humans.

    2) Transmission of MERS-CoV among camels and their herders evolved after the transmission from bats to these hosts. Similarly, there was transmission of SARS-CoV after the bat to human transmission among human vendors and their civets. These events are similar to the transmission of SARS-CoV-2 by fur farmers to their minks. The evolutionary analysis in this study helps to elucidate that these events occurred after the original human infection from lineages of coronaviruses hosted in bats. Therefore, these secondary transmissions to civet or mink did not play a role in the fundamental emergence of human coronaviruses.

    3) The study corroborates the animal host origins of other human coronaviruses, such as HCoV-NL63 (from bat hosts), HCoV-229E (from camel hosts), HCoV-HKU1 (from rodent hosts) and HCoV-OC43 and HECV-4408 (from cow hosts).

    4) Transmission of coronaviruses from animals to humans occurs episodically. From 1966 to 2020, the scientific community has described eight human-hosted lineages of coronaviruses. Although it is difficult to predict when a new human hosted coronavirus could emerge, the data indicate that we should prepare for that possibility.

"As coronavirus transmission from animal to human host occurs episodically at unpredictable intervals, it is not wise to attempt to time when we will experience the next human coronavirus," noted professor Daniel A. Janies, Carol Grotnes Belk Distinguished Professor of Bioinformatics and Genomics and team leader for the study. "We must conduct research on viruses that can be transferred from animals to humans on a continuous rather than reactionary basis."

###

"Fundamental evolution of all Orthocoronavirinae including three deadly lineages descendent from Chiroptera-hosted coronaviruses: SARS-CoV, MERS-CoV, and SARS-CoV-2" was published online in the journal Cladistics on April 26, 2021. The authors are Denis Jacob Machado, Rachel Scott, Sayal Guirales, and Daniel A. Janies. The article's digital object number is 10.1111/cla.12454.

Article's URL: http://doi.org/10.1111/cla.12454.

I WANT ONE

Personalized sweat sensor reliably monitors blood glucose without finger pricks

AMERICAN CHEMICAL SOCIETY

Research News

Image: A hand-held device combined with a touch sweat sensor (strip at right) measures glucose in sweat, while a personalized algorithm converts that data into a blood glucose level. (Credit: adapted from ACS Sensors 2021, DOI: 10.1021/acssensors.1c00139)

Many people with diabetes endure multiple, painful finger pricks each day to measure their blood glucose. Now, researchers reporting in ACS Sensors have developed a device that can measure glucose in sweat with the touch of a fingertip, and then a personalized algorithm provides an accurate estimate of blood glucose levels.

According to the American Diabetes Association, more than 34 million children and adults in the U.S. have diabetes. Although self-monitoring of blood glucose is a critical part of diabetes management, the pain and inconvenience caused by finger-stick blood sampling can keep people from testing as often as they should. Scientists have developed ways to measure glucose in sweat, but because levels of the sugar are much lower than in blood, they can vary with a person's sweat rate and skin properties. As a result, the glucose level in sweat usually doesn't accurately reflect the value in blood. To obtain a more reliable estimate of blood sugar from sweat, Joseph Wang and colleagues wanted to devise a system that could collect sweat from a fingertip, measure glucose and then correct for individual variability.

The researchers made a touch-based sweat glucose sensor with a polyvinyl alcohol hydrogel on top of an electrochemical sensor, which was screen-printed onto a flexible plastic strip. When a volunteer placed their fingertip on the sensor surface for 1 minute, the hydrogel absorbed tiny amounts of sweat. Inside the sensor, glucose in the sweat underwent an enzymatic reaction that resulted in a small electrical current that was detected by a hand-held device. The researchers also measured the volunteers' blood sugar with a standard finger-prick test, and they developed a personalized algorithm that could translate each person's sweat glucose to their blood glucose levels. In tests, the algorithm was more than 95% accurate in predicting blood glucose levels before and after meals. To calibrate the device, a person with diabetes would need a finger prick only once or twice per month. But before the sweat diagnostic can be used to manage diabetes, a large-scale study must be conducted, the researchers say.
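The paper describes the personalized algorithm only in general terms. A simple per-person linear calibration, sketched below with made-up numbers, illustrates how an occasional finger-prick reading could anchor the sweat-to-blood conversion; this is an assumption for illustration, not the authors' actual model.

```python
# Illustrative per-person calibration from sweat-glucose signal to blood glucose.
# All numbers are made up; the published algorithm is not specified here.
import numpy as np

# Calibration pairs for one individual: (sweat sensor current in uA,
# finger-prick blood glucose in mg/dL), collected once or twice a month.
sweat_signal = np.array([0.8, 1.1, 1.6, 2.0])      # made-up sensor readings
blood_glucose = np.array([92., 110., 145., 170.])  # made-up reference values

# Least-squares fit of blood_glucose ~ a * sweat_signal + b for this person.
a, b = np.polyfit(sweat_signal, blood_glucose, deg=1)

def estimate_blood_glucose(sweat_reading: float) -> float:
    """Convert a new sweat reading to an estimated blood glucose (mg/dL)."""
    return a * sweat_reading + b

print(f"Estimated blood glucose at 1.4 uA: {estimate_blood_glucose(1.4):.0f} mg/dL")
```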

###

The authors acknowledge funding from the University of California San Diego Center for Wearable Sensors and the National Research Foundation of Korea.

The abstract that accompanies this paper is available here.

