Monday, May 23, 2022

Low-cost gel film can pluck drinking water from desert air

A prototype device for capturing water from the air using the new film. 
Credit: University of Texas at Austin

More than a third of the world's population lives in drylands, areas that experience significant water shortages. Scientists and engineers at The University of Texas at Austin have developed a solution that could help people in these areas access clean drinking water.

The team developed a low-cost gel film made of abundant materials that can pull water from the air in even the driest climates. The materials that facilitate this reaction cost a mere $2 per kilogram, and a single kilogram can produce more than 6 liters of water per day in areas with less than 15% relative humidity and 13 liters per day in areas with up to 30% relative humidity.
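
As a rough sense of scale, the figures above can be turned into a cost-per-liter estimate. The short Python sketch below uses only the numbers quoted in this article; the 30-day service life is a hypothetical assumption, since the article does not say how long a film lasts before it needs to be replaced.

```python
# Back-of-the-envelope check of the reported figures (values taken from the
# article; real-world output depends on device design and climate).
MATERIAL_COST_PER_KG = 2.0   # USD per kilogram of gel film
YIELD_LOW_RH = 6.0           # liters/day per kg at <15% relative humidity
YIELD_MID_RH = 13.0          # liters/day per kg at up to 30% relative humidity

def water_cost_per_liter(yield_l_per_day: float, lifetime_days: float = 30.0) -> float:
    """Material cost per liter, assuming (hypothetically) the film is replaced
    after `lifetime_days` of use; the article does not state a service life."""
    total_liters = yield_l_per_day * lifetime_days
    return MATERIAL_COST_PER_KG / total_liters

if __name__ == "__main__":
    for label, y in [("<15% RH", YIELD_LOW_RH), ("~30% RH", YIELD_MID_RH)]:
        print(f"{label}: {y:.0f} L/day per kg, "
              f"~${water_cost_per_liter(y):.4f} material cost per liter over 30 days")
```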

The research builds on previous breakthroughs from the team, including the ability to pull water out of the atmosphere and the application of that technology to create self-watering soil. However, these technologies were designed for relatively high-humidity environments.

"This new work is about practical solutions that people can use to get water in the hottest, driest places on Earth," said Guihua Yu, professor of materials science and  in the Cockrell School of Engineering's Walker Department of Mechanical Engineering. "This could allow millions of people without consistent access to drinking water to have simple, water generating devices at home that they can easily operate."

The water-capturing film can easily be molded into many different shapes.
 Credit: University of Texas at Austin

The new paper appears in Nature Communications.

The researchers used renewable cellulose and a common kitchen ingredient, konjac gum, as the main hydrophilic (water-attracting) skeleton. The open-pore structure of the gum speeds the moisture-capturing process. Another designed component, a thermo-responsive cellulose that develops hydrophobic (water-resistant) interactions when heated, helps release the collected water quickly so that the overall energy input needed to produce water is minimized.

Other attempts at pulling water from desert air are typically energy-intensive and do not produce much. And although 6 liters does not sound like much, the researchers say that creating thicker films or absorbent beds or arrays with optimization could drastically increase the amount of water they yield.

The reaction itself is a simple one, the researchers said, which reduces the challenges of scaling it up and achieving mass usage.

The process of creating the water-capturing film from its ingredients. 
Credit: University of Texas at Austin

"This is not something you need an advanced degree to use," said Youhong "Nancy" Guo, the lead author on the paper and a former doctoral student in Yu's lab, now a postdoctoral researcher at the Massachusetts Institute of Technology. "It's straightforward enough that anyone can make it at home if they have the materials."

The film is flexible and can be molded into a variety of shapes and sizes, depending on the need of the user. Making the film requires only the gel precursor, which includes all the relevant ingredients poured into a mold.

"The gel takes 2 minutes to set simply. Then, it just needs to be freeze-dried, and it can be peeled off the mold and used immediately after that," said Weixin Guan, a doctoral student on Yu's team and a lead researcher of the work. The researchers envision this as something that people could someday buy at a hardware store and use in their homes because of the simplicity.Solar-powered moisture harvester collects and cleans water from air

More information: Youhong Guo et al, Scalable super hygroscopic polymer films for sustainable moisture harvesting in arid environments, Nature Communications (2022). DOI: 10.1038/s41467-022-30505-2

Journal information: Nature Communications 

Provided by University of Texas at Austin 

Major Step Forward in Monitoring Ocean Health Through “DNA Soup”


By MONTEREY BAY AQUARIUM RESEARCH INSTITUTE MAY 23, 2022


When outfitted with a groundbreaking “laboratory in a can” to sample environmental DNA (eDNA), nimble robots like MBARI’s long-range autonomous underwater vehicle (LRAUV) can expand the monitoring of ocean health. Credit: © 2021 MBARI/Monterey Bay Aquarium

Autonomous technology uses eDNA to survey biodiversity.

In a major step forward for monitoring the biodiversity of marine systems, a new research study published on May 17, 2022, in the journal Environmental DNA details how Monterey Bay Aquarium Research Institute (MBARI) researchers are using autonomous underwater robots to sample environmental DNA (eDNA). eDNA allows scientists to detect the presence of aquatic species from the tiny bits of genetic material they leave behind. This “DNA soup” offers clues about biodiversity changes in sensitive areas, the presence of rare or endangered species, and the spread of invasive species—all critical to understanding, promoting, and preserving a healthy ocean.

Researchers combined two novel autonomous platforms developed by MBARI for this study: the long-range autonomous underwater vehicle (LRAUV) and the Environmental Sample Processor (ESP). The LRAUV is a nimble underwater robot that can travel to remote areas of the ocean for extended periods of time. The ESP is a robotic “laboratory-in-a-can” that filters seawater and preserves eDNA for future study. By equipping an LRAUV with ESP technology, researchers can expand the scale of ocean monitoring over time and space. By comparison, traditional sampling of eDNA in the ocean requires weeks on an expensive research vessel limited to a localized area. Technology innovations like this are revolutionizing ocean conservation efforts.



MBARI researchers launch a long-range autonomous underwater vehicle (LRAUV) from the R/V Paragon in Monterey Bay. These nimble robots can travel to remote areas of the ocean that are difficult for a crewed vessel to access. Credit: Kim Fulton-Bennett © 2014 MBARI

“We know that eDNA is an incredibly powerful tool for studying ocean communities, but we’ve been limited by what we can accomplish using crewed research vessels. Now, autonomous technology is helping us make better use of our time and resources to study new parts of the ocean,” said Kobun Truelove, a biological oceanographer at MBARI and the lead author on the paper.

Marine biodiversity is a measure of the abundance of individuals and species in the ocean. This interconnected mosaic of organisms—from the smallest plankton to the largest whales—supports food webs, produces the air we breathe, and regulates our climate. Autonomous tools like the LRAUV and ESP enable MBARI researchers to maintain a persistent presence in the ocean and monitor changes in sensitive ecosystems in ways that were not possible previously.

“Organisms move as conditions change in our oceans and Great Lakes, affecting the people and economies that rely on those species. We need cheaper and more nimble approaches to monitor biodiversity on a large scale. This study provides the synergistic development of eDNA and uncrewed technologies we need, in direct response to priorities laid out in the NOAA ‘Omics Strategic Plan,” said Kelly Goodwin, a co-author on the study and collaborator at the National Oceanic and Atmospheric Administration (NOAA).

Background

For this research, MBARI collaborated with researchers at the NOAA Atlantic Oceanographic and Meteorological Laboratory and the University of Washington to complete three expeditions in the Monterey Bay National Marine Sanctuary. The team coordinated sample collection between MBARI’s three research vessels, the NOAA Fisheries ship Reuben Lasker, and a fleet of MBARI’s LRAUVs.

A ship-based team lowered bottles to a specific depth to collect and preserve water samples. Meanwhile, an LRAUV equipped with an ESP autonomously sampled and preserved eDNA at similar locations and depths. The eDNA samples were returned to the lab for in-depth sequencing.

Related organisms share common sections of DNA, known as gene markers. For this study, researchers analyzed eDNA samples with a technique known as metabarcoding. This method looks for short DNA excerpts and provides a breakdown of the groups present in the sample. This technique is especially helpful for translating eDNA data into a measure of biodiversity. The researchers analyzed four different types of gene markers, each representing a slightly different level of the food web. Together, the results yielded a more holistic picture of community composition. The samples collected from research ships and autonomous vehicles revealed similar patterns of biodiversity.
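
To illustrate the kind of comparison described above, the sketch below tallies how many distinct taxa each gene marker detects on each sampling platform. All of the records and marker names are invented placeholders, not data or markers from the study; the point is only the shape of a ship-versus-LRAUV metabarcoding comparison.

```python
from collections import defaultdict

# Hypothetical detections: (platform, gene_marker, taxon). Not study data.
detections = [
    ("ship",  "marker_A", "anchovy"),
    ("ship",  "marker_A", "sardine"),
    ("lrauv", "marker_A", "anchovy"),
    ("lrauv", "marker_B", "copepod_sp1"),
    ("ship",  "marker_B", "copepod_sp1"),
]

# Tally the set of taxa seen per (platform, marker) pair.
richness = defaultdict(set)
for platform, marker, taxon in detections:
    richness[(platform, marker)].add(taxon)

for (platform, marker), taxa in sorted(richness.items()):
    print(f"{platform:5s} {marker}: {len(taxa)} taxa detected -> {sorted(taxa)}")
```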

Truelove noted that the findings from the study mark an exciting step forward for monitoring marine ecosystems. “This work is all about increasing the scale of eDNA research. Instead of looking at an individual species, we can start to more broadly characterize biological community structure in the ocean,” he said.

“Good data are the bedrock of sustainable ocean management,” said Francisco Chavez, MBARI Senior Scientist and a co-author of the study. “Regular environmental DNA monitoring tells us who is there and what is changing over time. When it comes to understanding the impacts of climate change—one of the biggest threats to ocean health—this information is essential.”

LRAUVs are able to travel for weeks at a time and for hundreds of kilometers. They can enable more frequent sampling in areas of interest than traditional research vessels, which typically only visit remote sites infrequently. Autonomous robots will allow researchers to study previously unsurveyed regions of the ocean. Filling in these data gaps is critical to strengthening global ocean health. Ship-based research will continue to play an important role in oceanographic studies, but adding new autonomous technology to the toolkit will expand capacity for research, monitoring, and resource management. Ultimately, MBARI researchers envision deploying a fleet of LRAUVs equipped with ESP technology.

Reference: “Expanding the temporal and spatial scales of environmental DNA research with autonomous sampling” by Nathan K. Truelove, Nastassia V. Patin, Markus Min, Kathleen J. Pitz, Chris M. Preston, Kevan M. Yamahara, Yanwu Zhang, Ben Y. Raanan, Brian Kieft, Brett Hobson, Luke R. Thompson, Kelly D. Goodwin and Francisco P. Chavez, 17 May 2022, Environmental DNA.
DOI: 10.1002/edn3.299

Support for this research was provided by the David and Lucile Packard Foundation, NOAA/OAR/’Omics, NOAA/OAR/NOPP, and NASA Projects #80NSSC20M0001 and 80NSSX21M003.

About MBARI

MBARI (Monterey Bay Aquarium Research Institute) is a private non-profit oceanographic research center founded by David Packard in 1987. The mission of MBARI is to advance marine science and technology to understand a changing ocean.

Driving down the costs of hydrogen fuel: Prototype achieves 99% yield 8 times faster than conventional batch reactors

New tech aims to drive down costs of hydrogen fuel
Credit: North Carolina State University

Researchers from North Carolina State University have developed a new technique for extracting hydrogen gas from liquid carriers that is faster, less expensive and more energy efficient than previous approaches.

"Hydrogen is widely viewed as a sustainable energy source for transportation, but there are some technical obstacles that need to be overcome before it can be viewed as a practical alternative to existing technologies," says Milad Abolhasani, corresponding author of a paper on the new technique and an associate professor of chemical and biomolecular engineering at NC State. "One of the big obstacles to the adoption of a hydrogen economy is the cost of storage and transportation."

Hydrogen fuel does not result in CO2 emissions. And hydrogen refueling stations could be located at existing gas stations, taking advantage of existing infrastructure. But transporting hydrogen gas is dangerous, so hydrogen needs to be transported via a liquid carrier. A key obstacle for this strategy is that extracting hydrogen from the liquid carrier at destination sites, such as fueling stations, is energy intensive and expensive.

"Previous research has shown that it is possible to use photocatalysts to release hydrogen gas from a liquid carrier using only sunlight," Abolhasani says. "However, existing techniques for doing this were laborious, time consuming and required a significant amount of —a metal that is very expensive."

"We've developed a technique that applies a reusable photocatalyst and sunlight to extract hydrogen gas from its liquid carrier more quickly and using less rhodium—making the entire process significantly less expensive," says Malek Ibrahim, first author of the paper and a former postdoctoral researcher at NC State. "What's more, the only byproducts are  and the liquid carrier itself, which can be reused repeatedly. It's very sustainable."

One key to the success of the new technique is that it is a continuous-flow reactor. The reactor resembles a thin, clear tube packed with sand. The "sand" consists of micron-scale grains of titanium oxide, many of which are coated with rhodium. The hydrogen-carrying liquid is pumped into one end of the tube. The rhodium-coated particles line the outer part of the tube, where sunlight can reach them. These particles are photoreactive catalysts that, in the presence of sunlight, react with the liquid carrier to release hydrogen molecules as a gas.

The researchers precisely engineered the system so that only the outer grains of titanium oxide are coated with rhodium, ensuring the system uses no more rhodium than is necessary.

"In a conventional batch reactor, 99% of the photocatalyst is titanium oxide and 1% is rhodium," Abolhasani says. "In our continuous flow reactor, we only need to use 0.025% rhodium, which makes a big difference in the final cost. A single gram of rhodium costs more than $500."

In their prototype reactor, the researchers were able to achieve a 99% yield—meaning that 99% of the hydrogen molecules were released from the liquid carrier—in three hours.

"That's eight times faster than conventional batch reactors, which take 24 hours to reach 99% yield," Ibrahim says. "And the system should be easy to scale up or scale out to allow for catalyst reuse on commercial scale—you can simply make the tube longer or merge multiple tubes running in parallel."

The flow system can run continuously for up to 72 hours before its efficiency decreases. At this point, the catalyst can be "regenerated" without removing it from the reactor—it's a simple cleaning process that takes about six hours. The system can then be restarted and run at full efficiency for another 72 hours.

NC State has filed a provisional patent for the technology.

The research was published in the journal ChemSusChem.

More information: Malek Y. S. Ibrahim et al, Continuous Room‐Temperature Hydrogen Release from Liquid Organic Carriers in a Photocatalytic Packed‐Bed Flow Reactor, ChemSusChem (2022). DOI: 10.1002/cssc.202200733
Provided by North Carolina State University 

Hydrogen production method opens up clean energy possibilities

Peer-Reviewed Publication

WASHINGTON STATE UNIVERSITY

Image: Postdoctoral researcher Jamie Kee and Professor Su Ha and the novel reactor they developed to produce pure compressed hydrogen. Credit: WSU Photo Services

PULLMAN, Wash. – A new energy-efficient way to produce hydrogen gas from ethanol and water has the potential to make clean hydrogen fuel a more viable alternative to gasoline for powering cars.

Washington State University researchers used the ethanol and water mixture and a small amount of electricity in a novel conversion system to produce pure compressed hydrogen. The innovation means that hydrogen could be made on-site at fueling stations, so only the ethanol solution would have to be transported. It is a major step in eliminating the need to transport high-pressure hydrogen gas, which has been a major stumbling block for its use as a clean energy fuel. 

“This is a new way of thinking about how to produce hydrogen gas,” said Su Ha, professor in the Gene and Linda Voiland School of Chemical Engineering and Bioengineering and corresponding author on the paper published in the journal Applied Catalysis A. “If there are enough resources, I think it has a really good chance of making a big impact on the hydrogen economy in the near future.”

Using hydrogen as a fuel for cars is a promising but unrealized clean energy. Like an electric-powered car, a hydrogen fuel-cell powered car doesn’t emit any harmful carbon dioxide. Unlike an electric car, it can be filled up with hydrogen gas in minutes at hydrogen fueling stations.  

Despite the promise of hydrogen technology, however, storing and transporting high-pressure hydrogen gas in fuel tanks creates significant economic and safety challenges. Because of the challenges, there is little hydrogen gas infrastructure in the U.S., and the technology’s market penetration is very low.

In their work, the WSU researchers created a conversion system with an anode and a cathode. When they put a small amount of electricity into the ethanol and water mixture with a catalyst, they were able to electrochemically produce pure compressed hydrogen. Carbon dioxide from the reaction is captured in a liquid form. 

Instead of having to transport hazardous hydrogen gas, the conversion method would mean that the existing infrastructure for transporting ethanol could be used and that the compressed hydrogen gas could be easily and safely created on-demand at gas stations. 

“We’re already using ethanol-containing gasoline at every gas station,” said Ha. “You can imagine that an ethanol water mixture can be easily delivered to a local gas station using our existing infrastructure, and then using our technology, you can produce hydrogen that is ready to pump into a hydrogen fuel cell car. We don’t need to worry about hydrogen storage or transportation at all.”

The electrochemical system the team developed uses less than half the electricity of pure water splitting, another method that researchers have studied for de-carbonized hydrogen production. Instead of working hard to compress the hydrogen gas later in the process, the researchers used less energy by instead compressing the liquid ethanol mixture, thereby directly producing an already compressed hydrogen gas. 

“The presence of the ethanol in water changes the chemistry,” said graduate student Wei-Jyun Wang, a co-lead author on the paper. “We can actually do our reaction at a much lower electrical voltage than is typically needed for pure water electrolysis.” 
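
For orientation, the electrical energy needed to make hydrogen scales directly with cell voltage: E = nFV per mole of H2, with n = 2 electrons. The sketch below uses assumed, round voltage values chosen only to mirror the "less than half the electricity" comparison; the actual operating voltages of the WSU system are not given in this release.

```python
# Illustrative electrochemical energy estimate (assumed voltages, standard physics).
FARADAY = 96485.0  # coulombs per mole of electrons

def energy_per_mol_h2_kj(cell_voltage: float) -> float:
    """Electrical energy (kJ) to produce one mole of H2 at a given cell voltage."""
    return 2 * FARADAY * cell_voltage / 1000.0

water_electrolysis_v = 1.8   # assumed practical water-electrolysis cell voltage
ethanol_assisted_v = 0.8     # assumed lower voltage for the ethanol-water route

for label, v in [("water electrolysis", water_electrolysis_v),
                 ("ethanol-assisted", ethanol_assisted_v)]:
    print(f"{label:18s}: {v:.1f} V -> {energy_per_mol_h2_kj(v):.0f} kJ per mol H2")
```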

Their system also doesn’t require an expensive membrane that other water splitting methods do. The resulting hydrogen from the electrochemical reaction is then ready for use. 

“A process that offers a low-electrical energy cost alternative to water electrolysis and can effectively capture carbon dioxide while producing compressed hydrogen could have a significant impact on the hydrogen economy,” said Jamie Kee, a Voiland School postdoctoral researcher and one of the lead authors on the paper. “It’s really exciting because there are a whole lot of aspects that play into improving the production methods of hydrogen.”

The researchers are working to scale up the technology and operate it in a continuous manner. They also are working to make use of the carbon dioxide captured in the liquid. 

The work was funded by the Gas Technology Institute and the US Department of Energy’s RAPID Manufacturing Institute. 

 

New life cycle assessment study shows useful life of tech-critical metals to be short 

Global cycle of metals. Credit: Nature Sustainability (2022). DOI: 10.1038/s41893-022-00895-8

Worldwide, almost all technology-intensive industries depend on readily available metallic raw materials. Consequently, precise and reliable information is needed on how long these raw materials remain in the economic cycle. To obtain the necessary data, a research team from the universities of Bayreuth, Augsburg and Bordeaux has now developed a new modeling method and applied it to 61 metals. The study, published in Nature Sustainability, shows that the metals needed for specific high-tech applications, which in many cases are scarce around the world, are in use for only a decade on average.

The useful life of a metal comprises the entire period that begins with mining and ends when it dissipates—i.e., is finely dispersed—in the environment, and is no longer available for economic use. Iron and steel alloy metals have the longest useful life, averaging 150 years. The researchers see the reason for this primarily in the high efficiency of the industrial processes in which these metals are processed, as well as in high recycling rates. The lifespan of non-ferrous metals such as aluminum and copper and precious metals such as gold and silver is significantly shorter, but it is still over 50 years. By contrast, the technology-specific and in some cases critical—i.e., scarcely available—metals only remain in the economic cycle for about twelve years. Cobalt and indium are examples of this large group of raw materials. For all these calculations, data from the Bureau de Recherches Géologiques et Minières (BRGM), a geoscientific institute based in Paris and Orléans, was used.
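
A toy way to see what a "useful life" number implies: if a roughly constant fraction of a metal's in-use stock dissipates each year, its mean residence time in the economy is about the reciprocal of that loss fraction. The sketch below is only an illustration keyed to the lifetimes quoted above, not the dynamic material-flow model the authors actually used, and the loss fractions are chosen so the outputs echo the article's figures.

```python
# Toy residence-time illustration (not the study's modelling method):
# annual loss fraction -> mean years a unit of metal stays in the economy.
example_annual_loss_fraction = {
    "iron and steel alloy metals": 1 / 150,     # ~150-year average useful life
    "aluminum, copper, gold, silver": 1 / 50,   # over 50 years
    "cobalt, indium (tech-critical)": 1 / 12,   # roughly twelve years
}

for metal_group, loss in example_annual_loss_fraction.items():
    mean_lifetime_years = 1.0 / loss
    print(f"{metal_group:32s} ~{mean_lifetime_years:.0f} years in the economic cycle")
```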

One thing all of the 61 metals studied have in common is that the quantities lost to the economic cycle over time must be constantly compensated for by new mining. The greater the losses, the more resources are irretrievably lost, and the more damaging the consequences are for the climate and the environment.

"It is in the urgent interest of the world's population to extend the useful life of metals and to strive for economic cycles that are as closed as possible to prevent these huge losses. However, these goals can only be achieved if the useful life of every raw material relevant to our technology can be extended and calculated with greater statistical accuracy," says Prof. Dr. Christoph Helbig, Chair of the newly established Ecological Resource Technology research group at the University of Bayreuth. The aim of his research is to increase the useful life of metallic resources, and in this way contribute to environmentally and climate-friendly industries.

The calculations now published in Nature Sustainability are based on a new modeling method developed by the authors, with which the useful life of metals can be calculated far more reliably than with the usual measurements based on recycling rates. The special feature of this statistical method is that it can be applied equally to almost all metals of the periodic table. This is a decisive prerequisite for the data obtained to be comparable. Only in this way can they form a reliable basis for life cycle assessments that provide information on the extent to which valuable raw materials are being used efficiently or wasted. Life cycle assessments in the area of abiotic resources look set to be considerably more meaningful thanks to the research results achieved by the study.

Prof. Dr. Christoph Helbig started work on the new study while still at the University of Augsburg and brought the topic to Bayreuth: "I am very much looking forward to continuing and developing the existing cooperation with working groups in Bordeaux and Augsburg at the University of Bayreuth," says Helbig. The University of Bordeaux is one of the partner institutions of the Gateway Office which the University of Bayreuth set up two years ago to further expand its international networking in research and teaching.

More information: Alexandre Charpentier Poncelet et al, Losses and lifetimes of metals in the economy, Nature Sustainability (2022). DOI: 10.1038/s41893-022-00895-8

Journal information: Nature Sustainability 

Provided by University of Bayreuth 

Ancient crocodile found in Peru sheds new light on their origin

Credit: Pixabay/CC0 Public Domain

A team of researchers at Universidad Peruana Cayetano Heredia, working with colleagues from the U.S. and France, has uncovered a prehistoric crocodile fossil in Peru. In their paper published in Proceedings of the Royal Society B, the group describes their find, what they have learned about it and what it shows about the evolution of marine crocodiles.

Though there are two species of modern crocodiles that live in the sea, crocodiles are predominantly freshwater-dwelling creatures. This, the researchers behind the new effort note, makes it difficult to understand the evolution of the creatures from crocs that predominantly lived in the sea in the past. Also, prior research has suggested that crocodiles have been living in southeastern parts of the Pacific Ocean for approximately 14 million years. In this new effort, the researchers have been looking for evidence of early crocodiles in western parts of South America, most specifically Peru. And as part of that effort, they have uncovered the partial remains of an ancient crocodile.

The crocodile fossil (a skull and jaw) was uncovered in the East Pisco Basin (in the Sacaco desert) in Peru in 2020. Since that time, the researchers have been studying its attributes and characteristics and have been seeking to find its place in the evolutionary history of crocodiles. Their testing has shown that the fossil is from approximately 7 million years ago. They have named it Sacacosuchus cordovai and have concluded that when alive, it would have been approximately four meters long.

The Sacaco site has been under study for a number of years: Prior fossil discoveries have shown that millions of years ago, the entire area was under the sea. Finding the crocodile fossil in the area suggests it was a saltwater creature, a finding that helps trace the evolution of crocodiles in South America.

The researchers suggest crocodiles made their way to South America by crossing the Atlantic Ocean. From there, some may have followed the coastline to arrive at what is now Peru. They further suggest that such marine crocodiles would have all had long thin faces and that there were two main types: one that lived almost exclusively on fish, and another that had a more varied diet.

More information: Rodolfo Salas-Gismondi et al, Miocene fossils from the southeastern Pacific shed light on the last radiation of marine crocodylians, Proceedings of the Royal Society B: Biological Sciences (2022). DOI: 10.1098/rspb.2022.0380

Journal information: Proceedings of the Royal Society B 

© 2022 Science X Network

New study explains how to broaden strategy to avert catastrophic climate change

Credit: CC0 Public Domain

Slashing emissions of carbon dioxide, by itself, cannot prevent catastrophic global warming. But a new study concludes that a strategy that simultaneously reduces emissions of other largely neglected climate pollutants would cut the rate of global warming in half and give the world a fighting chance to keep the climate safe for humanity.

Published this week by the Proceedings of the National Academy of Sciences, the study is the first to analyze the importance of cutting non-carbon dioxide climate pollutants vis-à-vis merely reducing carbon dioxide, in both the near term and the mid-term to 2050. It confirms increasing fears that the present almost exclusive focus on carbon dioxide cannot by itself prevent global warming from exceeding 1.5 degrees Celsius above pre-industrial levels, the internationally accepted guardrail beyond which the world's climate is expected to pass irreversible tipping points.

Indeed, such decarbonization alone would be unlikely to stop temperatures from exceeding even the much more hazardous 2 degrees Celsius limit.

The study—by scientists at Georgetown University, Texas A&M University, Scripps Institution of Oceanography at UC San Diego, and others—concludes that adopting a dual strategy that simultaneously reduces emissions of both carbon dioxide and the other climate pollutants would cut the rate of warming in half by 2050, making it much more likely to stay within these limits.

The non-carbon dioxide pollutants include methane, hydrofluorocarbon refrigerants, black carbon soot, ground-level ozone smog, as well as nitrous oxide. The study calculates that together these pollutants currently contribute almost as much to global warming as carbon dioxide. Since most of them last only a short time in the atmosphere, cutting them slows warming faster than any other mitigation strategy.

Until now, however, the importance of these non-carbon dioxide pollutants has been underappreciated by scientists and policymakers alike, and largely neglected in efforts to combat climate change.

Recent reports by the Intergovernmental Panel on Climate Change conclude that cutting fossil fuel emissions—the main source of carbon dioxide—by decarbonizing the energy system and shifting to clean energy, in isolation, actually makes global warming worse in the short term. This is because burning fossil fuels also emits sulfate aerosols, which act to cool the climate, and these are reduced along with the carbon dioxide when switching to clean energy. These cooling sulfates fall out of the atmosphere fast, within days to weeks, while much of the carbon dioxide lasts hundreds of years, thus leading to overall warming for the first decade or two.

The new study accounts for this effect and concludes that focusing exclusively on reducing fossil fuel emissions could result in "weak, near-term warming" which could potentially cause temperatures to exceed the 1.5 degrees Celsius level by 2035 and the 2 degrees Celsius level by 2050.

In contrast, the dual strategy that simultaneously reduces the non-carbon dioxide pollutants, especially the short-lived pollutants, would enable the world to stay well below the 2 degrees Celsius limit, and significantly improve the chance of remaining below the 1.5 degrees Celsius guardrail.

Indeed, a key insight from the study is the need for climate policies to address all of the pollutants that are emitted from fossil fuel sources such as coal power plants, rather than considering just carbon dioxide or methane individually, as is common.

Continuing to slash fossil fuel carbon dioxide emissions remains vital, the study emphasizes, since that will determine the fate of the climate in the longer term beyond 2050. Phasing out fossil fuels also is essential because they produce air pollution that kills over eight million people every year and causes billions of dollars of damage to crops.

Tackling both carbon dioxide and the short-lived pollutants at the same time offers the best and the only hope of humanity making it to 2050 without triggering irreversible and potentially catastrophic climate change.

More information: Gabrielle B. Dreyfus et al, Mitigating Climate Disruption in Time: A self-consistent approach for avoiding both near-term and long-term global warming, Proceedings of the National Academy of Sciences (2022). DOI: 10.1073/pnas.2123536119

Journal information: Proceedings of the National Academy of Sciences 

Provided by University of California - San Diego 

Interprofessional collaboration leads to significant and sustained reduction in hospital-onset C. difficile infections

Community hospital reduced infections by 63% after one year and 77% after three years

Peer-Reviewed Publication

ASSOCIATION FOR PROFESSIONALS IN INFECTION CONTROL

Arlington, Va., May 12, 2022 – A new study published today in the American Journal of Infection Control (AJIC), suggests that health care facilities can significantly reduce the incidence of hospital-onset Clostridioides difficile infection (HO-CDI) by establishing interprofessional teams to implement selected, evidence-based infection-prevention interventions.

“Our project showed that interprofessional collaboration and continuous improvement can profoundly impact HO-CDI incidence, and sustain reductions over years,” said Cherith Walter, MSN, RN, Emory St. Joseph’s Hospital, and first author on the published study. “We hope our findings will help other healthcare teams struggling with this incredibly challenging healthcare-associated infection to improve patient safety and reduce associated costs.”

According to the Centers for Disease Control and Prevention, an estimated 500,000 cases of CDI occur in the United States annually[1], making it one of the most prevalent healthcare-associated infections (HAI) in the country. Due to the cost of caring for patients with HO-CDI, as well as financial penalties levied under the Centers for Medicare & Medicaid Services’ (CMS) hospital-acquired condition reduction program, these infections have increased the financial burden on the healthcare system.

To address the HO-CDI incidence at their 410-bed community hospital, which was consistently above the national CMS benchmark, Walter and colleagues created an interprofessional team comprising a clinical nurse specialist, a physician champion, a hospital epidemiologist, an infection preventionist, a clinical microbiologist, unit nurse champions, an antimicrobial stewardship pharmacist, and an environmental services representative. The team reviewed HO-CDI events at their facility between 2014 and 2016 to determine causative factors, and then identified appropriate, evidence-based infection prevention interventions. The selected interventions comprised diagnostic stewardship, including the development of a Diarrhea Decision Tree (DDT) testing algorithm with a nurse-driven ordering protocol; enhanced environmental cleaning; antimicrobial stewardship, including a system-wide Electronic Medical Record intervention to reduce fluoroquinolone use; and education and accountability, the latter of which focused on encouraging compliance with the DDT algorithm.

After the first year, the project leads recorded a 63% decrease in HO-CDIs as compared to the two years prior (4.72 per 10,000 patient days vs. 12 per 10,000 patient days). This number improved further to 2.8 per 10,000 days three years after implementation of the selected interventions (a 77% decrease from baseline). The team also saw a decrease in their facility’s standardized HO-CDI infection ratio (the total number of infections divided by the National Health Safety Network’s risk-adjusted predicted number of infections), from 1.11 in 2015 to 0.43 in 2020 – significantly lower than the national benchmark.
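
The arithmetic behind those headline numbers is straightforward, as the short sketch below shows. The rates and the SIR definition (observed infections divided by risk-adjusted predicted infections) come from the study description above; the example SIR counts are hypothetical, chosen only to produce a ratio near the reported 0.43, and the rounded baseline of 12 per 10,000 patient days yields ~61% rather than the published 63%, which reflects the unrounded baseline rate.

```python
# Worked arithmetic for the reported HO-CDI reductions.
baseline_rate = 12.0   # HO-CDI per 10,000 patient days (rounded pre-intervention rate)
year1_rate = 4.72      # rate one year after implementation
year3_rate = 2.8       # rate three years after implementation

def percent_reduction(baseline: float, current: float) -> float:
    """Percentage decrease from baseline to current rate."""
    return 100.0 * (baseline - current) / baseline

def standardized_infection_ratio(observed: int, predicted: float) -> float:
    """SIR = observed infections / risk-adjusted predicted infections."""
    return observed / predicted

print(f"Year 1 reduction: {percent_reduction(baseline_rate, year1_rate):.0f}%")
print(f"Year 3 reduction: {percent_reduction(baseline_rate, year3_rate):.0f}%")
# Hypothetical counts (13 observed vs. 30 predicted) give an SIR near 0.43.
print(f"Example SIR: {standardized_infection_ratio(13, 30):.2f}")
```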

Interventions also improved CDI testing practices, increasing testing for appropriate patients within the first three days of hospital admission from 54% in 2014 to 81.1% in late 2019, to support prompt treatment of infected patients. This practice also helped identify and differentiate cases of community-acquired CDI (CA-CDI) from HO-CDI, reducing the financial impact of HO-CDIs on the facility after 2016. Finally, by empowering nurses to hold providers accountable for judicious test ordering and creating a system of ‘accountability notices’ alerting nurses and providers to DDT algorithm deviations, the team successfully increased compliance with the algorithm, from 50% in mid-2018 to 80% in mid-2020.

“These study findings are exciting, because they suggest that professional collaboration to consistently apply known, evidence-based practices can significantly reduce the incidence of HO-CDI, an intractable and costly HAI,” said Linda Dickey, RN, MPH, CIC, FAPIC, and 2022 APIC president. “They are also the first findings demonstrating the impact of education and accountability interventions in reducing HO-CDI incidence and improving compliance with standards of practice.”

 About APIC

Founded in 1972, the Association for Professionals in Infection Control and Epidemiology (APIC) is the leading association for infection preventionists and epidemiologists. With more than 15,000 members, APIC advances the science and practice of infection prevention and control. APIC carries out its mission through research, advocacy, and patient safety; education, credentialing, and certification; and fostering development of the infection prevention and control workforce of the future. Together with our members and partners, we are working toward a safer world through the prevention of infection. Join us and learn more at apic.org.

About AJIC

As the official peer-reviewed journal of APIC, The American Journal of Infection Control (AJIC) is the foremost resource on infection control, epidemiology, infectious diseases, quality management, occupational health, and disease prevention. Published by Elsevier, AJIC also publishes infection control guidelines from APIC and the CDC. AJIC is included in Index Medicus and CINAHL. Visit AJIC at ajicjournal.org.

NOTES FOR EDITORS

“An Interprofessional Approach to Reducing Hospital-Onset Clostridioides difficile Infections,” by Cherith Walter, MSN, RN; Tanushree Soni, PhD, MPH; Melanie Alice Gavin, MPH; Julianne Kubes, MPH; and Kristen Paciullo, PharmD, was published online in AJIC on May 12, 2022. The article may be found online at:  https://doi.org/10.1016/j.ajic.2022.02.017

 

AUTHORS

Cherith Walter, MSN, RN, APRN, AGPCNP-BC, AGCNS-BC (corresponding author: cherith.walter@emoryhealthcare.org)

Emory Saint Joseph’s Hospital, Atlanta, GA, USA

 

Tanushree Soni, PhD, MPH, CIC

Emory Saint Joseph’s Hospital, Atlanta, GA, USA

 

Melanie Alice Gavin, MPH, CIC, M (ASCP)

Emory Saint Joseph’s Hospital, Atlanta, GA, USA

 

Julianne Kubes, MPH

Emory Healthcare, Atlanta, GA, USA

 

Kristen Paciullo, PharmD, BCIDP

Emory Saint Joseph’s Hospital, Atlanta, GA, USA

 

# # #

 


[1] Guery B, Galperine T, Barbut F. Clostridioides difficile: diagnosis and treatments. BMJ. 2019 Aug 20;366:l4609.

How we perceive crowds

Slow walking may be to blame for perceived congestion in pedestrian areas

Peer-Reviewed Publication

UNIVERSITY OF TOKYO

Image: It might look like something from a physics paper, but this is actually a plot showing how people's perceptions of an enclosed space change depending on how impeded the crowd was by an obstacle. Blue colors reflect people within the crowd who were more frustrated in the space, and red colors reflect people who were more at ease. In each case, W is the width of the obstacle, and D is the distance between the obstacle and the exit. Credit: ©2022 Jia Xiaolu et al.

When designing public spaces or other places where foot traffic is considered, planners and architects need to know how people perceive the spaces in question. It is commonly believed that a space will feel more congested if the crowd density is higher. However, new research suggests that walking speed of individuals actually plays a greater role than crowd density in how someone feels about a busy space. Also, age and gender seem to affect someone’s perception of how congested an enclosed space feels to them.

If you live in a town or city, you are probably experienced in the art of navigating through crowded areas. But sometimes you can’t help but feel like your surroundings are too congested for comfort. Intuition tells us that it must be the sheer volume of people around us in these moments that makes a place feel too congested. But Project Assistant Professor Jia Xiaolu from the Research Center for Advanced Science and Technology at the University of Tokyo wanted to verify this assumption, and ended up showing that it might not actually be the entire truth of the matter.

“Perception of congestion is an important matter for those designing spaces to be used by people, so if there’s a way to estimate this perceptual value, it would be useful to know,” said Xiaolu. “Thus, I was a little surprised to find that the density of people in a given space was not the best indicator of perceived congestion; in fact, it turned out to be the walking speed, or velocity, of the people around the perceiver.”

In order to determine this, Xiaolu and her team first had to set up an elaborate experiment. They recruited a large number of people to play the part of a crowd. The crowd was asked to walk through a relatively narrow space made out of cardboard boxes with an exit at the end and an obstacle made from boxes just before it. The researchers repeated the experiment but changed the size of the obstacle to choke the flow of the crowd, all the while recording the motions of people by using a camera and motion-tracking software.

Alongside this physical task, the crowd were also given questionnaires to fill out that captured more qualitative information about their perceptions of the crowded space during these repeated trials. By combining both quantitative and qualitative data, the researchers hoped to find a relationship between some of the quantitative parameters of the crowd and the qualitative perceptions of the crowd members.

“That the velocity of pedestrians rather than density of the crowd better indicates perceived congestion was a bit of a surprise,” said Xiaolu. “But it leads us to believe that people perceive a space too congested when they are simply unable to walk at the speed they wish to; there is a gap between their desired and actual velocity. This idea corresponds with the way people felt depending on where they were in the test space. In trials where the density was roughly uniform, slower groups around the obstacle led to those in the vicinity reporting feelings of congestion more than those prior to that section.”
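
The "gap between desired and actual velocity" idea can be expressed as a simple metric. The sketch below is a minimal illustration with invented pedestrian records; it is not the congestion measure or motion-tracking pipeline used in the paper.

```python
import statistics

# Hypothetical pedestrian records: (desired walking speed m/s, measured speed m/s).
pedestrians = [
    (1.4, 0.6),
    (1.2, 1.1),
    (1.5, 0.7),
    (1.0, 0.9),
]

# Velocity gap per pedestrian: how far below their desired speed they are walking.
velocity_gaps = [desired - actual for desired, actual in pedestrians]
mean_gap = statistics.mean(velocity_gaps)

# A larger mean gap would correspond to stronger perceived congestion,
# regardless of how dense the crowd actually is.
print(f"Mean velocity gap: {mean_gap:.2f} m/s")
```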

The study suggests that overtaking where possible might make some people feel less constrained by the congestion, but other studies of crowd dynamics by Xiaolu and her team report that overtaking behavior can negatively impact the flow of the crowd as a whole. The team also found some noteworthy details when they analyzed the way different demographics responded to tests.

“We found that women and also older people generally felt less constrained than men and younger people, which is probably due to their lower desired velocity, thus a smaller gap between their desired and actual velocity,” said Xiaolu. “And while this is interesting, I think our future studies will focus on spaces where the objective is not so much about getting from A to B, but more goal oriented, such as interacting with a service in a store, gallery or other destination.”

Caption: These are four different methods for measuring and visualizing density, in this case, the density of people within a physically constrained space. Credit: ©2022 Jia Xiaolu et al.

Journal article: Xiaolu Jia, Claudio Feliciani, Hisashi Murakami, Akihito Nagahama, Daichi Yanagisawa, Katsuhiro Nishinari. “Revisiting the level-of-service framework for pedestrian comfortability: velocity depicts more accurate perceived congestion than local density.” Transportation Research Part F: Traffic Psychology and Behaviour. https://doi.org/10.1016/j.trf.2022.04.007

Funding: This work was supported by the following Japanese grants: JSPS KAKENHI Grant Number JP21K14377, JP20K14992, JP20K20143, JP21H01570 and JP21H01352, and JST-Mirai Program Grant Number JPMJMI17D4 and JPMJMI20D1, Japan. This research was also partially supported by FONDAZIONE CARIPLO (Italy), “LONGEVICITY-Social Inclusion for the Elderly through Walkability”, Rif. 2017-0938.

Useful links:
Mobility Innovation Collaborative Research Organization
http://www.its.iis.u-tokyo.ac.jp/utmobi/en/  
 

Research Center for Advanced Science and Technology

https://www.rcast.u-tokyo.ac.jp/en/
 

Research Contact

Project Assistant Professor Jia Xiaolu

Research Center for Advanced Science and Technology, The University of Tokyo,
4-6-1, Komaba, Meguro-ku, Tokyo 153-8904, JAPAN

Email: xiaolujia@g.ecc.u-tokyo.ac.jp

Press contact:
Mr Rohan Mehra
Public Relations Group, The University of Tokyo
7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
email: press-releases.adm@gs.mail.u-tokyo.ac.jp

About The University of Tokyo
The University of Tokyo is Japan's leading university and one of the world's top research universities. The vast research output of some 6,000 researchers is published in the world's top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at www.u-tokyo.ac.jp/en/ or follow us on Twitter at @UTokyo_News_en.