Friday, November 04, 2022

Violent supershear earthquakes are more common than previously thought

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - LOS ANGELES

Powerful supershear earthquakes, once considered rare, are much more common than previously thought, according to a study led by UCLA geophysicists and published today in Nature Geoscience.

The scientists analyzed all 6.7-or-greater magnitude strike-slip earthquakes worldwide since 2000 — there were 87 in all — and identified 12 of the supershear type, or about 14%. (Four of those earthquakes were previously unreported.)

That percentage is more than double what scientists expected; until now less than 6% of strike-slip earthquakes had been identified as supershear.

Strike-slip earthquakes occur when the edges of two tectonic plates rub sideways against each other. Supershear quakes are a subtype of that group, caused when a fault beneath the surface ruptures faster than shear waves — the seismic waves that shake the ground back and forth — can move through rock. This corrals energy that is then released violently, in an effect comparable to a sonic boom.
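The defining condition can be put in numbers with a toy comparison (both speeds below are assumed, representative values, not figures from the study):

```python
# Toy comparison with assumed, representative values (not from the study):
# a rupture is supershear when it outruns the rock's shear waves,
# analogous to an aircraft exceeding Mach 1.
shear_wave_speed_km_s = 3.5   # typical crustal S-wave speed (assumed)
rupture_speed_km_s = 5.0      # plausible supershear rupture speed (assumed)

mach = rupture_speed_km_s / shear_wave_speed_km_s
regime = "supershear" if mach > 1.0 else "sub-shear"
print(f"rupture-to-shear-wave ratio: {mach:.2f} -> {regime}")
```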

As a result, supershear earthquakes tend to cause more shaking, and are potentially more destructive, than other earthquakes that have the same magnitude.

“When an airplane flies faster than sound can travel through air, a cone of pent-up sound waves forms in front of the plane and when it catches up, we hear it all at once,” said Lingsen Meng, UCLA’s Leon and Joanne V.C. Knopoff Professor of Physics and Geophysics, and the paper’s corresponding author. “Supershear earthquakes are potentially more destructive than other kinds of earthquakes because they are more effective at generating seismic waves, with more shaking, which could cause more damage.”

The research also found that supershear earthquakes occur as commonly beneath the oceans as they do on land, and that they are most likely to occur along strike-slip faults, such as California’s San Andreas Fault.

The findings suggest that disaster planning efforts should take into consideration whether nearby faults are capable of producing supershear earthquakes and, if so, take measures to prepare for a higher level of shaking and potential damage than could be caused by non-supershear earthquakes.

Meng said the reason relatively few supershear earthquakes have been found is that researchers mainly study earthquakes on land.

The paper’s co-authors are UCLA doctoral students Han Bao and Liuwei Xu, and Jean-Paul Ampuero, a senior researcher at Université Côte d’Azur in Nice, France.

The scientists used a method called backprojection, which works out the direction from which seismic waves arrive in order to infer how fast an earthquake rupture moves along the fault. The technique applies an algorithm that analyzes the brief time delays between seismic waves as they are detected by an array of sensors. The method is similar to locating a person by tracking the signals their smartphone sends to cell towers.
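As a rough sketch of the idea (not the authors’ code; the sensors, sampling rate, and waveform here are invented for illustration), the arrival-time difference between two stations can be estimated by cross-correlating their recordings, the basic operation that backprojection repeats over many stations and trial source locations:

```python
import numpy as np

# Two synthetic "sensors" record the same pulse; the second sees it
# 0.25 s later. Cross-correlation recovers that delay, which in real
# backprojection helps constrain where and how fast the rupture moves.
fs = 100.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

true_delay = 0.25                           # seconds
pulse = lambda t0: np.exp(-((t - t0) ** 2) / 0.01)
sensor_a = pulse(3.0) + 0.02 * rng.standard_normal(t.size)
sensor_b = pulse(3.0 + true_delay) + 0.02 * rng.standard_normal(t.size)

# The lag at the cross-correlation peak estimates the inter-sensor delay.
xcorr = np.correlate(sensor_b, sensor_a, mode="full")
lag_s = (np.argmax(xcorr) - (t.size - 1)) / fs
print(f"estimated delay: {lag_s:.2f} s")
```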

The data revealed that supershear earthquakes tend to occur on mature strike-slip faults, in which the edges of two tectonic plates rub laterally against each other. In a mature fault, that action has been happening long enough to create a zone of damaged rocks that acts like a dam around the fault, slowing or blocking seismic wave propagation and concentrating the waves’ energy.

Ampuero said the findings could help scientists better understand what it takes for a fault to produce the kinds of ruptures that lead to supershear earthquakes.

In the past century, at least one large supershear earthquake has occurred in California: In 1979, a 6.5-magnitude quake in Southern California’s Imperial Valley region injured people as far away as Mexico and caused extensive damage to irrigation systems. And, although it predated scientific monitoring, the 1906 earthquake that caused extensive damage in San Francisco likely also fell into the supershear category.

Not all supershear earthquakes are that disastrous. The shape of the fault, the rocks around it, and other factors can affect the propagation of seismic waves and limit the accumulation of energy. Faults that curve tend to slow, deflect or absorb seismic waves, while straight faults let them flow freely.

In a previous study, Meng’s research group identified the catastrophic 7.5-magnitude earthquake that hit the Indonesian island of Sulawesi in 2018 as a supershear event. The temblor and ensuing tsunami killed at least 4,000 people. Although that fault was curved, the damage was horrific because the fault moved faster than any previously recorded, and energy from earlier temblors had likely been stored up in the rocks, awaiting a moment to burst, Meng said.

Fortunately, Meng said, supershear earthquakes in the ocean, which shift the sea floor sideways rather than vertically, are less likely to produce tsunamis.

The San Andreas Fault, on the other hand, is mostly straight and could experience an even more explosive rupture than the Sulawesi quake.


Improving police responses to domestic abuse

A new book co-authored by a City, University of London academic aims to improve the practice of policing domestic abuse

Book Announcement

CITY UNIVERSITY LONDON

Policing Domestic Abuse: Risk, Policy, and Practice 

IMAGE: POLICING DOMESTIC ABUSE: RISK, POLICY, AND PRACTICE, CO-AUTHORED BY CITY, UNIVERSITY OF LONDON'S DR RUTH WEIR

CREDIT: ROUTLEDGE

A new book co-authored by a City, University of London academic aims to improve the practice of policing domestic abuse.

Dr Ruth Weir, a Senior Research Fellow in City’s Violence and Society Centre, has collaborated on Policing Domestic Abuse: Risk, Policy, and Practice with a view to informing those working in policing about the dynamics of how domestic abuse occurs, how best to respond to and investigate it, and, in the longer term, how to prevent it.

Dr Weir specialises in using police data to understand predictors and risk factors of domestic abuse. She is currently working in the UKPRP Violence, Health and Society consortium (VISION), which aims to reduce violence that harms health by improving the measurement and analysis of data on violence. Prior to working in academia, she held several research and policy positions in local government and the Home Office.

A launch event to mark the publication of the book will be held at City, University of London on 15 November 2022 from 6-8pm in The Pavilion (more details below), followed by a drinks reception.

Dr Weir said:

“This book is a unique collaboration of real-life policing experience blended with the latest academic research and best practice. 

“We hope that it becomes a core part of training for those working with victims and perpetrators of domestic abuse.”

Divided into thematic areas, the book uses recent research findings to update some of the theoretical analysis and to highlight areas of good practice: what works and why.

An effective investigation and the prosecution of offenders are considered, as well as an evaluation of the success of current treatment options.

Domestic abuse can only be dealt with effectively through a partnership response. The book outlines the responsibilities of each agency and the statutory processes in place when policy is not adhered to.

Core content includes:

  • A critique of definitions and theoretical approaches to domestic abuse, including coverage of the myths surrounding domestic abuse and their impact on policing.
  • An exploration of the challenges of collecting data on domestic abuse, looking at police data and the role of health and victim support services.
  • A critical review of different forms of abuse, different perpetrators and victims, and risk assessment tools used by the police.
  • A critical examination of the law relating to domestic abuse; how police resources are deployed to respond to and manage it; and best practice in investigation, gathering evidence, and prosecution.
  • Key perspectives on preventing domestic abuse, protecting victims, and reducing harm.

Written with the student and budding practitioner in mind, the book is filled with case studies, current research, reports, and media examples, as well as a variety of reflective questions and a glossary of key terms, to help shed light on the challenges of policing domestic violence and the links between academic research and best practice.

Dr Weir said:

“Eighteen per cent of crimes recorded by the police are domestic abuse related, but we know that this is just the tip of the iceberg, with only one fifth of those experiencing abuse reporting it to the police.

“We know that domestic abuse training is currently patchy and that the experiences that the general public have are variable. The book is therefore aimed at new and existing officers, and we hope that it will become a core part of police officer training.

“Next year one third of police officers will be new recruits, so this is a critical time to introduce the book into the policing curriculum.”

The authors hope that the book will achieve the following outcomes:

  • Better-trained police officers who are more competent to respond to domestic abuse, thereby increasing public confidence in the police
  • Officers who are more confident in holding perpetrators accountable and in recognising, understanding and supporting victims
  • Officers who reflect on previous failings to avoid repeating them
  • Officers who work with other agencies to deliver coordinated solutions
  • Greater public confidence in reporting
  • Recognition that domestic abuse is experienced by those within policing organisations too
  • A more consistent and higher-quality response for victims
  • Earlier detection and reduced harm from domestic abuse.

Dr Weir has written the book alongside three co-authors:

Katy Barrow-Grint is a Chief Superintendent in Thames Valley Police with over 20 years of police experience. She has an academic interest in domestic abuse, completing her master’s at Warwick Business School, and has written on domestic abuse attrition rates in the criminal justice system.

Jacqueline Sebire is an Assistant Chief Constable with Bedfordshire Police. She has 30 years of police service specialising as a detective in homicide and safeguarding and public protection. She has a PhD in Psychology from the University of Leicester researching the risk factors associated with domestic abuse homicide.

Professor Jackie Turton is an Emeritus Professor in the Department of Sociology at University of Essex. She joined the university after a career in the health service and has taught sociology and criminology since 1996. Family violence is a key focus of her research work.

ENDS

Ancient genomes reveal hidden history of human adaptation

Peer-Reviewed Publication

UNIVERSITY OF ADELAIDE

The use of ancient DNA, including samples of human remains around 45,000 years old, has shed light on a previously unknown aspect of human evolution.

Dr Yassine Souilmi, Group Leader at the University of Adelaide’s Australian Centre for Ancient DNA, co-led the new study published in Nature Ecology and Evolution.

“It was widely believed the genetics of our human ancestors didn’t change due to environmental pressures as much as other animals, due to our enhanced communication skills and ability to make and use tools,” Dr Souilmi said.

“However, by comparing modern genomes with ancient DNA, we discovered more than 50 cases of an initially rare beneficial genetic variant becoming prevalent across all members of ancient human groups.

“In contrast to many other species, evidence for this type of adaptive genetic change has been inconsistent in humans. This discovery consequently challenges the prevailing view of human adaptation, and gives us a new and exciting insight into how humans have adapted to the novel environmental pressures they encountered as we spread across the planet.”

Co-lead author Dr Ray Tobler, an Adjunct Fellow at the University of Adelaide and a DECRA fellow at the Australian National University, said examining ancient DNA has been critical in unlocking the secrets of human evolution.

“We believed historical mixing events between human groups might have hidden signs of genetic changes in modern human genomes,” Dr Tobler said.

“We examined DNA from more than 1,000 ancient genomes, the oldest of which was around 45,000 years old, to see if certain types of genetic adaptation had been more common in our history than studies of modern genomes had suggested.”

Professor Christian Huber, a senior author of the research paper, is an Adjunct Fellow at the University of Adelaide and an Assistant Professor at Penn State University.

“The use of ancient genomes was crucial because they preceded major historical mixing events that have radically reshaped modern European genetic ancestry,” Professor Huber said.

“This allowed the recovery of historical signs of adaptation that are invisible to standard analysis of modern genomes.”

Established in 2005, the Australian Centre for Ancient DNA is a world leader in the research and development of advanced ancient DNA approaches for evolutionary, environmental and conservation applications.

Researchers based at the Mayo Clinic, the Garvan Institute of Medical Research, the Max Planck Institute for the Science of Human History in Germany, the University of New South Wales, and Massey University in New Zealand also contributed to the research paper.

Geothermal could become workhorse of the energy transition

Climate tech panelists discuss field’s huge potential and challenges

Reports and Proceedings

SCIENCE COMMUNICATIONS

SOSV Climate Tech Summit 2022 panelists 

IMAGE: PARTICIPANTS IN "IS THIS GEOTHERMAL'S MOMENT," A PANEL DISCUSSION AT THE SOSV CLIMATE TECH SUMMIT 2022.

CREDIT: SOSV

CAMBRIDGE, MA--Geothermal energy—the heat deep below our feet—has the potential to become the workhorse of the energy transition as it grows from supplying just 0.4 percent of the world’s energy today to upward of 20, 30, and even 50 percent by 2050. That’s according to the co-founder and CEO of a company working to tap the mother lode of that energy, who presented at the SOSV Climate Tech Summit 2022, held October 25-26. SOSV is a global venture capital firm.

Carlos Araque of Quaise Energy made those remarks during a panel discussion titled “Is this geothermal’s moment?” Quaise is developing a unique drilling technique to reach the hot rock some two to 12 miles beneath the Earth’s surface. Araque was joined by Kathy Hannun, co-founder and president of Dandelion Energy, a firm working on a different “flavor” of geothermal: using it to heat and cool homes today. The Dandelion process uses established technology that doesn’t require such deep drilling.

“I didn’t know much about geothermal until I started diving into [it] for this panel,” said moderator Candice Ammori, founder of The Climate Vine, which advises climate tech startups. “I’m excited to say now that I’m a believer in geothermal. I think there’s a fair amount of hype…but I actually think that there probably should be more hype.”

Araque and Hannun went on to describe not only the biggest barriers to scaling up their businesses, but also the other geothermal problems they are “itching for people to solve,” as Ammori put it.

First, however, the two described why geothermal could be so important. In addition to being clean and global, geothermal provides a baseload energy source that’s available 24/7, even if it’s cloudy out or there’s no wind. It’s also “the most powerful and abundant renewable on Earth,” said Araque, “much more so than wind, solar, nuclear, and all fossil fuels combined.”

In addition, Araque said, it’s important to weigh an energy source by its impact on externalities like the environment, land use, and mineral use. “When you look at the [problem] from this lens—how much land use per unit of energy you produce, the amount of materials necessary per unit of energy, and how much carbon dioxide you produce per unit of energy—you start realizing that geothermal comes out way, way ahead of anything else.”

Barriers and Solutions

To fully tap the resource, however, will be very capital intensive and time intensive. “It’s very hard to achieve anything in our space with a million dollars or even $10 million,” Araque said. “You have to start playing at the $100 million level or even $1 billion level. This is what it costs to get [deep geothermal] developed and deployed at portfolio levels.”

Further, the Quaise technology involved in deep drilling has been demonstrated in the lab, but not yet in the field. And that will take time.

However, Araque said that by the end of the decade Quaise aims to create power from a coal- or gas-fired power plant that has been converted to geothermal. “You feed in geothermal steam instead of steam from a fossil-fuel boiler. That in a brushstroke decarbonizes the power plant, and you can repeat that 10,000 times over with other plants.”

The key to making deep geothermal a reality? “You leverage the oil and gas industry,” said Araque, who himself comes from that industry. “I think of them as a ready-made workforce, supply chain, and regulatory framework that can push this into the world at the scale that’s required.”

Hannun noted that for Dandelion, simplifying complexity will be key to bringing down the costs associated with using geothermal for heating and cooling of residential homes. “It’s hard to advance our building stock and change all of the buildings that already exist [to geothermal because] they’re all slightly different and there’s a lot of complexity to manage. So a lot of our focus is on making geothermal [heat pumps] as simple to get into homes as it is to install a furnace or air conditioner.”

Room for Entrepreneurs!

Ammori ended the session by asking Hannun and Araque about remaining geothermal challenges that other entrepreneurs could tackle. Both agreed that better imaging systems to see underground are important. For deep geothermal, Araque said that there’s a need for electronics that can withstand the high temperatures associated with the resource. Hannun noted that anything related to weatherizing homes will help the geothermal heating and cooling industry.

Speaking for both her industry and Araque’s, she also stressed: “I would encourage entrepreneurs not to just look at the central core technology, but also the enabling technologies, products, or businesses around permitting, licensing, and transmission. There are [many] things in the ecosystem that need to happen to enable scale.”

Araque concluded by noting that the energy transition itself is an unsolved problem. “Don’t for a second think that it’s just a matter of scaling what we have. There’s plenty of space for innovation. This is the greatest challenge of many generations, not just ours, and we need all human capital on the problem.”

Watch “Is this geothermal’s moment?” | SOSV Climate Tech Summit 2022


Efficiency and stability best-practices proposed for solar water-splitting to make hydrogen

NREL, Berkeley Lab offer ideal methods to provide confidence in comparing measurements

Peer-Reviewed Publication

DOE/NATIONAL RENEWABLE ENERGY LABORATORY

Scientists from the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) and Lawrence Berkeley National Laboratory (Berkeley Lab) are providing researchers with a guide on how best to measure the efficiency of producing hydrogen directly from solar power.

Photoelectrochemical (PEC) water-splitting, which relies on sunlight to split water into its component elements—oxygen and hydrogen—stands out as potentially one of the most sustainable routes to clean energy. However, measurements of the efficiency of the PEC process on an identical system can vary wildly between laboratories because of a lack of standardized methods. The newly developed best-practices guide, published in Frontiers in Energy Research, is intended to provide confidence in comparing results obtained at different sites and by different groups.

The publication provides a road map for the PEC community as researchers continue to refine the technology. These best practices were verified by both laboratories via round-robin testing using the same testing hardware, PEC photoelectrodes, and measurement procedures. Research into photovoltaics has allowed a certification of cell efficiencies, but PEC water-splitting efficiency measurements do not yet have a widely accepted protocol.

“It’s really difficult to compare reported PEC water-splitting efficiency results between labs, because people tend to make measurements under different conditions,” said Todd Deutsch, a senior scientist at NREL and co-author of the new journal article, “Best practices in PEC: How to reliably measure solar-to-hydrogen efficiency of photocathodes.” “The Department of Energy recognized this a while ago, so there have been quite a few efforts to establish standards that we’ve been involved in—multi-lab collaborative efforts and also NREL-specific efforts.”

Other authors from NREL are Keenan Wyatt, Myles Steiner, and James Young.

"The motivation for this protocol paper was both to serve as a guide for researchers just entering the field as well as describing subtle technique tips for more experienced scientists,” said Francesca Toma, a materials staff scientist at Berkeley Lab and a co-author of the journal article. “We leveraged the unique strengths of two national labs that together span the basic to applied science realms.”

Other co-authors from Berkeley Lab are Olivia Alley, Guiji Liu, Tobias Kistler, David Larson, and Jason Cooper.

The article spells out a path toward uniform experimental practices across laboratories, beginning with the materials needed for the fabrication of photoelectrodes. The authors go on to detail the fabrication procedure, the experimental setup, and the process to measure the solar-to-hydrogen (STH) efficiency. Direct measurement of the amount of hydrogen generated by PEC water-splitting is required for an accurate characterization of STH efficiency, the researchers noted.
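For orientation, the commonly used definition of STH efficiency can be sketched as follows (this is the standard textbook formula, not a restatement of the paper’s full protocol, and the example numbers are assumptions):

```python
# Standard STH definition: operating photocurrent density |j| (mA/cm^2)
# times the 1.23 V thermodynamic water-splitting potential, times the
# Faradaic efficiency for hydrogen, divided by the incident solar power
# density (100 mW/cm^2 under AM1.5G illumination).
def sth_efficiency(j_ma_cm2: float, faradaic_eff: float = 1.0,
                   p_in_mw_cm2: float = 100.0) -> float:
    """Return solar-to-hydrogen efficiency in percent."""
    return abs(j_ma_cm2) * 1.23 * faradaic_eff / p_in_mw_cm2 * 100.0

# Example (assumed values): ~13.2 mA/cm^2 at unity Faradaic efficiency
# gives ~16.2% STH, the same scale as the records discussed in this article.
print(f"{sth_efficiency(13.2):.1f}%")
```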

PEC water-splitting was first noted in scientific publications in 1972. Research since then has continued to refine and improve the process, but so far, no standardized STH measurement procedures have been established. NREL set the first record for STH efficiency exceeding 10% (12.4% STH) in 1998, but in 2016 revised that figure downward in a publication that described common pitfalls to avoid in making efficiency measurements, after realizing the original experiment had been over-illuminated. In 2017, the team used bandgap engineering to design light absorbers better matched to the solar spectrum, which resulted in a higher STH of 16.2%, a new world record at the time.

The U.S. Department of Energy’s Hydrogen and Fuel Cell Technologies Office has set 25% as the ultimate target for STH through PEC water-splitting, although preliminary cost analysis suggests that competitively costed hydrogen could be achieved with lower efficiencies. Photoelectrodes have demonstrated efficiencies from 10% to 20%.

PEC researchers also continue to work on improvements to durability. The semiconductor used to capture sunlight is immersed in an aqueous (water-based) electrolyte. But whether that electrolyte is acidic or alkaline, it corrodes the semiconductor and shortens its lifespan.

“Durability still is pretty much a showstopper for this technology,” Deutsch said. “There’s been some progress, but not nearly as much as there has been recently in improving efficiency.”

Deutsch co-authored a new paper, also in Frontiers in Energy Research, “Long-Term Stability Metrics of Photoelectrochemical Water Splitting,” which describes simultaneously achieving highly efficient and stable unassisted PEC water-splitting as the “Holy Grail” of clean and renewable fuel generation. That paper provides a framework for conducting long-term stability experiments with the hope of achieving ultrahigh stability (lasting more than 10,000 hours) and efficiency greater than 15%.

HydroGEN Advanced Water Splitting Materials Consortium, established as part of the Energy Materials Network under the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy's Hydrogen and Fuel Cell Technologies Office, funded the research.

NREL is the U.S. Department of Energy's primary national laboratory for renewable energy and energy efficiency research and development. NREL is operated for the Energy Department by the Alliance for Sustainable Energy, LLC.

Pitt mathematicians explain how some fireflies flash in sync

Peer-Reviewed Publication

UNIVERSITY OF PITTSBURGH

Stake out Pennsylvania’s Cook State Forest at the right time of year and you can see one of nature’s great light shows: swarms of fireflies that synchronize their flashes like strings of Christmas lights in the dark.

A new study by Pitt mathematicians shows that math borrowed from neuroscience can describe how swarms of these unique insects coordinate their light show, capturing key details about how they behave in the wild.

“This firefly has a quick sequence of flashes, and then a big pause before the next burst,” said Jonathan Rubin, professor and chair of the Department of Mathematics in the Kenneth P. Dietrich School of Arts and Sciences. “We knew a good framework for modeling this that could capture a lot of the features, and we were curious how far we could push it.”

Male fireflies produce a glow from their abdomens to call out to potential mates, sending out blinking patterns in the dark to woo females of their own species. Synchronous fireflies of the species Photinus carolinus take it a step further, coordinating their blinking throughout entire swarms. It’s a rare trait — there are only a handful of such species in North America — and the striking lights they produce draw crowds to locations where the insects are known to gather.

They’ve also attracted the interest of mathematicians seeking to understand how they synchronize their blinks. It’s just one example of how synchronization can evolve from randomness, a process that has intrigued mathematicians for centuries. One famous example from the 1600s showed that pendulum clocks hung next to one another synchronize through vibrations that travel through the wall, and the same branch of math can be used to describe everything from the action of intestines to audience members clapping.

“Synchrony is important for a lot of things, good and bad,” said co-author Bard Ermentrout, distinguished professor of mathematics in the Dietrich School. “Physicists, mathematicians, we’re all interested in synchronization.”

To crack the fireflies’ light show, the Pitt team used a more complex model called an “elliptic burster” that’s used to describe the behavior of brain cells. The duo, along with then-undergrad Madeline McCrea (A&S ’22), published details of their model Oct. 26 in the Journal of the Royal Society Interface.

The first step was to simulate the blinks of a single firefly, then expand to a pair to see how they matched their flashing rates to one another. Next, the team moved to a bigger swarm of simulated insects to see how number, distance and flying speed affect the resulting blinks.

Varying the distance over which fireflies could “see” and respond to one another changed the insects’ light show, they found: by tweaking the parameters, they could produce patterns of blinks that looked like either ripples or spirals.
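The basic mechanism behind such models can be conveyed with a far simpler stand-in. The toy below uses a classic Kuramoto phase-oscillator model, not the elliptic-burster framework from the paper, and every parameter value is an assumption for illustration; it shows only the core effect that mutually coupled oscillators starting at random phases drift into sync:

```python
import numpy as np

# Toy Kuramoto model: n oscillators, each nudged toward the swarm's
# mean phase. All values below are illustrative assumptions.
rng = np.random.default_rng(1)
n, coupling, dt, steps = 50, 1.5, 0.01, 5000
omega = rng.normal(2 * np.pi, 0.1, n)   # natural flash rates (rad/s)
theta = rng.uniform(0, 2 * np.pi, n)    # random initial flash phases

def order_parameter(phases):
    """|mean of exp(i*theta)|: 0 = incoherent flashing, 1 = perfect sync."""
    return float(abs(np.mean(np.exp(1j * phases))))

r0 = order_parameter(theta)
for _ in range(steps):
    z = np.mean(np.exp(1j * theta))     # mean field of the swarm
    # each oscillator feels a pull toward the swarm's mean phase
    theta += dt * (omega + coupling * np.imag(z * np.exp(-1j * theta)))
r_final = order_parameter(theta)
print(f"coherence: start {r0:.2f} -> end {r_final:.2f}")
```

With the coupling set well above the synchronization threshold, the order parameter climbs from near zero toward one, mirroring how a swarm of independently flashing insects settles into a common rhythm.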

The results line up with several recently published observations about real-life synchronous fireflies — for instance, that individual fireflies are inconsistent while groups flash more regularly, and that when new fireflies join the swarm, they’re already perfectly in time.

“It captured a lot of the finer details that they saw in the biology, which was cool,” said Ermentrout. “We didn't expect that.”

The math also makes some predictions that could inform firefly research — for instance, light pollution and the time of day both may alter the patterns produced by fireflies by changing how well they can see one another’s blinks.

McCrea worked on the research as an undergraduate supported by the department’s Painter Fellowship, which gave her funding to work on the project through the summer. “She was awesome working on this project, and really persistent,” said Rubin.

The team is the first to use this particular brain-cell framework to model fireflies, which several different research teams are trying to understand using different types of math. “It’s more of a wild west research topic,” said Ermentrout. “It’s early days, and who knows where things are going to go from here?”

Ermentrout and Rubin are also hopeful that the math will capture the imagination of those inspired by the glow of fireflies. In the midst of the project, Rubin himself decided to head up to Cook State Forest to see if he could spot his research subjects firsthand.

“I convinced my wife to go on a trip for a couple of days right in the peak of the season,” he said. “It’s not clear we ever saw synchronized activity, but there were all sorts of fireflies around us. It was amazing.”

Study finds high prevalence of COVID-19 and flu co-infections during 2021-2022 flu season

Differences discovered in co-infection prevalence between Delta and Omicron variants

Peer-Reviewed Publication

UNIVERSITY OF MISSOURI-COLUMBIA

Researchers from the University of Missouri School of Medicine have discovered a high prevalence of COVID-19 and flu co-infections in central Missouri during the 2021-2022 flu season, with a monthly co-infection rate as high as 48% among individuals with COVID-19.

The findings come from 462 patients at University of Missouri Health Care who tested positive for COVID-19 and were subsequently tested for influenza. Of those patients, 33% also tested positive for the flu.

“Co-infection in our samples peaked in October 2021 at 48% when the Delta variant was dominant, and reached the lowest point at 7.1% in January 2022 when the Omicron variant prevailed,” said senior author Henry Wan, PhD, professor of Molecular Microbiology and Immunology, Veterinary Pathobiology, Electrical Engineering and Computer Science. Wan also directs the NextGen Center for Influenza and Emerging Infectious Diseases and is a primary investigator at the Bond Life Sciences Center. 

Of the 462 patients infected with COVID-19, 51% had the Delta variant, while 38% had the Omicron variant. Those who were infected with the Omicron variant and those who received at least one influenza vaccine during the 2020-2022 influenza seasons were less likely to become co-infected with both the flu and COVID-19. They were also less likely to become hospitalized. 

“Despite low flu vaccine effectiveness for the 2021-2022 season, which was estimated at less than 16%, our study highlights the importance of influenza vaccinations, as they appear to not only offer some protection against influenza infections but importantly, against COVID-19 and flu co-infections,” Wan said.  

Future studies involving a wider geographic area and a more diverse population will be needed to provide more clarity on the prevalence of overall flu and COVID-19 co-infections and the effectiveness of both COVID-19 and flu vaccines.  

“Testing for both flu and COVID-19 viruses in patients experiencing symptoms of respiratory illness and vaccinations against both viruses should continue to be encouraged,” Wan said. 

In addition to Wan, the study authors include MU colleagues Cynthia Tang, an MD/PhD student; Maria Boftsi, PhD, a postdoctoral research fellow; Lindsay Staudt, a medical student and research specialist; Sabrina Duong, a medical student; Jane McElroy, PhD, professor of family and community medicine; Adrienne Ohler, PhD, associate research professor; Detlef Ritter, MD, professor of clinical pathology; and Richard Hammer, MD, professor of pathology.

Their study, “SARS-CoV-2 and influenza co-infection: A cross-sectional study in central Missouri during the 2021-2022 influenza season,” was recently published in the journal Virology.

This work was supported by the National Institutes of Health. The authors disclose no conflicts of interest. 

CityU researchers invent smart mask to track respiratory sounds for respiratory disease identification

Peer-Reviewed Publication

CITY UNIVERSITY OF HONG KONG

CityU smart mask

VIDEO: A nanocomposite-sensor integrated smart mask allows the detection of a wide-bandwidth dynamic pressure range with high sensitivity for tracking multiphase respiratory activity.

CREDIT: Wiley Editing Services / DOI: 10.1002/advs.202203565

Wearing face masks has been recognised as one of the most effective ways to prevent the spread of COVID-19, even as the disease enters an endemic phase. Beyond the conventional function of masks, the potential of smart masks to monitor human physiological signals is being increasingly explored. A research team led by the City University of Hong Kong (CityU) recently invented a smart mask that integrates an ultrathin soundwave sensor based on a nanocomposite sponge structure and is capable of detecting the respiratory sounds of breathing, coughing and speaking.

Using machine-learning algorithms and a highly sensitive soundwave sensor operable across a wide bandwidth, the smart mask opens new avenues for identifying respiratory diseases and for use as a voice-interaction tool. This ultra-lightweight wearable technology also has the potential to improve personal and public health by enabling prolonged, systematic respiratory health monitoring in daily life.

A research team led by Professor Li Wenjung, Chair Professor in the Department of Mechanical Engineering (MNE), Professor Wang Jianping, Professor in the Department of Computer Science (CS), and Dr Yu Xinge, Associate Professor in the Department of Biomedical Engineering (BME) at CityU, recently developed this smart mask, which can detect and distinguish multiple respiratory actions. Professor Shen Jiangang’s team from the School of Chinese Medicine of The University of Hong Kong also made an important contribution to the project. The findings were published in Advanced Science under the title “Wide-bandwidth nanocomposite-sensor integrated smart mask for tracking multiphase respiratory activities”.

Importance of wearing masks even if COVID-19 becomes endemic

“Many countries now believe that COVID-19 will soon become endemic,” said Professor Li. “However, we must set aside optimism and be realistic about the likely levels of illness, disability and death associated with this disease in the coming years. It is important to remember that endemicity does not correspond to harmlessness.” He used malaria as an example to illustrate that even though it is currently considered endemic in 87 countries, in 2020, it infected an estimated 241 million people and caused 627,000 deaths, according to the World Health Organization. Thus, he suggested that people should continue to be cautious about COVID-19 and use available and proven measures, including masks, to control the spread of the virus.

“This smart mask utilises our self-developed, high-sensitivity, wide-bandwidth flexible sensor that can detect and record daily human respiratory activity, such as breathing, coughing and speaking for cloud data storage,” explained Professor Li.

The smart mask developed by the team has a sponge-like structure as thin as 400 μm, fabricated with carbon nanotube and polydimethylsiloxane (CNT/PDMS) materials, using the team’s novel modified sacrificial-release technique. The ultra-thin, lightweight sensor can be practically integrated and work effectively with both rigid masks and deformable non-woven fabric masks.

Good performance in static and dynamic pressure

The research team recruited 31 participants and recorded their respiratory activity while they wore the smart mask. The findings showed that the acoustic wave sensor was highly sensitive in measuring both static and dynamic pressure. Besides performing well in the static pressure range of 27.9 Pa to 2.5 kPa, the sensor also responded to high-frequency dynamic pressure generated by the human voice, i.e., harmonic acoustic energy up to 4000 Hz. In addition, the sensor can sense air movement, including directional flow and vibration. These findings suggest that the sensor could be used to detect human respiratory activity by integrating it with a commercial polycarbonate mask. The tests also demonstrated that the smart mask could detect and differentiate three common respiratory activities: breathing, coughing and speaking.

“Advanced artificial intelligence technology enables the integrated mask to recognise different coughing and breathing patterns automatically, indicating its potential use to diagnose respiratory-related diseases in the future,” said Professor Wang. “Presently, researchers use commercial sensors to detect temperature changes and airflow to count the number of coughs, but they cannot capture important physiological information contained in the human voice, coughing and breathing. Our smart mask is sensitive to both subtle air pressure and high-frequency vibrations and can detect three phases of coughing,” added Professor Li.
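The release does not describe the team’s classification pipeline in detail. Purely as an illustration, and not the paper’s method, the sketch below (the sampling rate, synthetic signals and feature set are all assumptions) shows the general idea of distinguishing breathing, coughing and speaking from a pressure-sensor waveform: extract a few simple spectral features, then classify by nearest class centroid.

```python
import numpy as np

FS = 4000  # Hz; assumed sampling rate (the article cites sensor response up to 4000 Hz)

def synth(kind, rng, n=FS):
    """Generate one second of synthetic sensor signal for a respiratory activity."""
    t = np.arange(n) / FS
    if kind == "breathing":          # slow, narrow-band airflow oscillation
        return np.sin(2 * np.pi * 2 * t) + 0.05 * rng.standard_normal(n)
    if kind == "coughing":           # short broadband burst over a quiet baseline
        sig = 0.05 * rng.standard_normal(n)
        sig[n // 4 : n // 4 + n // 10] += rng.standard_normal(n // 10)
        return sig
    # "speaking": a few voice-like harmonics
    sig = sum(np.sin(2 * np.pi * f * t) for f in (200, 400, 600))
    return sig + 0.05 * rng.standard_normal(n)

def features(sig):
    """Spectral centroid, RMS energy and zero-crossing rate of a waveform."""
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / FS)
    centroid = (freqs * spec).sum() / spec.sum()
    rms = np.sqrt(np.mean(sig ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(sig)))) / 2
    return np.array([centroid, rms, zcr])

rng = np.random.default_rng(0)
labels = ["breathing", "coughing", "speaking"]
train = {k: np.array([features(synth(k, rng)) for _ in range(30)]) for k in labels}

# Standardise the features, then represent each class by its mean feature vector.
allX = np.vstack(list(train.values()))
mu, sd = allX.mean(axis=0), allX.std(axis=0)
centroids = {k: ((v - mu) / sd).mean(axis=0) for k, v in train.items()}

def classify(sig):
    f = (features(sig) - mu) / sd
    return min(centroids, key=lambda k: np.linalg.norm(f - centroids[k]))

print(classify(synth("coughing", rng)))
```

A real system would use richer features (e.g., mel-frequency cepstra of actual sensor recordings) and a trained model; this sketch only illustrates the feature-extraction-plus-classification pattern that such a pipeline follows.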

The team aims to eventually develop real-time diagnostic algorithms for applications such as pneumoconiosis symptom assessment. “As a potentially low-cost, daily smart wearable device, this new IoT smart mask will support personal and public health management through respiratory disease screening and diagnosis, especially in densely populated cities like Hong Kong,” said Dr Yu. The speech-detection ability of the smart mask can also help resolve the sound attenuation caused by wearing masks.

The first co-authors of the paper are Miss Suo Jiao, Mr Liu Yifan and Dr Wu Cong, all of whom are Professor Li’s students. Corresponding co-authors include Dr Yu, Professor Wang and Professor Li from CityU. Other team members from CityU include Dr Walid Daoud and Dr Yang Zhengbao from the MNE and Dr Li Xinyue from the School of Data Science.

The research was supported by the Shenzhen Municipality Science and Technology Innovation Commission, the Hong Kong Research Grants Council, and the Hong Kong Centre for Cerebro-cardiovascular Health Engineering.