Wednesday, January 17, 2024

 

Nonpharmaceutical interventions saved lives and eased burdens during COVID’s first wave, new study shows


James Peters and Mohsen Farhadloo say masking, shelter-in-place and other measures reduced growth rates of deaths, case numbers and hospitalizations in early 2020


Peer-Reviewed Publication

CONCORDIA UNIVERSITY

Image: Mohsen Farhadloo (left) and James Peters: “When you scale these numbers up to the millions, these measures could be preventing hundreds or thousands of deaths.” (Credit: Concordia University)





The measures world governments enacted at the outset of the COVID-19 pandemic in early 2020 remain a source of controversy for policy experts, researchers and media commentators. Some research maintains that they did little to cut down mortality rates or halt the virus’s spread.

However, a new study by Concordia PhD student James Peters and assistant professor Mohsen Farhadloo in the Department of Supply Chain and Business Technology Management at the John Molson School of Business says otherwise.

According to Peters and Farhadloo, some of these studies fail to account for other measures of the interventions’ effectiveness, such as decreases in hospitalizations and in the overall number of cases. Others overlooked data from separate time frames after implementation, essentially taking a snapshot of a situation and extrapolating conclusions from it.

Writing in the journal AJPM Focus, Peters and Farhadloo note that nonpharmaceutical interventions were in fact effective at reducing the growth rates of deaths, cases and hospitalizations during the pandemic’s first wave.

The researchers say they hope that their findings will dispel some falsehoods that continue to circulate to this day.

Small numbers have a big effect

The researchers conducted a systematic literature review of 44 papers from three separate databases that used data from the first six months of the pandemic. They concentrated on this timeframe because, by fall 2020, the second wave had emerged and governments and individuals had changed their behaviours, having had time to adapt to the measures.

Peters and Farhadloo harmonized the various metrics used across the papers and divided the different kinds of measures into 10 categories. They then assessed each category’s effectiveness against case numbers, hospitalizations and deaths at two weeks, at three to four weeks, and at more than four weeks after implementation.
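The core statistical operation behind such a meta-analysis is pooling per-study effect estimates, weighted by their precision. A minimal fixed-effect, inverse-variance sketch with invented numbers (the authors’ actual harmonization across heterogeneous metrics is more involved):

```python
import numpy as np

# Toy inverse-variance (fixed-effect) pooling of per-study effect estimates,
# the basic operation behind a meta-analysis. All values are invented.
effects = np.array([-2.1, -3.4, -2.8])   # per-study estimated reductions
ses = np.array([0.9, 1.2, 0.7])          # per-study standard errors

weights = 1.0 / ses**2                    # more precise studies weigh more
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"Pooled effect: {pooled:.2f} +/- {1.96 * pooled_se:.2f} (95% CI half-width)")
```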

Among other results, the researchers found that:

  • Masks were associated with decreases in cases and deaths.
  • Closing schools and businesses resulted in lower per capita deaths, but those effects decreased after four weeks.
  • Restaurant/bar closures and travel restrictions corresponded to decreases in mortality after four weeks.
  • Shelter-in-place orders (SIPOs) resulted in fewer cases but only after a delay of two weeks.
  • SIPOs and mask wearing were associated with reducing the healthcare burden.
  • Policy stringency, SIPOs, mask wearing, limited gatherings and school closures were associated with reduced mortality rates and slower case number growth rates.

“We found that wearing masks led to an estimated reduction of about 2.76 cases and 0.19 deaths per 100,000 people. These effects sound small but are statistically significant,” Peters explains.

“When you scale these numbers up to the millions, these measures could be preventing hundreds or thousands of deaths.”
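The scaling in that quote is simple proportionality. A quick illustration using only the per-100,000 figures reported above (the population sizes are arbitrary):

```python
# Scaling the per-100,000 reductions quoted above to larger populations.
CASES_PER_100K = 2.76   # estimated reduction in cases per 100,000 people
DEATHS_PER_100K = 0.19  # estimated reduction in deaths per 100,000 people

for population in (1_000_000, 10_000_000, 100_000_000):
    factor = population / 100_000
    print(f"{population:>11,}: ~{factor * CASES_PER_100K:,.0f} cases, "
          f"~{factor * DEATHS_PER_100K:,.0f} deaths prevented")
# 1,000,000: ~28 cases, ~2 deaths prevented
# 10,000,000: ~276 cases, ~19 deaths prevented
# 100,000,000: ~2,760 cases, ~190 deaths prevented
```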

Farhadloo adds that understanding the usefulness of these measures can help counter the growth of misinformation online.

“We started this project in 2022, while COVID health measures were still in place. At that time, some people were citing research saying that these measures were not effective. But the scientific research articles they were referring to were flawed.

“We wanted to respond to the existing misinformation and disinformation that was being disseminated on social media by raising awareness about it.”

Peters believes that the paper, which looks at effectiveness over a longer time span than most previous studies, can inform policy makers in the future.

“If and when another pandemic occurs, we should be more prepared. We should know which policies are most effective at mitigating not only mortality but cases and hospitalizations as well.”

Read the cited paper: “The Effects of Nonpharmaceutical Interventions on COVID-19 Cases, Hospitalizations, and Mortality: A Systematic Literature Review and Meta-analysis.”

 

Cannabis activates specific hunger neurons in brain



Peer-Reviewed Publication

WASHINGTON STATE UNIVERSITY


PULLMAN, Wash. – While it is well known that cannabis can cause the munchies, researchers have now revealed a mechanism in the brain that promotes appetite in a set of animal studies at Washington State University.

The discovery, detailed in the journal Scientific Reports, could pave the way for refined therapeutics to treat appetite disorders faced by cancer patients as well as anorexia and potentially obesity.

After exposing mice to vaporized Cannabis sativa, researchers used calcium imaging technology, which is similar to a brain MRI, to determine how their brain cells responded. They observed that, when the rodents anticipated and consumed palatable food, cannabis activated a set of cells in the hypothalamus that were not activated in unexposed mice.

“When the mice are given cannabis, neurons come on that typically are not active,” said Jon Davis, an assistant professor of neuroscience at WSU and corresponding author on the paper. “There is something important happening in the hypothalamus after vapor cannabis.”

Calcium imaging has been used to study the brain’s reactions to food by other researchers, but this is the first known study to use it to understand those features following cannabis exposure.
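For readers unfamiliar with the technique: calcium imaging infers neural activity from changes in a fluorescence signal, conventionally summarized as a ΔF/F ratio. A minimal illustration of that calculation with made-up data; this is not the study’s analysis pipeline:

```python
import numpy as np

# Toy fluorescence traces for 5 "neurons" over 100 frames (made-up data).
rng = np.random.default_rng(0)
traces = rng.normal(loc=100.0, scale=2.0, size=(5, 100))
traces[2, 40:60] += 30.0  # simulate one cell becoming active mid-recording

# dF/F: change in fluorescence relative to a per-cell baseline
# (here, the 20th percentile of each trace).
baseline = np.percentile(traces, 20, axis=1, keepdims=True)
dff = (traces - baseline) / baseline

# Call a cell "active" if its dF/F ever exceeds an arbitrary threshold.
active = dff.max(axis=1) > 0.1
print("Active cells:", np.flatnonzero(active))  # -> [2]
```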

As part of this research, the researchers also determined that the cannabinoid-1 receptor, a known cannabis target, controlled the activity of a well-known set of “feeding” cells in the hypothalamus, called Agouti Related Protein neurons. With this information, they used a “chemogenetic” technique, which acts like a molecular light switch, to home in on these neurons when animals were exposed to cannabis. When these neurons were turned off, cannabis no longer promoted appetite.

“We now know one of the ways that the brain responds to recreational-type cannabis to promote appetite,” said Davis.

This work builds on previous research on cannabis and appetite from Davis’ lab, which was among the first to use whole vaporized cannabis plant matter in animal studies instead of injected THC—in an effort to better mimic how cannabis is used by humans.  In the previous work, the researchers identified genetic changes in the hypothalamus in response to cannabis, so in this study, Davis and his colleagues focused on that area.

The current research received support from the Alcohol and Drug Abuse Research Program, the National Institute on Alcohol Abuse and Alcoholism, and the U.S. Department of Agriculture as well as by funds provided by the state of Washington Initiative Measure No. 171.

 

Calibrating from the cosmos


A newly funded calibration service will take LuSEE-Night to ultimate precision, paving the way for next-generation radio cosmology experiments


Business Announcement

DOE/BROOKHAVEN NATIONAL LABORATORY





A unique “passenger” is joining an upcoming mission to the moon.

In 2026, physicists are planning to operate a radio telescope on the far side of the moon—an unforgiving environment that poses tremendous challenges for research equipment to survive, but one that also promises enormous scientific payoff. Called LuSEE-Night, the project aims to access lingering radio waves from the universe’s ancient past, peering into an era of the cosmos that has never been observed before.

Now, thanks to new funding from NASA, the project has added a state-of-the-art calibrator to the mission. This calibrator will not only ensure measurements from LuSEE-Night are accurate but also set the stage for more sophisticated telescopes to reside beyond Earth.

A cosmologist’s dream calibrator

All telescopes require calibration—a system of assessing the quality and wavelengths of light they collect—but calibrating LuSEE-Night is a substantial challenge.

First, the range of calibration techniques that can be applied to LuSEE-Night is far more limited than what is available for optical telescopes. Optical telescopes can move; they can focus on a star, look away, and then move back. By collecting measurements of known and unknown celestial objects, scientists can compare the two as a method of calibration. LuSEE-Night, on the other hand, will be completely stationary, operating with fixed antennas that “view” the entire sky at once.

So, how do scientists typically calibrate radio telescopes? They move the signal, not the telescope.

For traditional, ground-based radio telescopes, scientists have often tried to send a point source, usually an artificial radio source mounted on a drone, above the telescope. As the drone crisscrosses the sky over the telescope, scientists can observe how the telescope responds and calibrate the instrument accordingly. But the way drones move, and the chance that they will be blown off course by the wind, makes it challenging to capture precise measurements. Not only is this level of precision a necessity for a far-off lunar telescope like LuSEE-Night, but flying drones on the moon is simply not feasible.

LuSEE-Night will also take on the challenge of exclusively measuring very faint, low-frequency radio waves.

“The lower the radio frequency you’re trying to measure, the harder the instrument is to calibrate,” said LuSEE-Night science collaboration spokesperson Anže Slosar, a physicist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. LuSEE-Night is a collaborative effort between NASA and DOE, with Brookhaven Lab leading DOE’s role in the project and DOE’s Lawrence Berkeley National Lab providing key technical support.

All these challenges encouraged NASA to support a cosmologist’s dream calibrator: a calibrator in orbit around the moon. NASA achieved this by purchasing a calibration service on the commercial market.

“The calibration service will be coming from a satellite in orbit. It is like the ultimate drone, an ideal point source,” Slosar said. “You know exactly where it is; it is very stable and it is, for all practical purposes, infinitely far away—the same as real celestial sources.”

Launching a satellite into orbit is far too expensive for calibrating ground-based telescopes.

“It’s still not an easy task, but with DOE and NASA collaborating, we made it possible,” Slosar said.

Teamwork makes the dream work

The development and launch of the calibrator, like the rest of the LuSEE-Night project, relies on NASA’s Commercial Lunar Payload Services (CLPS) initiative. Through CLPS, NASA contracts private companies to carry out low-cost missions to the moon. And now, NASA has contracted Firefly Aerospace, Inc., the company already tasked with launching LuSEE-Night, to build the new calibrator; it’s the first time NASA has asked for a calibration service from the CLPS pool of providers. 

“The calibrator will be a sophisticated radiofrequency transmitter with a downward-looking antenna,” said Paul O’Connor, a senior scientist in Brookhaven’s Instrumentation Division and LuSEE-Night Project Instrument Scientist. “It will be in lunar orbit and emit a calibration signal every time it rises above the horizon, and LuSEE-Night will pick the signal up. Because we will always know exactly where the calibrator is and its signal intensity, we will also know exactly how much space radiation is coming from each direction we are studying. This will enable us to understand the nuances of our instrument’s response, such as its sensitivity to polarization and how the incoming radiation interacts with the lunar regolith.”

This design will enable the LuSEE-Night collaboration to achieve “absolute calibration,” which Slosar says can rarely be achieved from the ground, let alone from the moon. Scientists expect the calibrator to reduce uncertainty from 20% to about 1%.

“While the basic technique is similar to that of drone calibration, this technique is ultimately much more sophisticated,” Slosar added. “Instead of blinking or beeping an intermittent noise that we would have to distinguish from other noises in space, this calibrator will give us a known signal we can easily recognize, even when it is drowned in the much brighter emission from our own galaxy.”
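The underlying idea of absolute calibration with a known point source can be sketched in a few lines: because the calibrator’s position and emitted signal are known exactly, the instrument’s response in each direction is simply the ratio of measured to expected signal. All numbers below are invented for illustration; this is not the LuSEE-Night pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# The calibrator's emitted signal strength is known exactly (arbitrary units).
expected = 1.0

# What the antenna records in four sky directions: an unknown per-direction
# gain times the expected signal, plus small measurement noise (made-up values).
true_gain = np.array([0.82, 0.95, 1.10, 0.88])
measured = true_gain * expected + rng.normal(0.0, 0.01, size=4)

# Absolute calibration: the gain estimate is simply measured / expected.
gain_estimate = measured / expected

# Later sky observations in the same directions can then be corrected.
sky_raw = np.array([3.2, 4.1, 2.7, 3.9])     # uncalibrated sky data (invented)
sky_calibrated = sky_raw / gain_estimate
print("estimated gains:", np.round(gain_estimate, 3))
print("calibrated sky: ", np.round(sky_calibrated, 2))
```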

Ready for launch

The calibrator will travel into space on the same rocket as LuSEE-Night, becoming the latest passenger among a suite of scientific instruments headed to the moon—each with its own destination and timed arrival.

“When the transfer vehicle gets close to the moon, first, the landing equipment and the European Space Agency’s Lunar Pathfinder communications satellite will detach and go into orbit. Then, the lander will shuttle the telescope to the moon’s surface. Finally, the communications module for the lander and the calibrator go into orbit, where the calibrator will remain,” Slosar said.

Five Earth days after LuSEE-Night lands on the lunar far side, Firefly Aerospace will remotely turn on the calibrator to ensure it is working. Since the lander will still be emitting interfering signals, these early data will require careful analysis. But once the first lunar sunset arrives and the lander turns off, then the true scientific mission of LuSEE-Night begins.

After 50 Earth days, the team will have gathered sufficient data from the telescope to achieve single-percent-level calibration.

“Our instruments are set up to do calibrations and normal science operations simultaneously so that we can collect data throughout the first lunar night,” Slosar said. 

By the second lunar night, the calibrator will be switched off because, in addition to demonstrating the calibration technique, launching this satellite into orbit is also an exercise in international relations.

“People long ago realized that the far side of the moon is unique space. It is one of the most radio-quiet places in existence,” Slosar said. “Therefore, international treaties were signed, stating that nobody should pollute the lunar spectrum at radio frequencies below 300 megahertz, which are the most precious for radio astronomy. But now we have this calibrator that will emit radio frequencies, so the Federal Communications Commission must request a time-limited waiver from the International Telecommunication Union. In this case, we will have one and a half lunar days, or 50 Earth days, before it must switch off.”

As the 50-Earth-day clock ticks down, scientists at Brookhaven Lab, empowered by interagency collaboration and public-private partnerships, will carry out one of the most ambitious radio cosmology experiments in history. Their work could help uncover answers to some of the universe’s biggest mysteries, such as the formation of the universe itself.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.

 

Largest-ever study of ocean DNA has created essential catalog of marine life

From biotechnology innovation to tracking climate change impacts, the KMAP Global Ocean Gene Catalog 1.0 offers diverse applications for science and society

Peer-Reviewed Publication

FRONTIERS

Image: The KMAP Ocean Gene Catalog 1.0 is the largest database of marine microbes to date.

The ocean is the world’s largest habitat, yet much of its biodiversity is still unknown. A study published in Frontiers in Science marks a significant breakthrough, reporting the largest and most comprehensive database of marine microbes to date – matched with biological function, location, and habitat type.  

“The KMAP Global Ocean Gene Catalog 1.0 is a leap toward understanding the ocean’s full diversity, containing more than 317 million gene groups from marine organisms around the world,” said lead author Elisa Laiolo of the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia. “The catalog focuses on marine microbes, which greatly impact human lives through their influence on the ocean’s health and the Earth’s climate.”  

“The catalog is freely available through the KAUST Metagenomic Analysis Platform (KMAP),” added the study’s senior author, Prof Carlos Duarte, a faculty member at KAUST. “Scientists can access the catalog remotely to investigate how different ocean ecosystems work, track the impact of pollution and global warming, and search for biotechnology applications such as new antibiotics or new ways to break down plastics – the possibilities are endless!” 

A feat of technological innovation and scientific collaboration 

Researchers have been mapping marine biodiversity for hundreds of years but have faced various challenges in creating a full atlas of ocean life. One is that most marine organisms cannot be studied in a laboratory. The advent of DNA sequencing technologies overcame this by allowing organisms to be identified directly from ocean water and sediments.

“Since each species has its own set of genes, we can identify which organisms are in an ocean sample by analyzing its genetic material,” Laiolo explained. “Two technological advances have made this possible at scale.  

“The first is the enormous increase in speed, and decrease in cost, of DNA sequencing technologies. This has allowed researchers to sequence all the genetic material in thousands of ocean samples.”  

“The second is the development of massive computational power and AI technologies, which make it possible to analyze these millions of sequences.” 

The team used KMAP to scan DNA sequences from 2,102 ocean samples taken at different depths and locations around the world. This advanced computing infrastructure identified 317.5 million gene groups, of which more than half could be classified according to organism type and gene function. By matching this information with the sample location and habitat type, the resulting catalog provides unprecedented information on which microbes live where and what they do. 
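Conceptually, the classification step matches sequences from a sample against references whose taxonomy and function are already known. A deliberately toy sketch of that idea (the marker table and reads are invented; KMAP’s actual pipeline uses far more sophisticated methods at a scale of hundreds of millions of gene groups):

```python
# Toy reference of short marker sequences -> (organism type, gene function).
# Entries are invented for illustration only.
reference = {
    "ATGGCGT": ("Cyanobacteria", "photosynthesis"),
    "TTACCGA": ("Fungi", "cell wall synthesis"),
    "GGCATTA": ("Archaea", "methane metabolism"),
}

def classify(sequence: str) -> tuple[str, str] | None:
    """Return (organism, function) if any known marker occurs in the read."""
    for marker, annotation in reference.items():
        if marker in sequence:
            return annotation
    return None  # unclassified: true of nearly half the catalog's gene groups

sample_reads = ["CCATGGCGTTA", "AAATTACCGAC", "CGCGCGCGCGC"]
for read in sample_reads:
    print(read, "->", classify(read))
```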

“This achievement reflects the critical importance of open science,” said Duarte. “Building the catalog was only possible thanks to ambitious global sailing expeditions where the samples were collected and the sharing of the samples’ DNA in the open-access European Nucleotide Archive. We are continuing these collaborative efforts by making the catalog freely available.” 

A wealth of scientific and industrial applications 

The catalog has already revealed a difference in microbial activity between the water column and the ocean floor, as well as a surprising number of fungi living in the ‘twilight’ mesopelagic zone. These and other insights will help scientists understand how microbes living in different habitats shape ecosystems, contribute to ocean health, and influence the climate.

The catalog also serves as a baseline for tracking the effect of human impacts like pollution and global warming on marine life. And it offers a wealth of genetic material that researchers can scan for novel genes that could be used for drug development, energy, and agriculture. 



Toward a global ocean genome 

The KMAP Ocean Gene Catalog 1.0 is a first step towards developing an atlas of the global ocean genome, which will document every gene from every marine species worldwide – from bacteria and fungi to plants and animals.  

“Our analysis highlights the need to continue sampling the oceans, focusing on areas that are under-studied, such as the deep sea and the ocean floor. Also, since the ocean is forever changing – both due to human activity and to natural processes – the catalog will need continual updating,” said Laiolo.  

Duarte cautions that despite its clear benefit, the future of the catalog is uncertain. A major obstacle is the status of international legislation on benefit-sharing from discoveries made in international waters.  

“While the 2023 Treaty of the High Seas offers some solutions, it may inadvertently impede research by reducing incentives for companies and governments to invest. Such uncertainty must be resolved now that we have reached the point where genetic and artificial intelligence technologies could unlock unprecedented innovation and progress in blue biotechnology,” he concluded.

The article is part of a Frontiers in Science multimedia article hub featuring an explainer as well as an editorial, viewpoint, and policy outlook from other eminent experts: Prof Enric Sala (National Geographic Society, USA), Prof Andreas Teske (University of North Carolina at Chapel Hill, USA), and Peggy Rodgers Kalas (International Ocean Policy Advisor to the Oceano Azul Foundation, and former Director of the High Seas Alliance).  

 

Rice engineers propose hybrid urban water sourcing model


Reclaimed wastewater could make supply systems more resilient


Peer-Reviewed Publication

RICE UNIVERSITY

Image: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio. (Credit: Rice University)





HOUSTON – (Jan. 16, 2024) – Houston’s water and wastewater system could be more resilient with the development of hybrid urban water supply systems that combine conventional, centralized water sources with reclaimed wastewater, according to a study by Rice University engineers published in Nature Water.

“Such a system will save energy and help reduce the use of freshwater, a commodity that is becoming critically important around the world,” said Qilin Li, professor of civil and environmental engineering and co-director of the Nanosystems Engineering Research Center for Nanotechnology Enabled Water Treatment (NEWT) at Rice. “Our research shows that such a system is also more resilient against disruptive events such as hurricanes and flooding. It exhibits lower severity, range of impact and duration of substandard performance when disruption happens.”

Leonardo Dueñas-Osorio, a professor of civil and environmental engineering at Rice and a co-author on the study, said that hybrid water supply systems are more resilient than conventional centralized systems.

“Using Houston’s municipal water system as a case study, we explore the impact of various disruptions including pump station failures, pipe leakage and source water contamination,” Dueñas-Osorio said.

Li and her collaborators have identified vulnerable components in Houston’s existing water and wastewater system and proposed mitigation strategies. The urgency is compounded by the city’s aging infrastructure.

Nationally, the typical age of failing water mains is about 50 years, while the average age of the 1.6 million miles of water and sewer pipes in the U.S. is 45 years, according to the study. Some six billion gallons of treated water, roughly 15% of the U.S. daily public water supply, is lost each day from leaky pipes.
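Those two figures together imply the total size of the daily supply; a quick arithmetic check using only the numbers quoted above:

```python
# Arithmetic check on the figures quoted in the study.
daily_loss_gallons = 6e9       # ~6 billion gallons lost per day
loss_fraction = 0.15           # ~15% of the daily public supply

implied_daily_supply = daily_loss_gallons / loss_fraction
annual_loss = daily_loss_gallons * 365

print(f"Implied U.S. daily public supply: {implied_daily_supply:.1e} gallons")  # ~4.0e10
print(f"Treated water lost per year:      {annual_loss:.1e} gallons")           # ~2.2e12
```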

In addition, large cities around the world face unprecedented challenges as global climate change, population growth and continuing urbanization contribute to rapid increases in water demand, triggering water access issues and a growing financial burden due to water systems’ maintenance and upgrade needs.

On a small scale, cities around the world, including El Paso, have already enabled reclamation and reuse of municipal wastewater for both drinking and such non-potable uses as irrigation.

“Decreasing dependence on already stressed surface and groundwater resources has become increasingly important,” Li said. “An important challenge is to figure out how to best implement wastewater reclamation activities. The goal is to enhance sustainability and resiliency of big city water infrastructure. Our recent research shows the benefits of decentralized wastewater treatment and reuse.”

The study’s co-authors include Lauren Stadler, assistant professor of civil and environmental engineering at Rice; Rice doctoral alum Xiangnan Zhou; and Lu Liu, assistant professor of civil, construction and environmental engineering at Iowa State University.

The research was supported by the National Science Foundation (1707117, 1449500).

-30-

This release was authored by Patrick Kurp and can be found online at news.rice.edu.

Follow Rice News and Media Relations via Twitter @RiceUNews.

Publication details:

Hybrid wastewater treatment and reuse enhances urban water system resilience to disruptive incidents | Nature Water | DOI: 10.1038/s44221-023-00166-6

Authors: Lu Liu, Xiangnan Zhou, Leonardo Dueñas-Osorio, Lauren Stadler and Qilin Li

https://www.nature.com/articles/s44221-023-00166-6#citeas

Image downloads:

https://news-network.rice.edu/news/files/2024/01/IMG_0500-ab60ea3de300407f.jpg
CAPTION: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio (Credit: Rice University)

https://news-network.rice.edu/news/files/2024/01/IMG_0499-4a97dabed6e6558e.jpg
CAPTION: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio (Credit: Rice University)

Links:

NEWT: https://newtcenter.org/
Department of Civil and Environmental Engineering: https://cee.rice.edu/
George R. Brown School of Engineering: https://engineering.rice.edu/
Dueñas-Osorio Research Group: https://duenas-osorio.rice.edu/
Li Research Group: https://qilinli.rice.edu/

About Rice:

Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of architecture, business, continuing studies, engineering, humanities, music, natural sciences and social sciences and is home to the Baker Institute for Public Policy. With 4,574 undergraduates and 3,982 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction, No. 2 for best-run colleges and No. 12 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.

 BRAZIL

Brumadinho dam collapse: The danger emerged after the decommissioning



Peer-Reviewed Publication

ETH ZURICH

Image: The mudslide came without warning and destroyed the mine site, neighbouring settlements and a railway bridge, killing 270 people. (Photograph: Eric Marmor / IDF Spokesperson, CC BY-SA 3.0, Wikimedia Commons)




The disaster near the small town of Brumadinho in southeastern Brazil occurred shortly after midday: on 25 January 2019, at a nearby iron ore mine, the tailings dam – a storage area for the sludgy, fine-grained residues from ore processing, or “tailings” – collapsed. A huge mudslide of around 10 million cubic metres of liquefied tailings flooded the site of the mine, destroying neighbouring settlements and taking a railway bridge with it. At least 270 people were killed. The ecosystem of the Paraopeba River downstream of the mine was ruined. Although the dam had a monitoring system, no one had foreseen the disaster.

The Brumadinho dam collapse resulted in several lawsuits against the Vale mining company and the TÜV Süd inspection organisation, which shortly before the accident had certified that the dam was sufficiently stable. Vale was ordered to pay the equivalent of around 6 billion euros in damages. An investigation committee concluded that the accident was caused by slowly accumulating microscopic displacements (known as creep) in the deposited tailings layers, but did not provide the exact physical mechanism supporting this hypothesis. In particular, uncertainty remained as to why the dam broke in 2019 specifically – three years after the pond was last loaded with new tailings – and why no significant displacements had been detected prior to the collapse.

Physical mechanism explained

A study by Professor Alexander Puzrin, Head of the Institute of Geotechnical Engineering at ETH Zurich and an expert in landslide modelling, has now shed light on the Brumadinho disaster. The paper appears in the journal Communications Earth & Environment. The scientists used numerical and analytical models to investigate the causes of the dam failure, and they have identified a physical mechanism that explains the mining accident.

The tailings pond was built in 1976. Over the following years, the pond’s earth dam was raised several times by a few metres, as is customary in ore mining, to create additional space for the storage of processing residues. The dam’s steps were placed with an offset on top of each other, much like the steps of a staircase (upstream principle). In the end, the dam consisted of ten steps and was 86 metres tall. When the structure failed in January 2019, the initial rupture occurred at the second dam level. As a result, all ten steps of the earth dam collapsed and, together with the liquefied accumulated tailings, surged down into the valley as a mudslide.

Creep deformation after decommissioning

The work done by Professor Puzrin’s team now shows how this could happen. According to the new findings, some initial slip surfaces had already appeared in the dammed tailings at the height of the second dam step during the dam’s construction. With the progress of construction, these slip surfaces increased in length but remained too short to cause a collapse. However, after the tailings dam was decommissioned in 2016 – suggests the ETH team’s modelling – these surfaces continued to expand horizontally and eventually reached a critical length. As a result, the layers of tailings began to move, causing the dam to burst under their weight and triggering the deadly mudslide.

According to the model, what caused the slip surface to grow is known as creep deformation. These deformations in the fine-grained, brittle tailings are tiny, slowly accumulating displacements caused by uneven pressure distribution between the grains in the overlying deposits. “Other triggers such as rainfall and drilling can accelerate slip surface growth,” Puzrin says, “but our model shows that creep deformation alone is sufficient for the slip surface to reach the critical extent that can trigger a dam failure.”
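To make the mechanism concrete, here is a minimal numerical sketch of the qualitative picture: a slip surface that keeps lengthening slowly after loading stops, until it reaches a critical length and triggers failure. The growth law and every parameter below are illustrative assumptions, not the ETH team’s model:

```python
import math

# Illustrative creep model: slip-surface length grows with the logarithm of
# time under constant load. All parameters are invented for demonstration.
L0 = 40.0          # slip-surface length at decommissioning, metres (assumed)
GROWTH = 12.0      # growth per log-unit of time, metres (assumed)
L_CRITICAL = 55.0  # length at which failure is triggered, metres (assumed)

def slip_length(t_years: float) -> float:
    """Slip-surface length t years after decommissioning (toy growth law)."""
    return L0 + GROWTH * math.log1p(t_years)

for t in [0.5, 1.0, 2.0, 3.0]:
    L = slip_length(t)
    status = "FAILURE" if L >= L_CRITICAL else "stable"
    print(f"t = {t:.1f} yr: L = {L:.1f} m ({status})")
# With these made-up numbers, the critical length is reached about three
# years after loading stopped, without any new external load.
```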

The findings are worrying in two respects: the slip surface that caused the disaster apparently grew at a time when the pond was no longer being loaded with new tailings – in other words, without any additional external load; and the growth of the slip surface did not lead to any significant external deformation of the dam that the monitoring system could have recognised as alarming.

ETH model enables risk analysis

Tailings dams for processing residues from the mining of iron ore and other mineral rocks are used in large numbers worldwide. In each year since 2000, five to six cases have been recorded in which dams have failed or been damaged for various reasons. After the Brumadinho disaster and similar accidents, Brazil decommissioned tailings ponds with dams based on the upstream principle. However, the ETH study now shows that simply no longer loading a pond with new tailings may not avert the danger.

Dam breaches like the one in Brumadinho cannot yet be predicted using conventional monitoring systems. The ETH study opens up new options: “Our model can carry out a risk analysis for existing dams and predict the likelihood of a dam failure,” Puzrin says. If a high risk is identified, there are various plausible courses of action: the risk can be reduced by pumping water out of the tailings pond through boreholes, or the tailings dam can be dismantled. In urgent cases, endangered villages can be temporarily evacuated to protect the inhabitants until the danger has been averted.

Helping to make earth dams safer

The ETH study’s findings are relevant to all tailings dams for processing residues from ore mining. This is because whenever the residues consist of fine-grained, brittle material, in the worst case slip surfaces can form over which the deposited material can slide and damage the dam.

The situation is not directly comparable with reservoirs where water is retained by an earth dam. However, the new findings can also contribute to safety here, as Puzrin points out: “Our findings provide an indication of how to further improve the safety of earth dams in the event of an earthquake, which can generate an initial slip surface. In this respect, our work helps to make dams safer in general.”

Image: During construction, short slip surfaces formed in the deposits of the tailings dam (above). These grew after decommissioning due to creep deformation (centre) and reached the critical length (bottom) that caused the collapse of the dam. (Graphic: IGT/ETH Zurich)