Monday, May 24, 2021

 

Researchers identify the causes of the extreme drought that affected the Pantanal

The study shows that the 2019-20 drought resulted from a natural meteorological phenomenon similar to the one that caused the 2014-16 critical water shortage in São Paulo state, Southeast Brazil.

FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Research News


The extreme drought suffered by the Pantanal in 2019-20, considered the worst in the last 50 years, was caused by natural climate conditions similar to those underlying the 2014-16 water crisis in São Paulo state. The Pantanal is one of the world’s largest wetlands. The Brazilian portion is located in the Center-West region, mainly Mato Grosso do Sul state.

The 2019-20 extreme drought was studied by researchers affiliated with the Natural Disaster Surveillance and Early Warning Center (CEMADEN), the National Space Research Institute (INPE) and São Paulo State University (UNESP) as part of a project supported by FAPESP under the aegis of its Research Program on Global Climate Change (RPGCC).

The results of the study are published in the journal Frontiers in Water.

“The recent drought in the Pantanal was caused by a meteorological phenomenon we call atmospheric blocking. A high-pressure area prevented the formation of rainclouds throughout the central-western portion of South America. Temperatures were very high and relative humidity very low,” José Marengo, a researcher at CEMADEN and principal investigator for the study, told Agência FAPESP.

“Lack of rain combined with high temperatures and very low humidity led to a heightened risk of fire, which extended to agricultural areas as well as natural parts of the biome.”

Deliberate burning of vegetation to clear land for cattle ranching contributed to the spread of wildfires throughout the region, and these were harder to control owing to the long period of drought. “Fires caused on one hand by warmer air and lack of rain in the Pantanal, and on the other by the burning of areas to clear the vegetation for cattle to graze, resulted in environmental disaster,” Marengo said.

Sources of observational data

To investigate the hydroclimatic causes of drought in the Pantanal, the researchers used an array of sources of observational hydrological data (on the level of the Paraguay River) as well as data on rainfall and hydroclimatic teleconnections (links between hydrological events and atmospheric circulation patterns causing weather phenomena). They also used information on land use and remote sensing data to characterize water stress and drought in the Pantanal.

Based on an analysis of all these datasets they were able to provide a clear description of the interannual variability of rainfall, river streamflow, and drought-related factors, concluding that the drought was caused by a complex combination of hydroclimatic teleconnections.
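
To make the kind of analysis described above concrete, here is a minimal sketch of one of its building blocks: computing monthly rainfall anomalies and correlating them with a circulation index. The data below are synthetic stand-ins, not the study's datasets; the study itself drew on observed rainfall, Paraguay River levels, and remote sensing products.

```python
import numpy as np
import pandas as pd

# Minimal sketch of one building block of a teleconnection analysis:
# compute monthly rainfall anomalies and correlate them with a
# circulation index. Synthetic stand-in data, not the study's datasets.

rng = np.random.default_rng(0)
months = pd.date_range("1982-01", "2020-12", freq="MS")
index = rng.normal(size=len(months))  # stand-in circulation/blocking index
rain = (120 + 40 * np.sin(2 * np.pi * months.month / 12)  # seasonal cycle
        - 15 * index                                      # index suppresses rain
        + rng.normal(0, 10, len(months)))                 # weather noise
df = pd.DataFrame({"date": months, "rain_mm": rain, "circ_index": index})

# Anomaly: each month's departure from its long-term monthly mean.
df["anom"] = df["rain_mm"] - df.groupby(df["date"].dt.month)["rain_mm"].transform("mean")

# A strongly negative correlation means less rain when the index is high,
# as with the blocking high described above.
r = np.corrcoef(df["anom"], df["circ_index"])[0, 1]
print(f"rainfall-index correlation: r = {r:.2f}")
```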

“The lack of rain during the summer in 2019 and 2020 in the Pantanal was due to a decrease in the flow of warm humid air from the Amazon,” Marengo said.

Part of the rain that falls on the Pantanal is brought by winds blowing from the North Atlantic to the Amazon, and from there to the Pantanal in Brazil’s Center-West region. The humidity coming down from the Amazon and cold fronts coming up from the South were prevented from reaching the Pantanal by a high-pressure bubble. As a result, masses of warmer and drier air contributed to the scarcity of rain during the summer, at the peak of the monsoon season. Monsoons are characterized by seasonal changes in atmospheric circulation and precipitation in tropical and subtropical coastal regions due to asymmetric heating of land and sea.

The result was a prolonged extreme drought throughout the Pantanal, with severely adverse effects on the biome. “This phenomenon is natural and occurred in a similar manner in São Paulo during the 2014-16 drought,” Marengo said.

Amplified impact

According to Marengo, it is not yet possible to foresee whether the Pantanal will face more severe droughts in the years ahead. The problem can only be avoided if there is the right amount of rain at the right time. “It’s no use getting rain now, in March, at the end of the rainy season, and then nothing between December this year and February 2022, for example,” he said. “This compromises the rainy season and increases the risk of more wildfires in the Pantanal.”

Drought in the Pantanal cannot be blamed solely on global climate change because it is a natural event and climate change is a long-term process. The recent drought differed from those seen in the 1950s and 1960s when the planet was cooler. “What’s happening now is that these natural droughts are suffering the effects of climate instability, and the effects are worse because back then there was a lot less human occupancy in the region. Its population is now more vulnerable to the impact of drought,” Marengo said.

The article “Extreme drought in the Brazilian Pantanal in 2019-2020: characterization, causes, and impacts” (doi: 10.3389/frwa.2021.639204) by Jose A. Marengo, Ana P. Cunha, Luz Adriana Cuartas, Karinne R. Deusdará Leal, Elisangela Broedel, Marcelo E. Seluchi, Camila Miranda Michelin, Cheila Flávia De Praga Baião, Eleazar Chuchón Ângulo, Elton K. Almeida, Marcos L. Kazmierczak, Nelson Pedro António Mateus, Rodrigo C. Silva and Fabiani Bender is at: www.frontiersin.org/articles/10.3389/frwa.2021.639204/full.

Pre-Columbus climate change may have caused Amazon population decline

Indigenous Amazonian populations may have been in decline prior to the 'Great Dying'

UNIVERSITY OF READING

Research News

Climate change impacts felt in the Amazon rainforest prior to the arrival of European settlers after 1492 may have meant populations of indigenous people were already in decline before the 'Great Dying', new research has suggested.

Scientists studying fossil pollen and charcoal data from across the Amazon say it appears to show that human management of the rainforest may have peaked around 1200 AD, before some sites were abandoned, allowing reforestation of these areas.

The new research, involving University of Reading scientists and published in the journal Science, challenges the prior assumption that the largest population decrease in the Americas - known as the Great Dying - did not start until after European settlers carried new diseases to the continent.

Professor Frank Mayle, a tropical palaeoecology researcher at the University of Reading, and co-author of the study, said: "Our analysis raises the possibility that climate change caused the decline of some Amazonian societies several centuries before the Europeans arrived, especially the more complex societies which may have been too rigid to adapt.

"Although the introduction of European diseases, such as small pox, is still likely to have been the reason for the major population decline subsequently seen in the Americas, the research is a warning of the threat climate change poses to society. Knowledge of how different types of ancient society responded to past climate change may provide valuable clues to understanding the fate of today's diverse societies under 21st century global warming."

The research was led by Professor Mark Bush at Florida Tech, and included a team of international collaborators who are investigating how pre- and post-European people modified and managed Amazonian forests.

Analysis of fossilised pollen and charcoal revealed that many previously deforested lands have been recovering for over 800 years, rather than the 400 years previously supposed, indicating a pre-European population decline. The research team is now looking to assess the drivers and mechanisms of this population drop-off.

Finding signatures of initial forest regrowth following ancient human disturbance is important to ongoing discussions about the impact of Pre-Columbian people on Amazon rainforests and the extent to which modern forests exhibit legacies of past human activity.

This research also has implications for atmospheric and biosphere science. It was previously believed that the indigenous population collapse in Amazonia following European Contact, and subsequent reforestation, led to the sequestration of so much carbon dioxide that global atmospheric CO2 levels decreased markedly, an event known as the 'Orbis Spike'. Yet the team found no evidence that the Orbis Spike was caused by Amazonian reforestation.

###

The research was funded by grants from the National Geographic Society and the National Science Foundation in the United States and the European Research Council and the Natural Environment Research Council in the United Kingdom.

Decolonizing ecology? How to adopt practices that make science more equitable

UNIVERSITY OF CAPE TOWN

Research News

IMAGE: FIVE INTERVENTIONS TO BUILD A MORE ANTI-OPPRESSIVE AND DECOLONIAL ECOLOGY.

CREDIT: KEREN COOPER

Knowledge systems outside of those sanctioned by Western universities have often been marginalised or simply not engaged with in many science disciplines, yet there are multiple examples of Western scientists claiming discoveries for knowledge that resident experts already knew and shared. This reflects not a lack of knowledge but the fact that, for many scientists raised in Western society, the lack of education about histories of systemic oppression is by design. Western scientific knowledge has also been used to justify social and environmental control, including dispossessing colonised people of their land and ways of life and discounting existing knowledge systems.

But how can those in the ecological discipline slowly begin to practise ecology in a more creative, reflective, equitable and inclusive way? According to a new paper published in the journal Nature Ecology & Evolution, there are five interventions to build a more anti-oppressive and decolonial ecology:

  • Decolonise your mind to include multiple ways of knowing and communicating science;

  • Know your histories to acknowledge the role research has played in enabling colonial and ongoing violence against peoples and nature, and begin processes of restorative justice;

  • Decolonise access by going beyond Open Access journals and data repositories to address issues of data sovereignty and the power dynamics of research ownership;

  • Decolonise expertise by amplifying diverse expertise in ecologies from local experts and giving due credit and weight to that knowledge; and

  • Practise ethical ecology in inclusive teams by establishing diverse and inclusive research teams that actively deconstruct biases so all team members are empowered participants in developing new knowledge.

"These actions are not offered as a checklist capable of undoing unjust systems worldwide, nor to overshadow long histories of place-based anti-colonial and anti-racist struggle, but as connection points to action for practising ecologists," said Dr Chris Trisos, from the Africa Climate and Development Initiative based at the University of Cape Town and co-author of the paper Decoloniality and anti-oppressive practices for a more ethical ecology.

"Because settler-colonial processes have increased vulnerability of people and other species by displacing them into unfamiliar or lower quality landscapes, the concept of ecological vulnerability to environmental change intersects with environmental justice," he added.

Co-author Dr Jess Auerbach from the Department of Anthropology at North-West University shared that access to scholarly literature and data resources is a global issue. Data and research papers are often locked behind a paywall or housed in servers and museums in the Global North even when the data collected was from the Global South. "This makes it inaccessible to scholars from under-resourced institutions who are often compelled to use pirate websites to read scientific publications. Publishing only open access resources is part of the solution but the issues run much deeper and consideration must be given to where data repositories are held, who holds the right to this data and what is needed to access it," she said.

One area the researchers highlight is the use of English as the dominant form of knowledge communication in science, which can lead to publication bias against non-native English-speaking scientists. When one reads, writes and thinks in English, it is easy to forget that for the majority of people ecological knowledge is produced and tested in other tongues. It is ironic that in many ecology departments, knowing Latin names of species is met with admiration, whereas speaking the living languages of sites of data origination is a 'nice-to-have' skill.

Ecological scholarship must thus develop methods to include multiple languages in evidence synthesis and could require that scholars gain fluency in relevant languages as an essential entry point for understanding rich bodies of local knowledge on ecosystems and cultivating a more inclusive way of knowing and studying ecology. More inclusive teams are also needed to lead these projects and actively deconstruct biases. Diverse teams that include and amplify the voices of indigenous communities result in more innovative and effective problem solving and richer datasets.


CAPTION: Map showing the minimum estimate for each country of the number of bird species for which the Latin binomial name is based on a European person. Hundreds of bird species have been named after European surnames, with most of these species occurring outside Europe in formerly colonized countries.

CREDIT: Keren Cooper

Co-author Associate Professor Madhu Katti from North Carolina State University in the USA shared an example: "The Amazon Conservation Team works with Indigenous communities in several South American countries in participatory projects to promote self-governance and biodiversity conservation. They have developed a methodology of collaborative cultural mapping by providing technology such as mobile phones and apps to Indigenous communities. The Kogi people, among the last surviving civilizations from the pre-Columbian period, started using a mobile phone app to create geo-referenced maps of their land within the framework of their own cultural knowledge, resulting in a richer dataset than a parachuting Western ecologist or conservationist might be able to gather.

"Analysis of change in social-ecological systems must consider the impacts of colonial histories and offer solutions in a decolonial framework. More opportunities for historically marginalized groups to set research agendas is an important way of redressing ongoing power imbalances," he added.

###

Access the paper: Decoloniality and anti-oppressive practices for a more ethical ecology

To unpack colonial influence on ecology, researchers propose five strategies

NORTH CAROLINA STATE UNIVERSITY

Research News

Ecology, the field of biology devoted to the study of organisms and their natural environments, needs to account for the historical legacy of colonialism that has shaped people and the natural world, researchers argued in a new perspective in the journal Nature Ecology & Evolution.

To make ecology more inclusive of the world's diverse people and cultures living in diverse ecosystems, researchers from University of Cape Town, North West University in South Africa and North Carolina State University proposed five strategies to untangle the impacts of colonialism on research and thinking in the field today.

"There are significant biases in our understanding of ecology and ecosystems because of this colonial framework of thinking," said perspective co-author Madhusudan Katti, associate professor for leadership in public science, and forestry and environmental resources at NC State. "We are challenging ecologists to understand and address the legacies of colonialism, and to start engaging in an active process of 'decolonizing science.'"

The researchers described emerging research documenting impacts of European colonialism - the migration, settlement and exploitation of the Americas, Africa, Asia and other parts of the world by people from Europe - on people and the natural world, and on ecology.

Katti said examples of how ecological research reflects the impact of colonialism include patterns of vegetation in cities that reflect patterns of racial segregation and discrimination, or in the use of names of prominent European scientists or their patrons in the scientific names for bird species and other organisms rather than names used by Indigenous people.

"Indigenous names are often based on observations of behavior or ecology, or represent cultural significance of species, but that traditional ecological knowledge is lost when names are changed," Katti said. "This is bad for both the colonized people and the science of ecology itself."

The researchers challenged the field to address colonial legacies using five strategies:

* Decolonize the mind. Researchers said this should be done by understanding other knowledge systems from colonized cultures.

* Know your history, or understand the history of colonialism in influencing Western ecology, and its role in promoting oppression of other people and in shaping the environment. "Ecology is about organisms living in their ecosystems - that's what we study," Katti said. "If you want to study ecology, that includes people. To understand how people shape ecosystems, we have to understand how political power works. Western ecologists have to acknowledge how science has been aligned with colonial power, and how it has been used to perpetuate systems of oppression that continue to this day."

* Decolonize information. They suggest this should be done by increasing access to academic information, and understanding power dynamics in the way data is owned and disseminated.

* Decolonize expertise by recognizing more diverse voices in the field of ecology.

* Establish diverse and inclusive teams to help overcome biases in future research. "The world is enriched by diverse perspectives," Katti said. "We need scientific teams where everybody is equally empowered to set a robust research agenda, and ensure more robust testing of these ideas. If the person with a different hypothesis is not in the room, then you're never challenged to test and prove their hypothesis, and you're subject to your own bias."

###

The perspective, "Decoloniality and anti-oppressive practices for a more ethical ecology," was published online in Nature Ecology & Evolution on May 24, 2021. It was authored by Katti, Christopher H. Trisos and Jess Auerbach. Trisos acknowledges funding from the National Socio-Environmental Synthesis Center under funding received from the National Science Foundation (DBI-1639145) and the FLAIR fellowship programme - a partnership between the African Academy of Sciences and the Royal Society, funded by the UK Government's Global Challenges Research Fund.

-oleniacz-

Note to editors: The abstract follows.

"Decoloniality and anti-oppressive practices for a more ethical ecology"

Authors: Christopher H. Trisos, Jess Auerbach and Madhusudan Katti.

Published online May 24 in Nature Ecology and Evolution.

Abstract: Ecological research and practice are crucial to understanding and guiding more positive relationships between people and ecosystems. However, ecology as a discipline and the diversity of those who call themselves ecologists have also been shaped and held back by often exclusionary Western approaches to knowing and doing ecology. To overcome these historical constraints and to make ecology inclusive of the diverse peoples inhabiting Earth's varied ecosystems, ecologists must expand their knowledge, both in theory and practice, to incorporate varied perspectives, approaches and interpretations from, with and within the natural environment and across global systems. We outline five shifts that could help to transform academic ecological practice: decolonize your mind; know your histories; decolonize access; decolonize expertise; and practise ethical ecology in inclusive teams. We challenge the discipline to become more inclusive, creative and ethical at a moment when the perils of entrenched thinking have never been clearer.

Evacuating under dire wildfire scenarios

UNIVERSITY OF UTAH

Research News


In 2018, the Camp Fire ripped through the town of Paradise, California at an unprecedented rate. Officials had prepared an evacuation plan that required 3 hours to get residents to safety. The fire, bigger and faster than ever before, spread to the community in only 90 minutes.

As climate change intensifies, wildfires in the West are behaving in ways that were unimaginable in the past--and the common disaster response approaches are woefully unprepared for this new reality. In a recent study, a team of researchers led by the University of Utah proposed a framework for simulating dire scenarios, which the authors define as scenarios in which the time available to evacuate an area is less than the time required. The paper, published on April 21, 2021 in the journal Natural Hazards Review, found that minimizing losses during dire scenarios involves elements that are not represented in current simulation models, among them improvisation and altruism.

"The world is dealing with situations that exceed our worst case scenarios," said lead author Thomas Cova, professor of geography at the U. "Basically we're calling for planning for the unprecedented, which is a tough thing to do."

Most emergency officials in fire-prone regions develop evacuation plans based on the assumptions that wildfires and residents will behave predictably based on past events. However, recent devastating wildfires in Oregon, California and other western states have shown that those assumptions may no longer hold.

"Wildfires are really becoming more unpredictable due to climate change. And from a psychological perspective, we have people in the same area being evacuated multiple times in the past 10 years. So, when evacuation orders come, people think, 'Well, nothing happened the last few times. I'm staying,'" said Frank Drews, professor of psychology at the U and co-author of the study. "Given the reality of climate change, it's important to critically assess where we are and say, 'Maybe we can't count on certain assumptions like we did in the past.'"



CAPTION: Dire evacuation scenario category based on a score.

CREDIT: Cova et al. (2021) Nat Hazards Rev

How to predict the unprecedented

The framework allows planners to create a dire wildfire scenario--one in which the lead time, defined as the time a community has to respond before the fire reaches it, is less than the time required to evacuate. The authors developed a scoring system that categorizes each scenario as routine, dire, very dire or extremely dire based on many different factors.

One big factor affecting the direness is the ignition location, as a fire that starts closer to a community offers less time than one farther away. A second major factor is the wildfire detection time. During the day, plumes of smoke can cue a quick response, but if a fire starts at night when everyone is asleep, it could take longer to get people moving. Officials may delay their decisions to avoid disrupting the community unnecessarily, but a last-minute evacuation order can cause traffic jams or put a strain on low-mobility households.

Alert system technologies can create dire circumstances if residents do not receive the warning in time due to poor cellphone coverage or low subscription rates to reverse 911 warning systems. If the community has had many near misses with wildfire, the public's response could be to adopt a wait-and-see approach before leaving their homes.

Using a dire scenario dashboard, the user assigns various factors an impediment level--low, minor or major--that can change at any point to lessen or increase a situation's direness.
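
As a rough illustration of how such a dashboard might combine factors, the sketch below assigns each impediment a point value and maps the total onto the paper's four categories. The factor names, point values, and thresholds are illustrative assumptions, not the study's actual scoring rules.

```python
# Hypothetical sketch of a dire-scenario score, loosely following the
# dashboard described above. The factor names, point values, and category
# thresholds are illustrative assumptions, not the paper's actual rules.

IMPEDIMENT_POINTS = {"low": 0, "minor": 1, "major": 2}

def direness_score(impediments):
    """Sum impediment points across evacuation factors."""
    return sum(IMPEDIMENT_POINTS[level] for level in impediments.values())

def categorize(score):
    """Map a score onto the study's four categories (thresholds assumed)."""
    if score <= 2:
        return "routine"
    if score <= 5:
        return "dire"
    if score <= 8:
        return "very dire"
    return "extremely dire"

scenario = {
    "ignition_proximity": "major",    # fire started close to town
    "detection_time": "minor",        # overnight ignition, delayed detection
    "warning_dissemination": "major", # poor cell coverage, low opt-in rates
    "departure_delay": "minor",       # wait-and-see behavior after near misses
    "road_capacity": "low",           # ample egress routes
}

score = direness_score(scenario)
print(score, categorize(score))  # 6 -> very dire
```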

"Usually when we run computer simulations, nothing ever goes wrong. But in the real world, things can get much worse half-way through an evacuation," said Cova. "So, what happens when you don't have enough time? The objective switches from getting everyone out to instead minimizing casualties. It's dark."

"More people began working remotely from home during the pandemic, which then led to them moving out of large cities into rural areas," explained assistant professor Dapeng Li of the South Dakota State University Department of Geography and Geospatial Sciences, a co-author and U alumnus who helped develop the computer simulations. "These rural communities typically have fewer resources and face challenges in rapidly evacuating a larger number of residents in this type of emergency situation."


CAPTION: Anatomy of a dire scenario due to a sudden increase in fire spread rate.

CREDIT: Cova et al. (2021) Nat Hazards Rev

Reducing dire scenarios

Simulating dire wildfire scenarios can improve planning and the outcomes in cases where everything goes wrong. For example, creating fire shelters and safety zones inside of a community can protect residents who can't get out, while reducing traffic congestion for others who can evacuate. During the 2018 Camp Fire, people improvised temporary refuges in parking lots and community buildings. Modeling could help city planners construct permanent safety areas ahead of time.

A common human response during wildfires is improvisation and creative thinking, which is difficult to model but can be literally lifesaving. For example, during the 2020 Creek Fire in California, a nearby military base sent a helicopter to rescue trapped campers. Another crucial component is individuals helping others, such as people giving others rides or warning neighbors who missed the official alert. During the Camp Fire, Joe Kennedy used his bulldozer to singlehandedly clear abandoned cars that were blocking traffic.

"It is very common for families and neighbors to assume a first responder role and help each other during disasters," said Laura Siebeneck, associate professor of emergency management and disaster science at the University of North Texas and co-author of the study. "Many times, individuals and groups come together, cooperate, and improvise solutions as needed. Though it is difficult to capture improvisation and altruism in the modeling environment, better understanding human behavior during dire events can potentially lead to better protective actions and preparedness to dire wildfire events."

Studying and modeling dire scenarios is necessary to improve the outcomes of unprecedented changes in fire occurrence and behavior. This study is the first attempt to develop a simulation framework for these scenarios, and more research is needed to incorporate the unpredictable elements that create increasingly catastrophic wildfires.

###

MOONSHINE

Corn ethanol reduces carbon footprint, greenhouse gases

DOE/ARGONNE NATIONAL LABORATORY

Research News

A study conducted by researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory reveals that the use of corn ethanol is reducing the carbon footprint of transportation fuels and cutting greenhouse gas emissions.

The study, recently published in Biofuels, Bioproducts and Biorefining, analyzes corn ethanol production in the United States from 2005 to 2019, when production more than quadrupled. Scientists assessed corn ethanol's greenhouse gas (GHG) emission intensity (sometimes known as carbon intensity, or CI) during that period and found a 23% reduction in CI.

According to Argonne scientists, corn ethanol production increased over the period, from 1.6 to 15 billion gallons (6.1 to 57 billion liters). Supportive biofuel policies -- such as the Environmental Protection Agency's Renewable Fuel Standard and California's Low-Carbon Fuel Standard -- helped generate the increase. Both of those federal and state programs evaluate the life-cycle GHG emissions of fuel production pathways to calculate the benefits of using renewable fuels.

To assess emissions, scientists use a process called life-cycle analysis, or LCA -- the standard method for comparing relative GHG emission impacts among different fuel production pathways.
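
As a back-of-envelope illustration of the accounting an LCA feeds into, the sketch below converts a fuel volume and a carbon-intensity gap into avoided emissions. The CI values and energy content used are rough, generic assumptions for illustration, not figures from the Argonne study.

```python
# Back-of-envelope sketch of the accounting an LCA feeds into: convert a
# fuel volume and a carbon-intensity (CI) gap into avoided emissions.
# All numbers are rough, generic assumptions, not the study's values.

MJ_PER_GALLON_ETHANOL = 80.5   # approximate lower heating value of ethanol
CI_GASOLINE = 93.0             # g CO2e/MJ, a typical baseline estimate
CI_CORN_ETHANOL = 60.0         # g CO2e/MJ, an assumed pathway CI

def ghg_savings_tonnes(gallons_ethanol, ci_ethanol=CI_CORN_ETHANOL):
    """GHG avoided by displacing gasoline energy with ethanol, in metric tons."""
    energy_mj = gallons_ethanol * MJ_PER_GALLON_ETHANOL
    saved_g = energy_mj * (CI_GASOLINE - ci_ethanol)
    return saved_g / 1e6  # grams -> metric tons

# 15 billion gallons at the assumed 33 g CO2e/MJ gap:
print(f"{ghg_savings_tonnes(15e9):,.0f} t CO2e avoided per year")
```

At these assumed values, 15 billion gallons works out to roughly 40 million metric tons avoided in a year, broadly consistent in scale with the cumulative figure Wang cites below.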

"Since the late 1990s, LCA studies have demonstrated the GHG emission reduction benefits of corn ethanol as a gasoline alternative," noted Argonne senior scientist Michael Wang, who leads the Systems Assessment Center in the laboratory's Energy Systems division and is one of the study's principal investigators. "This new study shows the continuous downtrend of corn ethanol GHG emissions."

"The corn ethanol production pathway -- both in terms of corn farming and biorefineries -- has evolved greatly since 2005," observed Argonne analyst Uisung Lee, first author of the study. Lee pointed out that the study relied on comprehensive statistics of corn farming from the U.S. Department of Agriculture and of corn ethanol production from industry benchmark data.

Hoyoung Kwon, a coauthor, stated that U.S. corn grain yields improved by 15%, reaching 168 bushels per acre, while fertilizer inputs remained constant. The result was a lower fertilizer intensity per bushel of corn harvested: reductions of 7% in nitrogen use and 18% in potash use.

May Wu, another co-author, added that ethanol yields increased 6.5%, with a 24% reduction in ethanol plant energy use.

"With the increased total volume and the reduced CI values of corn ethanol between 2005 and 2019, corn ethanol has resulted in a total GHG reduction of more than 500 million tons between 2005 and 2019," Wang emphasized. "For the United States, biofuels like corn ethanol can play a critical role in reducing our carbon footprint."

The Argonne team used the laboratory's GREET® model (Greenhouse gases, Regulated Emissions, and Energy use in Technologies) for this study, a one-of-a-kind LCA analytical tool that simulates the energy use and emissions output of various vehicle and fuel combinations. Government, industry, and other researchers worldwide use GREET for LCA modeling of corn ethanol and other biofuels.

###

The work is funded by DOE's Vehicle Technologies Office in the Office of Energy Efficiency and Renewable Energy.

The Office of Energy Efficiency and Renewable Energy supports early-stage research and development of energy efficiency and renewable energy technologies to strengthen U.S. economic growth, energy security, and environmental quality.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science.

The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

New study shines light on hazards of Earth's largest volcano

Researchers find that a large earthquake could set off eruption of Hawaii's Mauna Loa volcano

UNIVERSITY OF MIAMI ROSENSTIEL SCHOOL OF MARINE & ATMOSPHERIC SCIENCE

Research News


MIAMI - Scientists from the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science analyzed ground movements measured by Interferometric Synthetic Aperture Radar (InSAR) satellite data and GPS stations to precisely model where magma intruded and how magma influx changed over time, as well as where faults under the flanks moved without generating significant earthquakes. The GPS network is operated by the U.S. Geological Survey's Hawaii Volcano Observatory.

"An earthquake of magnitude-6 or greater would relieve the stress imparted by the influx of magma along a sub-horizontal fault under the western flank of the volcano," said Bhuvan Varugu, a Ph.D. candidate at the UM Rosenstiel School and lead author of the study. "This earthquake could trigger an eruption."

The researchers found that during 2014-2020 a total of 0.11 cubic kilometers of new magma intruded into a dike-like magma body located under and south of the summit caldera, with its upper edge at a depth of 2.5-3 kilometers beneath the summit. They were able to determine that in 2015 the magma began expanding southward, where the topographic elevation is lower and the magma had less work to do against the topographic pressure. After the magma flux waned in 2017, the inflation center returned to its previous 2014-2015 horizontal position. Such changes in a magma body have never been observed before.
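
For a sense of how a volume change at depth translates into the ground movements that InSAR and GPS record, the toy sketch below uses the classic Mogi point-source approximation. This is a standard textbook model, not the dike geometry or inversion the authors used, so the uplift it predicts from the reported volume and depth is only an order-of-magnitude illustration.

```python
import numpy as np

# Toy illustration of how a volume change at depth maps to surface uplift,
# using the classic Mogi point-source approximation. This is a textbook
# model, not the dike geometry or inversion used in the study; treating
# the full 2014-2020 volume as a single point source overstates uplift,
# so the output is only an order-of-magnitude illustration.

def mogi_uplift(r_km, depth_km, dV_km3, nu=0.25):
    """Vertical surface displacement (m) at horizontal distance r_km from
    a point pressure source with volume change dV_km3, Poisson ratio nu."""
    r = r_km * 1e3
    d = depth_km * 1e3
    dV = dV_km3 * 1e9
    return (1 - nu) * dV * d / (np.pi * (r**2 + d**2) ** 1.5)

# Reported figures: ~0.11 km^3 intruded, upper edge at ~3 km depth.
for r_km in (0.0, 2.0, 5.0, 10.0):
    print(f"r = {r_km:4.1f} km  uplift ~ {mogi_uplift(r_km, 3.0, 0.11):.2f} m")
```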

"At Mauna Loa, flank motion and eruptions are inherently related," said Varugu. "The influx of new magma started in 2014 after more than four years of seaward motion of the eastern flank - which opened up space in the rift zone for the magma to intrude."

The researchers also found that there was movement not associated with an earthquake along a near-horizontal fault under the eastern flank; however, no movement was detected under the western flank. This led the researchers to conclude that an earthquake under the western flank is due. Motions along near-horizontal faults under the flanks are essential features of long-term volcano growth.

Will the volcano erupt in the near future? "If magma influx continues, it is likely, but not required," said Varugu. "The topographic load is pretty heavy; the magma could also propagate laterally through the rift zone."

"An earthquake could be a game changer," said Falk Amelung, a professor at the UM Rosenstiel School's Department of Marine Geosciences and senior author of the study. "It would release gases from the magma comparable to shaking a soda bottle, generating additional pressure and buoyancy, sufficient to break the rock above the magma."

According to the researchers there are many uncertainties. Though the stress that was exerted along the fault is known, the magnitude of the earthquake will also depend on the size of the fault patch that will actually rupture. Additionally, there are no satellite data available to determine movements prior to 2002.

"It is a fascinating problem," said Amelung, "We can explain how and why the magma body changed during the past six years. We will continue observing and this will eventually lead to better models to forecast the next eruption site."

Standing 9 kilometers tall from the base on the seafloor to the summit, Mauna Loa is the largest volcano on Earth. In the 1950 eruption, it took only three hours for the lava to reach the Kona coast. Such rapid flows would leave very little time to evacuate people in the path of its lava. Another large eruption of Mauna Loa occurred in 1984.

The combination of earthquakes and eruptions is nothing unusual. The 1950 eruption was preceded by a magnitude 6.3 earthquake three days prior, and was followed by a magnitude 6.9 earthquake more than a year later. The 1984 eruption was preceded by a magnitude 6.6 earthquake 5 months prior.

The satellite data were acquired by the Italian Cosmo-Skymed satellites in the framework of the Geohazard Supersites and Natural Laboratories (GSNL) initiative of the Group on Earth Observation (GEO), an international umbrella organization to enhance the use of Earth Observation for societal benefits. Several space agencies pool their satellite resources to enable new studies of hazardous volcanoes. Other volcano supersites include the Icelandic, Ecuadorian and New Zealand volcanoes as well as Italy's Mt. Etna.

###


 

RMRS scientists recommend approach to adapt to uncertainty in wildland management

USDA FOREST SERVICE - ROCKY MOUNTAIN RESEARCH STATION

Research News

MISSOULA, Mont., May 24, 2021 -- Scientists from the Rocky Mountain Research Station collaborated to explore how research and management can confront increasing uncertainty due to climate change, invasive species, and land use conversion.

Wildland management and policy have long depended on the idea that ecosystems are fundamentally static, and periodic events like droughts are just temporary detours from a larger, stable equilibrium. However, ecosystems are currently changing at unprecedented rates. For example, bark beetle infestations, droughts, and severe wildfires have killed large numbers of trees across the western United States. In many cases, these changes may be irreversible.

In new research published in Frontiers in Forests and Global Change, Dr. Kevin McKelvey and colleagues from several Rocky Mountain Research Station science programs suggest ways for managers to respond. As ecosystems change in increasingly unpredictable ways, we will need more flexible and adaptive approaches to manage them. Rather than relying on knowledge of what ecosystems once looked like, we will need to learn and adjust to new conditions quickly. To achieve that goal, the authors recommend a more inclusive and collaborative governance model that would increase public and stakeholder participation, integrate research and management, and incorporate multiple forms of knowledge, including from indigenous communities. Such an approach, they argue, will encourage collaborative learning, increase trust in management, and allow for more efficient responses to change.

"Because all paths forward are fraught with uncertainties, limitations, and the likelihood that plans will fail due to unforeseen events, we need a much broader public not only involved in the decision process, but additionally to understand the limits both to knowledge and to achievable actions," said McKelvey.

The authors also explore priorities for research under this new model. For example, they note that rapidly changing ecosystems will increase our dependency on predictive modeling - and therefore will require better models. As a result, research should focus on collecting the types of data that support model development and validation.

Finally, the scientists offer a set of concrete recommendations to pivot toward accepting uncertainty - from conducting landscape-level assessments to focusing on retaining species that are resilient to disturbances.

"The implications for management and particularly for planning are profound," said McKelvey. "We need to vastly accelerate the planning process to keep pace with rapidly changing landscapes. We need much more local flexibility to find out what works and what doesn't. And we need to change the process for data collection and analyses - in many landscapes, 5-year-old data and analyses are already obsolete."

###

90% SUCCESS RATE
The Costly Success of Israel’s Iron Dome

The country’s missile-defense system tells a national story.


ADAM MAIDA / THE ATLANTIC / ABIR SULTAN / GETTY


In the 12 days that preceded Thursday’s announcement of a cease-fire, the Palestinian militant groups Hamas and Islamic Jihad launched 4,369 rockets of various sizes and ranges from Gaza toward Israel. According to Israel’s military, nearly two-thirds of these missed their target, hitting fields and other open areas, or malfunctioning and falling short. That still leaves about 1,500 rockets that headed for built-up areas. Remarkably, this barrage resulted in only a dozen deaths: More than 90 percent of the rockets were intercepted by Israel’s missile-defense system, Iron Dome.

If you’ve been watching coverage of the latest round of fighting in Gaza and Israel, you won’t have escaped the Iron Dome pyrotechnic display, especially astonishing at night as the rockets arcing northward from Gaza are picked out of the sky in a litany of mid-air explosions. When it was first established more than a decade ago, Iron Dome had its skeptics, both in Israel and abroad, but over time, they—and the world—have seen it work. Literally.

It is a system that was designed for the challenge facing Israel—specifically, organizations on its borders, such as Hamas in Gaza and Hezbollah in Lebanon, that do not have the personnel or firepower to invade and challenge Israel’s army, but that have accumulated large arsenals of rockets that, although rudimentary and inaccurate, can target most of a small country like Israel. Each Iron Dome battery protects a relatively small parcel of territory, but Israel now has sufficient mobile batteries to protect the areas that are threatened at times of tension.



This architecture is, however, just one of the ways in which Iron Dome is unique. In fact, its very strengths and weaknesses reflect those of the country that developed it, epitomizing Israel’s interminable conflict with the Palestinians.

“On the one hand, Iron Dome is the perfect example of Israeli ingenuity and improvisation,” the journalist Yaakov Katz, who co-wrote The Weapon Wizards, a book about Israel’s arms industry, told me. “But its very success is a reflection of Israel’s biggest problem. Iron Dome allows you to almost ignore the fact that you have a neighbor just across the border with thousands of rockets pointed at you, because they can no longer really harm you. Iron Dome allows you not to find deeper solutions for that problem. And that’s very Israeli as well.”


Iron Dome is incredibly popular among Israelis, and understandably so. Although Israel suffered a dozen fatalities during this month’s fighting, more than 240 Palestinians died. That discrepancy, largely due to the effectiveness of Iron Dome, is also borne out in the physical damage to homes, buildings, and infrastructure more broadly. Even during an intense conflict such as this one, the missile-defense system provides a sense of security.

But it also means many Israelis do not feel the urgency, or sufficient optimism, to press their leaders to solve the underlying problems causing the long-term crisis facing Gaza, where 2 million people live in a fetid, crowded coastal strip, under near-total blockade by Israel and Egypt since Hamas took over in 2007. Nor do many feel the need to address the wider historic conflict with the Palestinians that has been going on since before Israel’s founding in 1948. According to the pollster Dahlia Scheindlin, Israelis rank security first on their list of priorities, followed by financial concerns; resolving the conflict with the Palestinians typically ranks fifth or sixth, and is seen by Israelis as separate from the feeling of security. “You’ve got to ask yourself,” Scheindlin told me, if Israelis focus on security as defined by a piece of military hardware rather than on the core problem itself, “isn’t that a false sense of security?”

Much of what provides that sense of security is the visible deterrent that Iron Dome offers, cutting off rockets in the sky. What Israelis don’t see is the true heart of the system—not the interceptor missiles or the mobile batteries, but the mathematics. The algorithm that has been coded into the system, and that is constantly being improved upon, enables Iron Dome’s control center to track and predict the trajectories of incoming missiles, working out where they can be expected to fall, and issuing interception orders only if the point of impact is a built-up area, so as not to waste expensive interceptor rockets on harmless projectiles.
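
Publicly, that logic is known only at this level of description, but a toy version helps make it concrete: extrapolate each track to a predicted impact point, then fire only if that point falls in a protected area. The drag-free ballistic model and the zone map below are illustrative assumptions, far simpler than any real tracking system.

```python
import math

# Toy sketch of the decision logic described above: extrapolate a rocket's
# ballistic arc from its observed launch state, predict the impact point,
# and issue an intercept order only if that point lies in a built-up area.
# The drag-free model and the zone map are illustrative assumptions; the
# real system's tracking and classification are far more sophisticated.

G = 9.81  # gravitational acceleration, m/s^2

def predict_impact(x0_m, speed_ms, angle_deg):
    """Impact range (m) of a drag-free launch from downrange position x0_m."""
    theta = math.radians(angle_deg)
    return x0_m + speed_ms**2 * math.sin(2 * theta) / G

# Hypothetical built-up areas as (start_m, end_m) intervals downrange.
PROTECTED_ZONES = [(9_000, 12_000), (18_000, 21_000)]

def should_intercept(x0_m, speed_ms, angle_deg):
    impact = predict_impact(x0_m, speed_ms, angle_deg)
    return any(lo <= impact <= hi for lo, hi in PROTECTED_ZONES)

print(should_intercept(0, 450, 40))  # impact ~20.3 km -> True, intercept
print(should_intercept(0, 250, 45))  # impact ~6.4 km -> False, let it fall
```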

This level of calculation is also often attributed to Israel’s leader, Prime Minister Benjamin Netanyahu, who has been in office for the past 12 years. Netanyahu—much like Iron Dome itself—has been able to mask smaller failings in a greater success; he has, like the missile-defense system, warped the notion of how much time he and his country have to respond to threats; and he has, similar to Iron Dome, used technology to hide deep, structural societal flaws.

Take his response to the coronavirus pandemic. The whole world by now is aware that Israel was the fastest country to roll out COVID-19 vaccines and to have a majority of its population vaccinated. The program has been a tremendous success, and Netanyahu has sought to claim the political credit. What’s less known, or at least overlooked, is that before vaccines became available, he presided over a shambolic set of coronavirus policies. For long periods in 2020, Israel had the highest per capita rate of new reported infections in the world; only the comparatively low median age of its population kept the death toll down.

There are multiple reasons for Netanyahu’s COVID-19 failings. Because of political pressure, from both the Trump administration and special-interest groups within Israel, he was slow to shut off air travel to and from the United States, the source of most of his country’s early COVID-19 cases. And he refused to force Israel’s ultra-Orthodox community, his key political allies, who live in de facto autonomy within Israel, to abide by lockdown rules, allowing the virus to run rampant in their schools and synagogues.

And similar to how Iron Dome has changed how Israelis see time—in terms of how much of it they have to respond when projectiles are fired—Netanyahu has changed how they view time when it comes to the long-term prospects for resolving the conflict with the Palestinians. For decades, politicians and pundits at home and abroad have warned that Israel is running out of time to resolve the conflict—that international condemnation, pressure, and even boycotts and sanctions would isolate it; and that internally, it could not deal with the challenges of Palestinian population growth and resistance.


Netanyahu has insisted the opposite: that if Israel remains steadfast, the world, including Arab states, will give up on the Palestinian cause. He wrote in his 1993 book, A Place Among the Nations, that “for the foreseeable future the only kind of peace that will endure in the region between Arab and Arab and between Arab and Jew is the peace of deterrence,” and that such a strategy would have to suffice until the Arab world realized that it “stands to gain as much from making peace with Israel as Israel stands to gain from making peace with the Arabs.” Iron Dome is one of his tools for keeping the peace of deterrence and making time work in Israel’s favor.

And then there is Israel's—and Netanyahu’s—dependence on technology to make up for more intrinsic flaws. When early COVID-19 vaccines were about to be authorized by the United States, he bombarded Pfizer’s chief executive with dozens of phone calls to secure early shipments for Israel. Here, as with Iron Dome, Israel’s high-tech prowess came to his aid: Israel’s public health-care providers, which were to be in charge of administering the vaccine, have advanced digital medical records, and Netanyahu was able to offer Pfizer real-time data on how the vaccine was working in return for early shipments. In his office in Jerusalem, he now has two glass cases: In one is a model of an Iron Dome “Tamir” interceptor missile; in the other is the syringe that was used to inoculate him.

Yet behind this technological marvel is creaking national infrastructure and failing social services, for both Jewish and Arab citizens. That’s why, when the few rockets from Gaza did get through the Iron Dome shield this month, those who were killed were in nearly all cases old, disabled, poor, homeless, or residents of Arab villages without government services and therefore without bomb shelters. And while Israel’s air force simultaneously operated Iron Dome and kept up a steady rate of air strikes in Gaza throughout the recent campaign, within Israel’s cities there were insufficient police to deal with the riots that broke out between Arabs and Jews. Here we see another structural flaw in the Israeli state that Netanyahu has neglected.

With its remarkable success rate, Iron Dome is as close as possible to being the perfect defense system. It illustrates Israel’s remarkable technological prowess and the country's unwavering focus on the defense of its citizens. But Iron Dome's tremendous capabilities paper over more fundamental challenges—ones that Israel’s leader seems unwilling to resolve.

ANSHEL PFEFFER is a journalist based in Jerusalem for Ha’aretz. He is the author of Bibi: The Turbulent Life and Times of Benjamin Netanyahu.
A Clue to Why the 1918 Pandemic Came Back Stronger Than Before

Three 103-year-old lung samples hinted at how the flu mutated to become more deadly.

A converted warehouse that was used to isolate 1918 flu patients
UNIVERSAL HISTORY ARCHIVE / UNIVERSAL IMAGES GROUP / GETTY

The three teenagers—two boys and a girl—could not have known what clues their lungs would one day yield. All they could have known, or felt, before they died in Germany in 1918 was their flu-ravaged lungs failing them, each breath getting harder and harder. Tens of millions of people like them died in the flu pandemic of 1918; they happened to be three whose lungs were preserved by a farsighted pathologist.

A century later, scientists have now managed to sequence flu viruses from pea-size samples of the three preserved lungs. Together, these sequences suggest an answer to one of the pandemic’s most enduring mysteries: Why was the second wave, in late 1918, so much deadlier than the first wave, in the spring? These rediscovered lung samples hint at the possibility that the virus itself changed to better infect humans.

This might sound familiar. The no-longer-so-novel coronavirus is also adapting to its human host. With modern tools, scientists are tracking the virus’s evolution in real time and finding mutations that have made the virus better at infecting us. More than 1.4 million coronavirus genomes have now been sequenced. But the database for the 1918 flu is much smaller—so much so that the comparison feels unfair. This new study brings the number of complete 1918 flu genomes to a grand total of three, plus some partial genomes.

Hundred-year-old lung tissue is incredibly hard to find. Sébastien Calvignac-Spencer, a virologist at the Robert Koch Institute, in Berlin, came across the samples in this newest study in a stroke of luck. A couple of years ago, he decided to investigate the collections of the Berlin Museum of Medical History of the Charité. He wasn’t looking for anything in particular, but he soon stumbled upon several lung specimens from 1918, a year he of course recognized as a notable one for respiratory disease. Despite the flu pandemic’s notoriety, the virus that caused it is still poorly understood. “I thought, Well, okay, so it’s right here in front of you. Why don’t you give it a try?” he told me. Why not try to sequence influenza from these lungs? (This work is not dangerous: The chemically preserved lung specimens do not contain intact or infectious virus; sequencing picks up just fragments of the virus’s genetic material.)

Calvignac-Spencer and his colleagues ultimately tested 13 lung specimens and found evidence of flu in three. One was from a 17-year-old girl who died in Munich sometime in 1918. The two others were from teenage soldiers who both died in Berlin on June 27, 1918. This work is described in a new preprint, which has not yet been peer-reviewed.

The team was able to recover a complete flu-virus genome from the 17-year-old girl’s lung tissue—only the third ever found. The two other full 1918 flu genomes both came from the United States, from the lungs of a woman buried in Alaska and from a paraffin-wax-embedded lung sample of a soldier who died in New York. With another genome in hand, the researchers moved to investigate how the viruses differed. Several changes showed up in the flu’s genome-replication machinery, a potential evolutionary hot spot because better replication means a more successful virus. The team then copied just the replication machinery of the 17-year-old’s virus—not the entire virus—into cells and found it was only half as active as that of the flu virus found in Alaska.

The obvious caveats should apply here: tiny sample size, the limits of extrapolating from test tube to human body. The exact date of the girl’s death in 1918 is also unknown, but this finding hints at the possibility that the virus’s behavior did change during the pandemic. Scientists have long speculated about why the 1918 pandemic’s second wave was deadlier than the first. Patterns of human behavior and seasonality could explain some of the difference—but the virus itself might have changed too. “And this starts to put some meat on the bone” of that hypothesis, Andrew Mehle, an influenza researcher at the University of Wisconsin at Madison, who was not involved in the study, told me.

The lungs of the two young soldiers in Berlin provide another clue. The teenagers’ June 1918 deaths were squarely in the pandemic’s first wave. These two samples yielded only partial genomes, but the team was able to reconstruct enough to home in on changes in nucleoprotein, one of the proteins that make up the virus’s replication machinery. Nucleoproteins act like scaffolds for the virus’s gene segments, which wind around the protein like a spiral staircase. They are also extremely distinctive, which can be a weakness: The human immune system is very good at recognizing and sabotaging them.

Indeed, the 1918 flu virus’s nucleoprotein seems to have mutated between the first and second waves to better evade the human immune system. The first-wave viruses’ nucleoproteins looked a bit like those in flu viruses that infect birds—which makes sense because scientists suspect that the 1918 flu originated in birds. But bird viruses are attuned to bird bodies. “When it jumps to humans, the virus is not evolved to be optimally resistant” to the human immune system, Jesse Bloom, a virologist at Fred Hutchinson Cancer Research Center, in Seattle, told me. Bloom and others have identified specific mutations that make the nucleoprotein better at resisting the human immune system. The first-wave flu viruses did not have them, but the second-wave ones did, possibly because they had had the time to adapt to infecting humans.

This mutation-by-mutation analysis of the 1918 flu virus would have been impossible to imagine at the time of the pandemic. Doctors then hadn’t even figured out that influenza was caused by a virus. “There’s no way the individual who saved these samples in 1918 had any idea of what could be done to them,” Mehle said. “To my mind, this is a beautiful example of fundamental research.” Without the pathologist who painstakingly preserved these samples and the museums that kept them for decades before science caught up, our understanding of the 1918 flu would be all the poorer.

Unfortunately, many historical samples have been lost as pathology collections have fallen out of fashion over the past century. “If we had started these kinds of studies in the ’60s, we would have had no problems finding thousands and thousands of specimens,” Calvignac-Spencer said. “And now we’re really fighting to assemble a collection of 20.” He’s been in touch with more than 50 museum collections around the world in the hunt for more pandemic-flu samples. He recently found one from Australia, but the work is slow. Calvignac-Spencer has also looked for other viruses, including measles, which he and his colleagues previously found in a 100-year-old lung from the same medical collection in Berlin.

The further back in time researchers must go, the harder the samples are to find—but Bloom told me he’s especially intrigued by the possibility of finding pre-1918 flu genomes in the archives. When the 1918 pandemic swept through the world, it apparently completely replaced whatever flu existed before. Its modern-day descendants continue to infect us today as seasonal flu. In this way, the 1918 flu is familiar to us and our immune systems. What came before is still a mystery.