Friday, April 15, 2022

Global decline in nitrogen availability has consequences for many natural ecosystems


Peer-Reviewed Publication

AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS)

Declining nitrogen availability in many terrestrial ecosystems has widespread consequences for biodiversity and ecosystem functioning worldwide. In a Review, Rachel Mason and colleagues discuss the extent and consequences of nitrogen (N) decline, the human factors potentially driving it, and what might be done to help mitigate its damaging effects. “Akin to trends in atmospheric [carbon dioxide] and global temperatures, large-scale declines in N availability are likely to present long-term challenges that will require informed management and policy actions in the coming decade,” write Mason et al.

Nitrogen is essential to life on Earth – it’s a key component of the proteins required to support the growth of plants and the animals that feed on them. N availability in the environment therefore strongly influences the structure and function of many ecosystems. While humans have more than doubled the global supply of reactive N through industrial and agricultural activities over the last century, much of this input has occurred in urban and agrarian areas. In areas not subject to anthropogenic N enrichment, however, a growing body of research suggests that N availability is declining in many terrestrial ecosystems worldwide. According to the authors, multiple environmental changes may be driving these declines, including elevated atmospheric carbon dioxide, rising global temperatures, and altered precipitation and ecosystem disturbance regimes.

Reduced N availability can lower primary productivity in these ecosystems and degrade the diets of herbivores such as insects, with farther-reaching impacts at higher trophic levels. Mason et al. highlight several ways that N decline can be monitored and mitigated but note that continued research is needed to inform management actions. “Given the potential implications of declining N availability for food webs, carbon sequestration, and other ecosystem functions and services, it is important that research, management, and policy actions be taken before the consequences of declining N availability become more severe,” write Mason et al.

Researchers find declining nitrogen availability in a nitrogen-rich world


Peer-Reviewed Publication

UNIVERSITY OF MARYLAND CENTER FOR ENVIRONMENTAL SCIENCE

ANNAPOLIS, MD (April 15, 2022)—Since the mid-20th century, research and discussion have focused on the negative effects of excess nitrogen on terrestrial and aquatic ecosystems. However, new evidence indicates that the world is now experiencing a dual trajectory in nitrogen availability, with many areas experiencing a hockey-stick-shaped decline in the availability of nitrogen. In a new review paper in the journal Science, researchers describe the causes of these declines and their consequences for how ecosystems function.

“There is both too much nitrogen and too little nitrogen on Earth at the same time,” said Rachel Mason, lead author on the paper and former postdoctoral scholar at the National Socio-environmental Synthesis Center.

Over the last century, humans have more than doubled the total global supply of reactive nitrogen through industrial and agricultural activities. This nitrogen becomes concentrated in streams, inland lakes, and coastal bodies of water, sometimes resulting in eutrophication, low-oxygen dead-zones, and harmful algal blooms. These negative impacts of excess nitrogen have led scientists to study nitrogen as a pollutant. However, rising carbon dioxide and other global changes have increased demand for nitrogen by plants and microbes. In many areas of the world that are not subject to excessive inputs of nitrogen from people, long-term records demonstrate that nitrogen availability is declining, with important consequences for plant and animal growth.

Nitrogen is an essential element in proteins and, as such, its availability is critical to the growth of plants and the animals that eat them. Gardens, forests, and fisheries are almost all more productive when they are fertilized with moderate amounts of nitrogen. If plant nitrogen becomes less available, plants grow more slowly and their leaves are less nutritious to insects, potentially reducing the growth and reproduction not only of insects but also of the birds and bats that feed on them.

 

“When nitrogen is less available, every living thing holds on to the element for longer, slowing the flow of nitrogen from one organism to another through the food chain. This is why we can say that the nitrogen cycle is slowing down,” said Andrew Elmore, senior author on the paper and a professor of landscape ecology at the University of Maryland Center for Environmental Science and at the National Socio-environmental Synthesis Center.

Researchers reviewed long-term, global and regional studies and found evidence of declining nitrogen availability. For example, grasslands in central North America have been experiencing declining nitrogen availability for a hundred years, and cattle grazing these areas have had less protein in their diets over time. Meanwhile, many forests in North America and Europe have been experiencing nutritional declines for several decades or longer.

These declines are likely caused by multiple environmental changes, one being elevated atmospheric carbon dioxide levels. Atmospheric carbon dioxide has reached its highest level in millions of years, and terrestrial plants are exposed to about 50% more of this essential resource than just 150 years ago. Elevated atmospheric carbon dioxide fertilizes plants, allowing faster growth but diluting plant nitrogen in the process, leading to a cascade of effects that lower the availability of nitrogen. On top of increasing atmospheric carbon dioxide, warming and disturbances, including wildfire, can also reduce nitrogen availability over time.

Declining nitrogen availability is also likely constraining the ability of plants to remove carbon dioxide from the atmosphere. Currently global plant biomass stores nearly as much carbon as is contained in the atmosphere, and biomass carbon storage increases each year as carbon dioxide levels increase. However, declining nitrogen availability jeopardizes the annual increase in plant carbon storage by imposing limitations to plant growth. Therefore, climate change models that currently attempt to estimate carbon stored in biomass, including trends over time, need to account for nitrogen availability.

“The strong indications of declining nitrogen availability in many places and contexts is another important reason to rapidly reduce our reliance on fossil fuels,” said Elmore. “Additional management responses that could increase nitrogen availability over large regions are likely to be controversial, but are clearly an important area to be studied.”

In the meantime, the review paper recommends that data be assembled into an annual state-of-the-nitrogen-cycle report, or a global map of changing nitrogen availability, which would represent a comprehensive resource for scientists, managers, and policy-makers.

"Evidence, Causes, and Consequences of Declining Nitrogen Availability in Terrestrial Ecosystems” was published in Science.

UNIVERSITY OF MARYLAND CENTER FOR ENVIRONMENTAL SCIENCE
The University of Maryland Center for Environmental Science (UMCES) is a leading research and educational institution working to understand and manage the world’s resources. From a network of laboratories spanning from the Allegheny Mountains to the Atlantic Ocean, UMCES scientists provide sound advice to help state and national leaders manage the environment and prepare future scientists to meet the global challenges of the 21st century.

# # #

A multi-institutional research team finds declining nitrogen availability in a nitrogen-rich world

Factoring this deficit into climate change models is critical to achieving accurate carbon sink capacity estimates

Peer-Reviewed Publication

ADVANCED SCIENCE RESEARCH CENTER, GC/CUNY

Nitrogen Deficiency

IMAGE: CHANGES IN THE NITROGEN CYCLE CAN BE DETECTED BY MONITORING ECOSYSTEM NITROGEN INPUTS, INTERNAL SOIL NITROGEN CYCLING, PLANT NITROGEN STATUS AND NITROGEN LOSSES.

CREDIT: RACHEL MASON

NEW YORK, April 15, 2022 – Since the mid-20th century, research and discussion have focused on the negative effects of excess nitrogen on terrestrial and aquatic ecosystems. However, new evidence indicates that the world is now experiencing a dual trajectory in nitrogen availability. Following years of attention to surplus nitrogen in the environment, our evolving understanding has led to new concerns about nitrogen insufficiency in areas of the world that do not receive significant inputs of nitrogen from human activities. In a new review paper, "Evidence, Causes, and Consequences of Declining Nitrogen Availability in Terrestrial Ecosystems,” in the journal Science, a multi-institutional team of researchers describes the causes of declining nitrogen availability and how it affects ecosystem function.

“There is both too much nitrogen and too little nitrogen on Earth at the same time,” said Rachel Mason, lead author on the paper and former postdoctoral scholar at the National Socio-Environmental Synthesis Center.

Over the last century, humans have more than doubled the global supply of reactive nitrogen through industrial and agricultural activities. This nitrogen becomes concentrated in streams, inland lakes, and coastal bodies of water, sometimes resulting in eutrophication, low-oxygen dead zones, and harmful algal blooms. These negative impacts of excess nitrogen have led scientists to study nitrogen as a pollutant. However, rising carbon dioxide and other global changes have increased demand for nitrogen by plants and microbes, and the research team’s newly published paper demonstrates that nitrogen availability is declining in many regions of the world, with important consequences for plant growth.

“These results show how the world is changing in complex and surprising ways,” said Peter Groffman, a co-author on the paper and a professor with the Advanced Science Research Center at the CUNY Graduate Center’s Environmental Science Initiative. “Our findings show the importance of having long-term data as well as focused synthesis efforts to understand these changes and the implications they have for ecosystem and human health and well-being.”

Researchers reviewed long-term global and regional studies and found evidence of declining nitrogen availability caused by multiple environmental changes, one being elevated atmospheric carbon dioxide levels. Atmospheric carbon dioxide has reached its highest level in millions of years, and terrestrial plants are exposed to about 50% more of this essential resource than just 150 years ago. Elevated atmospheric carbon dioxide fertilizes plants, allowing faster growth but diluting plant nitrogen in the process. These processes have been observed in experiments that artificially elevate carbon dioxide in the air around plants, and there is now evidence that plants in natural settings are responding in the same way.

Nitrogen is an essential element for plants and the animals that eat them. Gardens, forests, and fisheries are all more productive when they are fertilized with nitrogen. If plant nitrogen becomes less available, trees grow more slowly and their leaves are less nutritious to insects, potentially reducing growth and reproduction, not only of insects, but also the birds and bats that feed on them.

“When nitrogen is less available, every living thing holds on to the element for longer, slowing the flow of nitrogen from one organism to another through the food chain. This is why we can say that the nitrogen cycle is seizing up,” said Andrew Elmore, senior author on the paper, and a professor of landscape ecology at the University of Maryland Center for Environmental Science and at the National Socio-Environmental Synthesis Center.

On top of increasing atmospheric carbon dioxide, rising global temperatures also affect plant and microbial processes associated with nitrogen supply and demand. Warming often improves conditions for growth, which can result in longer growing seasons, leading plant nitrogen demand to exceed the supply available in soils. Disturbances, including wildfires, can also remove nitrogen from systems and reduce availability over time.

Nitrogen is an essential element for plant growth and its declining availability has the potential to constrain the ability of plants to remove carbon dioxide from the atmosphere. Currently, global plant biomass stores nearly as much carbon as is contained in the atmosphere, and biomass carbon storage increases each year. To the extent plant storage of carbon reduces atmospheric carbon dioxide, it contributes to reductions in the global warming potential of the atmosphere. However, declining nitrogen availability jeopardizes the annual increase in plant carbon storage by imposing limitations to plant growth. Therefore, climate change models that attempt to estimate carbon stored in biomass, including trends over time, need to account for nitrogen availability.

“Despite strong indications of declining nitrogen availability in many places and contexts, spatial and temporal patterns are not yet well enough understood to efficiently direct global management efforts,” said Elmore. In the future, these data could be assembled into an annual state of the nitrogen cycle report or a global map of changing nitrogen availability that would represent a comprehensive resource for scientists, managers, and policy-makers.

 

About the Advanced Science Research Center

The Advanced Science Research Center at the CUNY Graduate Center (CUNY ASRC) is a world-leading center of scientific excellence that elevates STEM inquiry and education at CUNY and beyond. The CUNY ASRC’s research initiatives span five distinctive, but broadly interconnected disciplines: nanoscience, photonics, neuroscience, structural biology, and environmental sciences. The center promotes a collaborative, interdisciplinary research culture where renowned and emerging scientists advance their discoveries using state-of-the-art equipment and cutting-edge core facilities.

About The Graduate Center of The City University of New York

The CUNY Graduate Center is a leader in public graduate education devoted to enhancing the public good through pioneering research, serious learning, and reasoned debate. The Graduate Center offers ambitious students nearly 50 doctoral and master’s programs of the highest caliber, taught by top faculty from throughout CUNY — the nation’s largest urban public university. Through its nearly 40 centers, institutes, initiatives, and the Advanced Science Research Center, the Graduate Center influences public policy and discourse and shapes innovation. The Graduate Center’s extensive public programs make it a home for culture and conversation.

###

 

 

Open sharing of biotechnology research – transparency versus security

Peer-Reviewed Publication

PLOS


IMAGE: RESEARCHERS OUTLINE HOW TO BALANCE THE OPEN SHARING OF BIOTECH RESEARCH WITH TRANSPARENCY AND SECURITY

CREDIT: SANGHARSH LOHAKARE, UNSPLASH (CC0, HTTPS://CREATIVECOMMONS.ORG/PUBLICDOMAIN/ZERO/1.0/)

As biotechnology advances, the risk of accidental or deliberate misuse of biological research, such as viral engineering, is increasing. At the same time, “open science” practices like the public sharing of research data and protocols are becoming widespread. An article publishing April 14th in the open access journal PLOS Biology by James Smith and Jonas Sandbrink at the University of Oxford, UK, examines how open science practices intersect with the risks of misuse and proposes solutions to the problems identified.

The authors grapple with a critically important issue that emerged with the advent of nuclear physics: how the scientific community should react when two values – security and transparency – are in conflict. They argue that in the context of viral engineering, open code, data, and materials may increase the risk of the release of enhanced pathogens. Openly available machine learning models could reduce the amount of time needed in the laboratory and make pathogen engineering easier.

To mitigate such catastrophic misuse, mechanisms that ensure responsible access to relevant dangerous research materials need to be explored. In particular, to prevent the misuse of computational tools, controlling access to software and data may be necessary.

Preprints, which have become widely used during the pandemic, make it difficult to prevent the spread of risky information at the publication stage. In response, the authors argue that oversight needs to take place earlier in the research lifecycle. Lastly, Smith and Sandbrink highlight that research preregistration, a practice promoted by the open science community to increase research quality, may offer an opportunity to review and mitigate research risks.

“In the face of increasingly accessible methods for the creation of possible pandemic pathogens, the scientific community needs to take steps to mitigate catastrophic misuse,” say Smith and Sandbrink. “Risk mitigation measures need to be fused into practices developed to ensure open, high-quality, and reproducible scientific research. To make progress on this important issue, open science and biosecurity experts need to work together to develop mechanisms to ensure responsible research with maximal societal benefit.”

The authors propose several of those mechanisms, and hope that the research will spur innovation in this critically important yet critically neglected area. They show that science cannot be just open or closed: there are intermediate states that need to be explored, and difficult trade-offs touching on core scientific values may be needed. “In contrast to the strong narrative towards open science that has emerged in recent years, maximizing societal benefit of scientific work may sometimes mean preventing, rather than encouraging, its spread,” they conclude.

#####

In your coverage, please use this URL to provide access to the freely available paper in PLOS Biology:   http://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3001600

Citation: Smith JA, Sandbrink JB (2022) Biosecurity in an age of open science. PLoS Biol 20(4): e3001600. https://doi.org/10.1371/journal.pbio.3001600

Author Countries: United Kingdom

Funding: JAS received support from the Effective Altruism Funds programme via the Long Term Future Fund (https://funds.effectivealtruism.org/funds/far-future). JAS’s postdoctoral position is funded by the Oxford National Institute for Health Research Biomedical Research Centre (https://oxfordbrc.nihr.ac.uk/). JBS's doctoral research is funded by Open Philanthropy (https://www.openphilanthropy.org/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

For climate change mitigation, bipartisan politics can work

CU Boulder study: When Democrats and Republicans unite, more climate bills pass

Peer-Reviewed Publication

UNIVERSITY OF COLORADO AT BOULDER

In an increasingly polarized nation, cooperation across party lines is key to sustained climate mitigation in the United States, according to a new CIRES study. To sustain climate progress over decades, bipartisan cooperation on solutions like renewable energy or emissions reduction will be necessary, the authors say. 

“In the long run, climate change mitigation will only be successful with public and political unity behind it,” said Renae Marshall, a former CIRES researcher, now a Ph.D. student at the University of California, Santa Barbara, and lead author on the study, out now in the current issue of Climatic Change. “Our study found that bipartisanship can help create working climate mitigation strategies in state-level contexts.”

Marshall and her coauthor Matt Burgess, CIRES Fellow and CU Boulder Assistant Professor in Environmental Studies and Economics, analyzed 418 state-government enacted bills and 450 failed bills aimed at reducing emissions from 2015 to 2020, to identify the political contexts in which they were passed or defeated. 

The duo found that while two-thirds of the climate-related bills passed in Democrat-controlled legislatures, a full one-third passed in Republican-controlled legislatures.

Additionally, about a third of all analyzed bills had cosponsors from both major parties, suggesting there are still opportunities for bipartisanship.

Marshall and Burgess found that bipartisan or Republican-led bills favored financial incentives for renewable energy, as well as legislation that expands consumer choices—whereas Democrat-led bills favored those that restricted choice, such as mandatory renewable energy and emissions standards. “Key bipartisan opportunities at the state level are policies that not only provide financial incentives (such as renewable energy system tax credits), but also have an element of expanding opportunities for businesses and consumers to take part in the renewable transition (creating new consumer protections and financing options or allowing new sources of energy to participate in the marketplace),” said Marshall.

For example: Georgia found success when the 2015 Solar Power Free-Market Finance Act lifted restrictions that had previously kept the solar market from growing by allowing individuals and businesses to participate in lease financing agreements.

More bipartisan bills were proposed in ‘divided’ states (like Kentucky or New Hampshire) compared to Democrat- or Republican-dominated states, the team said, suggesting that equal representation on both sides of the political spectrum creates a better environment for cooperation on climate bills. 

“The more polarized we get, the more of a barrier there is. Working together across party lines is the solution,” said Marshall. “Bipartisanship opens up new opportunities to find common ground and dive into sustained climate initiatives.” 

The study grew out of Marshall’s undergraduate Honors thesis in Environmental Studies at CU Boulder, which earned her recognition as the College of Arts and Sciences Outstanding Graduate in 2021.

“Bipartisanship has its challenges—but it’s worth it,” Burgess said. “Not only will more successful bipartisan bills increase the number of climate mitigation strategies put in place, bipartisanship in climate decisions might actually help shrink the political polarization in our country as a whole. Previous research has found that working towards shared goals reduces inter-group conflict.”

Racial and ethnic disparities in telemedicine usage persist during pandemic

UH study finds minorities dealing with access to care issues

Peer-Reviewed Publication

UNIVERSITY OF HOUSTON

Historical data shows minorities have long faced obstacles to getting the critical health care services they need. When COVID-19 arrived two years ago, telemedicine emerged with the promise of better access to care through virtual delivery of clinical services and consultations.

But according to a new study led by the University of Houston College of Medicine and published in the Journal of General Internal Medicine, the rapid implementation of telemedicine didn’t bridge the gap as much as people had hoped.

“We found that racial and ethnic disparities persisted,” said lead study author Omolola Adepoju, a clinical associate professor at the UH College of Medicine and director of research at the Humana Integrated Health Sciences Institute at UH. “This suggests that the promise of the positive impact of telemedicine on health care use and health outcomes could elude underserved populations.”

Adepoju partnered with Lone Star Circle of Care, a federally qualified health center (FQHC) that caters to indigent, uninsured and underinsured, mostly minority populations, to examine what was driving those disparities. The research team examined electronic medical records from 55 individual clinics in 6 different counties in Texas.

“Our main finding was African Americans were 35% less likely to use telemedicine compared to whites,” Adepoju said. “And Hispanics were 51% less likely to use it.”

The reason, the study found, was a huge digital divide.

“The people who really need to access their primary care providers might be cut out [of telemedicine] because they don’t have the technology or might not know how to use it,” Adepoju said.

According to Adepoju, only one in four families earning $30,000 or less have smart devices, such as a phone, tablet, or laptop, compared to nearly three in four families earning $100,000 or more. And only 66% of African American and 61% of Hispanic households have access to broadband internet compared to 79% of white households.

The study also found that individuals younger than 18 and older adults were less likely to have a telemedicine visit than non-elderly adults, as were those covered by Medicaid or uninsured.

Another factor that played a role was how far someone lived from a clinic.

“We observed a dose-response to geographic distance so that the further a patient lived, the higher the likelihood of telemedicine use,” Adepoju said. “The type of visit, whether for an acute or non-acute condition, was also associated with telemedicine use. Non-acute visits were more likely to be conducted via telemedicine.”
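Associations like these are typically estimated with logistic regression on visit-level records, with exponentiated coefficients read as odds ratios (the scale behind statements such as “51% less likely”). The sketch below is a minimal illustration of that approach using simulated data and invented variable names; it is not the study’s actual model.

```python
# A minimal sketch, not the study's actual model: logistic regression of
# telemedicine use on distance to clinic and visit type, the kind of
# analysis that yields "X% more/less likely" odds-ratio comparisons.
# All data are simulated and all variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
distance_km = rng.uniform(0, 50, n)   # patient's distance from the clinic
acute = rng.integers(0, 2, n)         # 1 = acute visit, 0 = non-acute

# Simulate the reported pattern: telemedicine use rises with distance
# and falls for acute visits.
log_odds = -0.5 + 0.04 * distance_km - 0.8 * acute
telemed = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

df = pd.DataFrame({"telemed": telemed, "distance_km": distance_km,
                   "acute": acute})
fit = smf.logit("telemed ~ distance_km + acute", data=df).fit(disp=False)
print(np.exp(fit.params))  # odds ratios: values below 1 mean "less likely"
```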

Despite the recent easing of COVID-19 restrictions and people returning to more in-person care, telemedicine is here to stay. The hope, according to Adepoju, is that minorities will be better educated and equipped to take advantage of it.

But they’ll need someone who can walk them through it to ensure their appointments are meaningful.

“Clinics will need a technology support system,” she said. “A staff that conducts pre-visit device and connectivity testing with patients can be instrumental to helping patients maximize telemedicine as an access-to-care option.”

Going beyond the limit: WVU researcher develops novel exposure assessment statistical methods for Deepwater Horizon oil spill study

Peer-Reviewed Publication

WEST VIRGINIA UNIVERSITY

Deepwater Horizon Beach Cleanup 

IMAGE: THE 2010 DEEPWATER HORIZON OIL SPILL INVOLVED OVER 9,000 VESSELS DEPLOYED IN THE GULF OF MEXICO WATERS ACROSS ALABAMA, FLORIDA, LOUISIANA AND MISSISSIPPI AND TENS OF THOUSANDS OF WORKERS ON THE WATER AND ON LAND.

CREDIT: SUBMITTED PHOTO/NIEHS

Nearly 12 years after the Deepwater Horizon oil spill, scientists are still examining the potential health effects on workers and volunteers who experienced oil-related exposures.

To help shape future prevention efforts, one West Virginia University researcher – Caroline Groth, assistant professor in the School of Public Health’s Department of Epidemiology and Biostatistics – has developed novel statistical methods for assessing airborne exposure. Working with collaborators from multiple institutions, Groth has made it possible for researchers to characterize oil spill exposures in greater detail than has ever been done before.

Because very few Ph.D. biostatisticians work in occupational health, few appropriate statistical methodologies existed for assessing inhalation exposures in the GuLF STUDY, a study launched by the National Institute of Environmental Health Sciences shortly after the Deepwater Horizon oil spill. The purpose of the study, the largest ever conducted following an oil spill, is to examine the health of people involved in the response and clean-up efforts. Groth was part of the exposure assessment team, led by Patricia Stewart and Mark Stenzel, that was tasked with characterizing worker exposures.

Groth’s statistical methods, which she began developing in 2012, laid the groundwork for a crucial step: determining whether there are associations between exposures and health outcomes from the oil spill and clean-up work, which involved over 9,000 vessels deployed in Gulf of Mexico waters across Alabama, Florida, Louisiana and Mississippi and tens of thousands of workers on the water and on land.

The Deepwater Horizon oil spill is considered the largest marine oil spill in the history of the U.S.

“Workers were exposed differently based on their activities, time of exposure, etc., and our research team’s goal was to develop exposure estimates for each of those scenarios and then link them to the participants’ work history through an ‘exposure matrix,’” Groth said.

These methods make it possible for other researchers to estimate individuals’ levels of exposure and link them to their health outcomes.

Additionally, Groth uncovered a new way of accounting for exposures that instruments cannot detect. The threshold below which exposures cannot be detected is referred to as the LOD, or limit of detection. Groth’s methods go beyond that limit, accounting for the uncertainty in exposure measurements below the LOD.

“Basically, what happens is the instrument reports undetectable, or ‘zero,’” Groth explained. “Previously, less reliable approaches were likely used, such as replacing the measurement with a single value or forecasting. Those approaches do not consider actual variability in the data, which, if not accounted for, can lead to inaccurate exposure estimates. However, we know with certainty these values cannot be ‘zero.’

“We know it’s between that threshold and zero, and there is likely variability in these measurements that we should account for. Our methods allow us to account for this variability and get a quantitative estimate of concentration.”
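As a concrete illustration of the idea, the sketch below fits a lognormal exposure distribution by maximum likelihood, letting each non-detect contribute the probability of falling below the LOD rather than a substituted single value. It is a simplified toy with hypothetical numbers, not the GuLF STUDY’s published methodology.

```python
# Minimal sketch of censored-data estimation: fit a lognormal exposure
# distribution when some measurements fall below the limit of detection
# (LOD). Detected values contribute their density; non-detects contribute
# P(X < LOD). The LOD and all data here are hypothetical.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(42)
true_gm, true_gsd = 0.5, np.exp(0.8)  # geometric mean (ppm) and geometric SD
sample = rng.lognormal(np.log(true_gm), np.log(true_gsd), size=200)

lod = 0.3                            # hypothetical limit of detection
detected = sample[sample >= lod]
n_nondetect = np.sum(sample < lod)   # instrument reported "undetectable"

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)        # keeps sigma positive during the search
    ll = stats.lognorm.logpdf(detected, s=sigma, scale=np.exp(mu)).sum()
    ll += n_nondetect * stats.lognorm.logcdf(lod, s=sigma, scale=np.exp(mu))
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"estimated GM: {np.exp(mu_hat):.2f} ppm (true {true_gm}), "
      f"GSD: {np.exp(sigma_hat):.2f} (true {true_gsd:.2f})")
```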

Her findings, along with her team’s, were recently published in the Annals of Work Exposures and Health. Dale Sandler, chief of the Epidemiology Branch and senior investigator at the National Institute of Environmental Health Sciences, said the efforts of Groth – who served as primary author for two of the manuscripts and co-author for eight published manuscripts – and her colleagues have opened new doors.

“The Gulf Long-term Follow-up Study is larger and more long-term than research on other oil spills, but the major defining feature of the study is the level of detail on potential oil-spill exposures and the extensive efforts made to characterize the exposures of those who helped to clean up following this environmental and potential public health disaster,” Sandler, principal investigator of the study, said. “Dr. Groth, who has played a key role in characterizing the chemical exposures of persons participating in the GuLF Study, and her colleagues have allowed us to characterize respiratory exposures to a broad class of chemicals resulting from the oil spill.” 

Research continues, with plans to follow these workers for additional health effects to determine whether any exposures were associated with detrimental health outcomes. Both Groth and Sandler see the effort as an important step toward identifying factors that contribute to long-term safety and health.

Sandler added, “This will help us identify links between specific exposures and health effects and could help us identify targets for future prevention efforts.”

Factors including extreme winds, topography and vegetation influenced the severity of burns from Oregon's devastating 2020 megafires

First-of-its-kind study examines the influence of factors that contributed to patterns of high-burn severity during the 2020 megafires in Oregon

Peer-Reviewed Publication

PORTLAND STATE UNIVERSITY

False-color satellite imagery of the Riverside fire (with fire perimeter added) 

IMAGE: SATELLITE IMAGE OF THE RIVERSIDE FIRE IN OREGON, 2020.

CREDIT: CODY EVERS

In early September 2020, severe winds, high heat, and prolonged drought conditions led to the explosive growth of wildfires along the western slopes of the Cascade Mountains in the Pacific Northwest. The fires engulfed enormous tracts of forestland, destroyed communities, took dozens of lives, and cost hundreds of millions of dollars to fight.

In a first-of-its-kind study examining burn patterns from the 2020 Labor Day fires, researchers at Portland State University studied the influence of weather, topography, vegetation and other factors on burn severity in areas where the fires killed more than 75% of the trees. Their research confirms that extreme winds over the Labor Day holiday were the primary driver of the fires’ destructive force, but it also demonstrates that forest vegetation structure (e.g., canopy height, the age of trees) and topography played a significant role in burn severity patterns.

The paper, "Extreme Winds Alter Influence of Fuels and Topography on Megafire Burn Severity in Seasonal Temperate Rainforests under Record Fuel Aridity," was recently published in the journal Fire.

According to the study's co-author, Andrés Holz, associate professor of geography at Portland State, the wet temperate forests of the Cascade Mountains in the Pacific Northwest have a history of experiencing megafires of the scale of those that burned in 2020, but none had occurred since the early twentieth century. Because the scope and scale of the burns were unprecedented in modern times, they provided the research team a unique opportunity to gain a better understanding of the factors that influence the high severity of burns in these rainforests, including those on the western slopes of the Cascades. That understanding can inform planning for future land-use management in forestlands and the social and ecological impacts of extreme fire events in the context of a warming planet.

The research team developed maps for the extent and burn severity for five megafires and examined fire activity over two time periods: September 7-9, 2020, during which extreme winds fueled the explosive growth of the fires, and September 10-17, 2020, during which the fires continued burning under calm wind conditions. They then examined how the forest structure and topography influenced high-burn severity patterns, whether winds affected the relationship between those factors, and how high burn severity was affected by land management practices associated with land ownership.

"90% of the burning occurred during high winds," said Dr. Cody Every, a Research Associate in the Department of Environmental Science and Management at Portland State and the study's lead author. "But we also found that vegetation structure and canopy height were significant in determining where the fire burned more severely."

The research team found that areas with younger trees and low canopy height and cover were particularly susceptible to high mortality rates. As Holz pointed out, this finding is of particular consequence to lumber production in the state, where trees grown on plantations are typically younger, uniformly spaced and located near communities and critical infrastructure.

Drawing on the historical record, the team, which included Portland State researchers Dr. Sebastian Busby and Associate Professor Max Nielsen-Pincus, also suggests that wildfire managers should anticipate re-burns in some areas affected by the 2020 megafires. Recently burned forests typically have higher flammability than unburned areas until the younger forest canopy closes again and finer fuels are shaded.

Given the composition of the temperate rainforests of the western slopes of the Cascade Mountains, where fuel proliferates, and the relationships between factors that contribute to megafires, the research team suggests that treatments such as prescribed fires and fuel reduction are not a practical approach to preventing future conflagrations. Instead, the team argues for promoting resilient forests, increasing community preparedness, mounting early suppression responses, and hardening infrastructure.

Seafloor spreading has been slowing down

Sluggish spreading rates could mean a drop in greenhouse gas emissions from volcanoes

Peer-Reviewed Publication

AMERICAN GEOPHYSICAL UNION

WASHINGTON—A new global analysis of the last 19 million years of seafloor spreading rates found they have been slowing down. Geologists want to know why the seafloor is getting sluggish.

New oceanic crust forms continuously along rifts thousands of miles long on the seafloor, driven by plate tectonics. As subduction pulls old crust down, rifts open up like fissures in an effusive volcano, drawing hot crust toward the surface. Once at the surface, the crust begins to cool and gets pushed away from the rift, replaced by hotter, younger crust.

This cycle is called seafloor spreading, and its rate shapes many global processes, including sea level and the carbon cycle. Faster rates tend to cause more volcanic activity, which releases greenhouse gases, so deciphering spreading rates helps contextualize long-term changes in the atmosphere.

Today, spreading rates top out around 140 millimeters per year, but peaked around 200 millimeters per year just 15 million years ago in some places, according to the new study. The study was published in the AGU journal Geophysical Research Letters, which publishes high-impact, short-format reports with immediate implications spanning all Earth and space sciences.

The slowdown is a global average, the result of varying spreading rates from ridge to ridge. The study examined 18 ridges, but took a particularly close look at the eastern Pacific, home to some of the globe’s fastest spreading ridges. Because these slowed greatly, some spreading nearly 100 millimeters per year slower than they did 19 million years ago, they dragged down the world’s average spreading rates.

It's a complex problem to solve, made more difficult by the seafloor’s slow and steady self-destruction.

“We know more about the surfaces of some other planets than we do our own seafloor,” said Colleen Dalton, a geophysicist at Brown University who led the new study. “One of the challenges is the lack of perfect preservation. The seafloor is destroyed, so we’re left with an incomplete record.”

The seafloor is destroyed in subduction zones, where oceanic crust slides under continents and sinks back into the mantle, and is reforged at seafloor spreading ridges. This cycle of creation and destruction takes about 180 million years, the age of the oldest seafloor. The crust’s magnetic record tracks this pattern, producing identifiable stripes every time the Earth’s magnetic field reverses.

Dalton and her co-authors studied magnetic records for 18 of the world’s largest spreading ridges, using seafloor ages and their areas to calculate how much ocean crust each ridge has produced over the last 19 million years. Each ridge evolved a little differently: some lengthened, some shrank; some sped up, but almost all slowed down. The overall result of Dalton’s work is that average seafloor spreading slowed down by as much as 40% over that time.
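The bookkeeping behind such estimates can be sketched in a few lines: the area of crust created between two dated magnetic isochrons, divided by ridge length and elapsed time, yields an average spreading rate for that interval. The numbers below are hypothetical, not the study’s data.

```python
# Minimal sketch of the area-age bookkeeping described above. Dividing the
# crust area created between two magnetic isochrons (both flanks combined)
# by ridge length and elapsed time gives an average full spreading rate.
# All numbers are hypothetical, not from the paper.
bands = [  # (younger age, older age, crust area) in Ma, Ma, km^2
    (0.0, 5.0, 5.0e5),
    (5.0, 10.0, 5.5e5),
    (10.0, 15.0, 7.0e5),
    (15.0, 19.0, 6.4e5),
]
ridge_length_km = 1000.0

for t_young, t_old, area_km2 in bands:
    dt_myr = t_old - t_young
    # km^2 / (km * Myr) = km/Myr, and 1 km/Myr equals 1 mm/yr.
    rate_mm_yr = area_km2 / (ridge_length_km * dt_myr)
    print(f"{t_young:4.0f}-{t_old:<4.0f} Ma: {rate_mm_yr:.0f} mm/yr")
```

In this toy ridge the older intervals yield faster rates than the youngest one, the same signature of a slowdown that the study reports at a global scale.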

The driver here might be located at subduction zones rather than spreading ridges: for example, as the Andes grow along the western edge of the South American continent, the mountains push down on the crust.

“Think of it as increased friction between the two colliding tectonic plates,” Dalton said. “A slowdown in convergence there could ultimately cause a slowdown in spreading at nearby ridges.”  A similar process could have operated underneath the Himalaya, with the rapidly growing range slowing spreading along the ridges in the Indian Ocean.

However, Dalton points out, this added friction can’t be the only driver of the slowdown, because she found slowing rates globally and mountain growth is regional. Larger-scale processes, like changes in mantle convection, could also be playing a role. In all likelihood, she concludes, it’s a combination of both. To learn more, Dalton hopes to collect absolute plate speeds, rather than the relative speeds used in this study, which will better allow her to determine the cause of the slowdown.

###

AGU press contact:
Rebecca Dzombak, +1 (202) 777-7492, news@agu.org (UTC-4 hours)

Contact information for the researchers:
Colleen Dalton, Brown University, colleen_dalton@brown.edu (UTC-4 hours)

AGU (www.agu.org) supports 130,000 enthusiasts to experts worldwide in Earth and space sciences. Through broad and inclusive partnerships, we advance discovery and solution science that accelerate knowledge and create solutions that are ethical, unbiased and respectful of communities and their values. Our programs include serving as a scholarly publisher, convening virtual and in-person events and providing career support. We live our values in everything we do, such as our net zero energy renovated building in Washington, D.C. and our Ethics and Equity Center, which fosters a diverse and inclusive geoscience community to ensure responsible conduct.

*****

Notes for Journalists:
This research study is freely available until the end of the month. Download a PDF copy of the paper here. Neither the paper nor this press release is under embargo.

Paper title:

“Evidence for a Global Slowdown in Seafloor Spreading Since 15 Ma”

Author information:      

  • Colleen Dalton (corresponding author), Timothy Herbert, Department of Earth, Environmental, and Planetary Sciences, Brown University, Providence, RI, USA
  • Douglas S. Wilson, Marine Science Institute, University of California-Santa Barbara, CA, USA

New research highlights the role of green spaces in conflict

Peer-Reviewed Publication

UNIVERSITY OF BRITISH COLUMBIA


IMAGE: UBC LANDSCAPE ARCHITECTURE PROFESSOR FIONN BYRNE

CREDIT: LOU CORPUZ-BOSSHART/UBC

Green spaces can promote well-being, but they may not always be benign. Sometimes, they can be a tool for control.

That’s the gist of a new paper that analyzed declassified U.S. military documents to explore how the U.S. forces used landscapes to fight insurgency during the war in Afghanistan.

Author Fionn Byrne, an assistant professor at UBC’s school of architecture and landscape architecture, focused on four projects that ranged in scale from individual tree plantings to large-scale reforestation efforts. Funds for each project came through the Commander’s Emergency Response Program, a multibillion-dollar program designed to win over the hearts and minds of the Afghan people.

“Previous research by others shows that exposure to trees has measurable positive impacts on physical and mental health,” said Byrne. “These gains in overall health are linked to a more peaceful society. Therefore, I argue that trees, and green spaces in general, can be considered a noncoercive mode of warfare. They can further social cohesion and diminish the likelihood of insurgency.” 

For example, in the project Route Francine Green Space, the U.S. military improved a site adjacent to a road in Kandahar Province by planting trees and building playgrounds and other amenities. Route Francine is part of a district that had a high rate of IED detonations, so not only did the project beautify the landscape, but it also helped garner support for the local government and reduced instability in the region.

Alternatively, the Panjshir Valley Green Belt project created jobs for residents by replanting 35,000 trees. Research already shows us that a new forest can influence the mental condition of an entire population, with many individuals gaining from being exposed to nature. A landscape intervention of this type is thus an instance of population-wide psychological modification.

Byrne adds that the paper highlights a gap in current scholarship. Most research has emphasized the effects of war on the landscape rather than investigating how the landscape itself is mobilized as a warfighting tool. Even when researchers have studied how the landscape has been used as a weapon, they have focused on large-scale and destructive manipulation of the environment to achieve direct military objectives. He cited a recent piece in the New York Times that follows this pattern.

“War is rightly associated with death, so, when we see images of U.S. forces planting trees and fostering new life, it is worth looking at this closely,” said Byrne. “We need to study further how militaries have used landscape design in more subversive modes, distinct from an overt weaponization of the environment. This paper demonstrates that using tree planting to impact mental health is a nonviolent, subtle and potentially unchallenged pathway to subdue resistance from a local population.”

He added that this research can provide a lens to study the landscape changes of past wars. It can also help us understand that the landscape remains implicated in many conflicts, including the ongoing effects of colonization and other territorial struggles. Further research will need to examine the specific legacy impacts of past landscape changes.

“Though it is beyond the scope of this paper, I can add that landscape architects need to understand better the role of the profession in, for example, tree-planting efforts. I hope my research makes us question the benign good of tree planting and reminds us that green spaces are neither neutral nor apolitical.”

Interview language(s): English

Huge Amazon swamp carbon stores under threat, study says

The largest peatlands in the Amazon rainforest, which hold a vast, concentrated amount of carbon, are under increasing threat from changing land use, research suggests.

Peer-Reviewed Publication

UNIVERSITY OF EDINBURGH

Palm swamp in lowland Peruvian Amazonia (LPA) 

IMAGE: PALM SWAMP IN LOWLAND PERUVIAN AMAZONIA (LPA)

CREDIT: IAN T. LAWSON, UNIVERSITY OF ST ANDREWS, UK.

Urgent protection is needed to prevent carbon emissions from decomposing peat swamps in lowland Peruvian Amazonia (LPA), which are bigger than previously thought.

Scientists discovered small but growing areas of deforestation across the LPA between 2000 and 2016, including an 11-fold increase in CO2 emissions linked to mining.

The research, led by the Universities of Edinburgh and St Andrews, used field, satellite and land-cover data to estimate harmful greenhouse gas emissions, develop maps and create the first data-driven peat thickness models of Peru’s tropical peatlands.

Field teams including scientists from Peru’s Instituto de Investigaciones de la Amazonía Peruana, the University of Leeds and other collaborating institutions mapped new stretches of peat swamps and estimated the distribution of peat across Peruvian Amazonia for the first time.

At 62,714 km² – an area approximately the size of Sri Lanka – the peatlands contain twice as much carbon as previously estimated.

Peat in the LPA stores around 5.4 billion tonnes of carbon, which is almost as much as all of Peru’s forests but in just five percent of its land area, showing how valuable a resource these peatlands are, experts say.
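The arithmetic behind a stock estimate of this kind is straightforward to sketch: area times mean peat thickness times dry bulk density times carbon fraction. In the toy calculation below only the mapped area is taken from the study; the other parameters are illustrative assumptions, not the paper’s measured values.

```python
# Back-of-envelope peatland carbon stock: area x thickness x bulk density
# x carbon fraction. Only the area comes from the study; thickness, bulk
# density and carbon fraction are illustrative assumptions.
area_km2 = 62_714            # mapped LPA peatland extent (from the study)
mean_thickness_m = 2.0       # assumed mean peat thickness
bulk_density_t_m3 = 0.09     # assumed dry bulk density (t/m^3 = g/cm^3)
carbon_fraction = 0.48       # assumed carbon content of dry peat

volume_m3 = area_km2 * 1e6 * mean_thickness_m
carbon_gt = volume_m3 * bulk_density_t_m3 * carbon_fraction / 1e9
print(f"carbon stock ~ {carbon_gt:.1f} billion tonnes")  # ~5.4 with these inputs
```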

Tropical peatlands are among the most carbon-dense ecosystems in the world, but agricultural expansion, infrastructure development and mining have led to the loss of large peatland areas.

Deforestation and drainage inhibit the accumulation of essential organic matter in the swamps and promote rapid decomposition of peat, which in turn releases large quantities of carbon dioxide and nitrous oxide into the atmosphere.

Drained peatlands are also prone to fires which can lead to a large and rapid increase of emissions.

In recognition of these threats, Peru has passed legislation which, for the first time, mandates the explicit protection of its peatlands for climate-change mitigation.

Enforcing this legislation will depend on continued mapping of peatland distribution and upon further investigation of its carbon storage.

Dr Adam Hastie, Postdoctoral Researcher from the School of GeoSciences, who led the study, said: “We knew that Peru contained substantial peatlands but we previously only had ground data from a few regions, and we didn't realise how extensive the peatlands were.

“Our high-resolution maps can be used to directly inform conservation and climate mitigation policies and actions such as Nationally Determined Contributions to the Paris Agreement, to avoid further degradation and CO2 emissions.”

Dr Ian Lawson, Senior Lecturer from the University of St Andrews, who led the international team, said: “Peatlands are increasingly recognized as carbon hotspots and a key component of the planet’s carbon cycle. They store half of all the soil carbon on the planet, but they’re vulnerable to human pressures. It’s important for all of us that we know where they are so that we can protect them and help to mitigate climate change.

“This work is the latest result of more than a decade of sustained international collaboration. It has taken a lot of effort by the team, making measurements and collecting samples throughout the swamp forests, to produce this first map of peatlands covering all of Peru’s Amazonian region. The next step is to apply the same methods in other parts of the Amazon Basin. There’s still a lot to be learned.”

Dr Dennis del Castillo Torres, from the Instituto de Investigaciones de la Amazonia Peruana and project partner of the study, said: “Our peatlands in Peru have the potential to mitigate climate change because the sustainable use of the most abundant peatland palm species, Mauritia flexuosa, can be promoted.”

Dr Euridice Honorio Coronado, NERC Knowledge Exchange Fellow at the University of St Andrews and co-author, added: “Conserving peatlands will also support livelihoods and prevent a situation like South-East Asia, where almost 80 per cent of peatlands have been cleared and drained.”

The study, published in Nature Geoscience, was funded by NERC, the Leverhulme Trust, the Gordon and Betty Moore Foundation, the Wildlife Conservation Society, Concytec/British Council/Embajada Británica, Lima/Newton Fund, the governments of the United States of America and Norway, and a Knowledge Exchange Fellowship.

The team thanked SERNANP, SERFOR and GERFOR for providing research permits, and the indigenous and local communities, research stations and tourist companies for giving consent and allowing access to the forests.

For further information, please contact: Rhona Crawford, Press and PR Office, 07876391498 rhona.crawford@ed.ac.uk