Friday, January 28, 2022

Tackling PPE waste: Engineers propose sustainable recycling method

Peer-Reviewed Publication

CORNELL UNIVERSITY

ITHACA, N.Y. – Under the intensity of a prolonged pandemic, the world finds an ever-growing and seemingly never-ending waste stream of used surgical masks, plastic face shields, and medical gloves and gowns. Cornell University engineers now offer a solution to sustainably reroute the discarded material.

A medium-temperature reaction called pyrolysis can break the plasticized medical-protection garb back down into its original constituents – chemicals and petroleum products – which can then be recycled, perhaps into fuels, according to a new study.

The method involves no incineration or landfill use.

Xiang Zhao, a doctoral student, working with his advisor Fengqi You, professor in energy systems engineering, published the proposed technology framework in the journal Renewable and Sustainable Energy Reviews.

Their framework – first focusing on New York state – proposes collecting waste PPE from hospitals and medical centers, and then sending it to pre-processing and decontamination facilities in New York or Suffolk counties. There, it would be shredded, sterilized and dehydrated to become small particles, and then brought to an integrated pyrolysis plant, like one contemplated for Rockland County, north of New York City.

In the case of You and Zhao’s model, the medium-temperature pyrolysis (about 1,200 degrees Fahrenheit) can deconstruct the plasticized gowns and gloves, which are derived from petroleum, into chemicals such as ethylene, butane, gasoline, propene, propane, diesel, light naphtha and sulfur.

“For an analogy, pyrolysis is similar to baking in an oven,” said You, a senior faculty fellow at the Cornell Atkinson Center for Sustainability. “If you set the oven temperature very high, your meat becomes a chunk of charcoal. But if you use a lower oven temperature, the meat is going to be juicy. In pyrolysis, the temperature is the trick.”

Health care facilities around the world are creating about 7.5 pounds of PPE waste per person daily through COVID-19-associated services, according to the United Nations Environment Programme.

To get a sense of the enormity of the disposal dilemma, one hospital with 300 medical personnel could generate more than a ton of medical garb waste daily. That translates to more than 400 tons of annual medical PPE waste in a single COVID-handling facility, You said.
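To make those figures concrete, here is a quick back-of-the-envelope check in Python (a sketch for illustration only; the 7.5-pound UNEP figure and the 300-person staffing example come from the article, the rest is simple arithmetic):

```python
# Back-of-the-envelope PPE waste estimate using the figures quoted above.
PPE_LBS_PER_PERSON_PER_DAY = 7.5   # UNEP estimate for COVID-19-related care
STAFF = 300                        # hospital staffing example from the article
LBS_PER_TON = 2000                 # US (short) ton

daily_tons = STAFF * PPE_LBS_PER_PERSON_PER_DAY / LBS_PER_TON
annual_tons = daily_tons * 365

print(f"Daily PPE waste:  {daily_tons:.2f} tons")   # ~1.13 tons/day
print(f"Annual PPE waste: {annual_tons:.0f} tons")  # ~411 tons/year
```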

In the paper’s energy analysis and environmental lifecycle assessment, the proposed optimal PPE processing system avoids 41.52% of landfilling and 47.64% of incineration. The method reduces total greenhouse gas emissions by 35.42% compared with conventional incineration and uses 43.5% less energy than landfilling, the researchers said.

“This is a viable strategy for disposing of and processing waste PPE,” You said. “It is a treatment method with low greenhouse gas emissions, it alleviates fossil fuel depletion and it saves a lot of polluting material from landfills.”

Funding for this research was provided by Cornell’s David M. Einhorn Center for Community Engagement and Cornell Atkinson.

Population Council awarded NIH Center Grant to develop first non-hormonal vaginal ring for pregnancy and STI prevention


Grant and Award Announcement

POPULATION COUNCIL

The Population Council’s Center for Biomedical Research has been awarded an $11 million P50 Clinical Research Center Grant from the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) of the National Institutes of Health (NIH). Queen’s University Belfast and Weill Cornell Medical College will partner with the Council on this grant over the next five years.

The grant will spur research and development of a novel non-hormonal contraceptive multi-purpose technology (MPT) vaginal ring that will combat the overlapping burdens of unintended pregnancy and sexually transmitted infections, including HIV. The product will fill a critical gap in reproductive healthcare, responding to women’s evolving preferences and reproductive needs.

“This single product has potential to address a wide range of sexual and reproductive needs including protection against sexually transmitted infections, contraception, and support of vaginal health,” said Lisa Haddad, MD, MPH, Council Medical Director and the Principal Investigator on the grant. “Women need more options to manage their changing sexual and reproductive health needs. The non-hormonal MPT ring offers hope of an important new contraceptive option that could provide women with protection from the growing risk of STIs.”

NON-HORMONAL OPTION

Many women want to avoid hormonal methods and their associated side effects. A non-hormonal method would allow women to maintain a regular menstrual cycle, without loss of menstruation (amenorrhea) or unscheduled and irregular bleeding. Currently available non-hormonal options are limited to gels or long-acting methods that require a clinician to insert, such as the copper IUD.

PROTECTION AGAINST STIS AND HIV

Sexually transmitted infections are on the rise worldwide, with more than 1 million new cases every day. In 2019, approximately 1.7 million people became newly infected with HIV globally. Increasing antibiotic resistance is making STIs more difficult for healthcare providers to treat; treating them currently costs more than $2 billion annually. Despite national efforts to reduce STI transmission, the US experienced steep, sustained increases for five years, reporting more than 1.7 million cases of chlamydia and 555,000 cases of gonorrhea in 2018.

STIs pose broad-reaching risks for women and their children, including the increased risk of HIV acquisition and transmission, chronic pelvic pain, infertility, and preterm delivery—the leading cause of infant morbidity and mortality.

Recent data indicate that women overwhelmingly prefer, and are more likely to use, contraception that prevents both pregnancy and STIs/HIV. To build further understanding of women’s preferences, the grant will also fund behavioral and acceptability studies alongside biomedical research on formulation and testing.

“The goal is to have the product ready for clinical trials at the end of the five-year grant. Along the way, we will generate new data about vaginal ring acceptability and the factors that increase acceptability and adherence to vaginal rings,” said Dr. Haddad.

Learn more about the novel non-hormonal contraceptive MPT vaginal ring.

NYU Tandon cybersecurity expert wins NSF CAREER Award for improving software vulnerability testing & education

Brendan Dolan-Gavitt is laying the groundwork for more efficient, less costly vulnerability testing.

Grant and Award Announcement

NYU TANDON SCHOOL OF ENGINEERING

IMAGE: Brendan Dolan-Gavitt, assistant professor in the Department of Computer Science and Engineering.

CREDIT: NYU Tandon

BROOKLYN, New York, Thursday, January 27, 2022 — The National Science Foundation (NSF) has selected an NYU Tandon School of Engineering researcher to receive its most prestigious award for promising young academics. The researcher is developing better ways to assess vulnerability discovery tools, allowing cybersecurity professionals to better understand which techniques are most effective and ultimately leading to safer software.

Brendan Dolan-Gavitt, an assistant professor in the Department of Computer Science and Engineering and a faculty member of NYU’s Center for Cybersecurity, received a 2022 NSF Faculty Early Career Development Award, more widely known as a CAREER Award, which supports early-career faculty who have the potential to serve as academic role models in research and education.

A five-year, $500,000 grant will support a project that aims to create techniques for automatically generating benchmark corpora of software vulnerabilities that can be used to rigorously assess newly developed and existing tools used to root out dangerous programming bugs.

Software vulnerabilities pose a major threat to the safety and security of computer systems, and while there is a large body of research on how to find vulnerabilities in programs, the large, empirically tested corpora of vulnerabilities required to rigorously test that research are difficult and expensive to assemble. 

Although researchers have discovered ways to automatically generate vulnerabilities and inject them into software, the vulnerabilities created this way are unrealistic (containing artifacts that make them easier to discover than real vulnerabilities inadvertently introduced by human programmers) and insufficiently varied.

Dolan-Gavitt intends to address those shortcomings by employing large language models trained on code to synthesize vulnerabilities that are both realistic and diverse, placing vulnerabilities in hard-to-discover paths, allowing new vulnerability classes to be added quickly with a customized domain-specific language, and automatically generating exploits for each vulnerability. The end result will be a limitless supply of highly realistic vulnerability corpora that can be generated cheaply, at scale, and on-demand, giving researchers valuable benchmarks in measuring the efficacy of their cybersecurity tools.  
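The release describes the approach only at a high level, but a toy sketch can convey the underlying idea of trigger-guarded bug injection that benchmark-generation systems build on. Everything below (the function names, the magic-value trigger, the unsafe eval) is a hypothetical illustration, not Dolan-Gavitt’s actual method:

```python
import ast

def inject_vulnerability(source: str, func_name: str) -> str:
    """Toy bug injection: plant a guarded unsafe call at the top of a target
    function so the bug only fires on a magic input value. Real benchmark
    generators synthesize far more realistic and varied vulnerabilities."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == func_name:
            # Hidden, input-triggered bug: only fires on the magic value.
            trigger = ast.parse(
                'if data == "0xdeadbeef":\n    eval(payload)'
            ).body[0]
            node.body.insert(0, trigger)
    return ast.unparse(tree)  # requires Python 3.9+

original = '''
def handle_request(data, payload):
    return data.upper()
'''
print(inject_vulnerability(original, "handle_request"))
```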

In addition to his work’s benefit to cybersecurity researchers and industry professionals, it is also expected to be a boon to educators. Since joining NYU Tandon in 2015, Dolan-Gavitt has been involved in CSAW, the most comprehensive student-run cybersecurity event in the world, and among the most popular offerings at the annual event is a “capture the flag” competition that challenges students to find vulnerabilities in a software program. “These types of competitions are an extremely popular and effective means of teaching a variety of cybersecurity skills, but they require large amounts of time, money, and expertise to create and manage,” he explains. “If the creation of the challenges can be partially or wholly automated, it could bring new educational opportunities within reach of a broader and more diverse population of students by dramatically lowering costs and reducing the time and effort needed.” 

“Brendan Dolan-Gavitt is helping place the field of vulnerability finding on solid scientific footing, allowing for repeatable and reproducible experiments and facilitating comparative evaluations of the cyber tools meant to protect us,” said NYU Tandon Dean Jelena Kovačević. “His work has the potential to make a major impact on cybersecurity education, broadening access and helping to build the next generation of security researchers. We’re proud that his techniques will be employed right here in our own cybersecurity courses and at CSAW and pleased that the NSF has chosen him to receive this much-deserved CAREER Award.”

Dolan-Gavitt joins the more than 50% of NYU Tandon engineering junior faculty members who hold CAREER Awards or similar young-investigator honors, including 10 awarded since 2019 alone.

His award reflects the NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

Factors Associated With Opioid Overdose After an Initial Opioid Prescription

Key Points

Question  What factors are associated with an increased risk for opioid overdose after the initial opioid prescription to a previously opioid-naive individual?

Findings  In this cohort study of 236 921 individuals who received a first opioid prescription, 667 experienced an incident opioid overdose. Patient risk factors included being aged 75 years or older, being male, receiving Medicaid or Medicare Advantage coverage, having a comorbid substance use disorder or depression, and having medical comorbidities. Prescription-related risk factors included an initial prescription of oxycodone or tramadol, concurrent use of benzodiazepines, and filling opioid prescriptions from 3 or more pharmacies.

Meaning  Findings from this study suggest that several patient- and prescription-related risk factors are associated with opioid overdose; prescribers, researchers, policy makers, and insurers can apply this information to guide opioid counseling and monitoring, develop clinical decision-making tools, and provide additional opioid prevention and treatment resources to individuals who are at greatest risk for opioid overdose.

Abstract

Importance  The opioid epidemic continues to be a public health crisis in the US.

Objective  To assess the patient factors and early time-varying prescription-related factors associated with opioid-related fatal or nonfatal overdose.

Design, Setting, and Participants  This cohort study evaluated opioid-naive adult patients in Oregon using data from the Oregon Comprehensive Opioid Risk Registry, which links all payer claims data to other health data sets in the state of Oregon. The observational, population-based sample filled a first (index) opioid prescription in 2015 and was followed up until December 31, 2018. Data analyses were performed from March 1, 2020, to June 15, 2021.

Exposures  Overdose after the index opioid prescription.

Main Outcomes and Measures  The outcome was an overdose event. The sample was followed up to identify fatal or nonfatal opioid overdoses. Patient and prescription characteristics were identified. Prescription characteristics in the first 6 months after the index prescription were modeled as cumulative, time-dependent measures that were updated monthly through the sixth month of follow-up. A time-dependent Cox proportional hazards regression model was used to assess patient and prescription characteristics that were associated with an increased risk for overdose events.
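For readers unfamiliar with the design, a time-dependent Cox model of this kind can be fit in Python with the lifelines library. The sketch below is purely illustrative: the toy rows and column names are hypothetical, not the registry data used in the study:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long-format data: one row per patient per follow-up interval, with
# time-varying prescription exposures updated as the intervals advance.
df = pd.DataFrame({
    "id":        [1, 1, 2, 3, 3, 4, 5, 6],
    "start":     [0, 30, 0, 0, 30, 0, 0, 0],   # interval start (days)
    "stop":      [30, 60, 45, 30, 60, 50, 60, 40],
    "benzo_use": [0, 1, 1, 0, 0, 1, 0, 1],     # time-varying covariate
    "age_75up":  [0, 0, 0, 1, 1, 1, 0, 1],     # fixed covariate
    "event":     [0, 1, 0, 0, 1, 1, 0, 0],     # overdose in this interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # adjusted hazard ratios with 95% CIs
```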

Results  The cohort comprised 236 921 patients (133 839 women [56.5%]), of whom 667 (0.3%) experienced opioid overdose. Risk of overdose was highest among individuals 75 years or older (adjusted hazard ratio [aHR], 3.22; 95% CI, 1.94-5.36) compared with those aged 35 to 44 years; men (aHR, 1.29; 95% CI, 1.10-1.51); those who were dually eligible for Medicaid and Medicare Advantage (aHR, 4.37; 95% CI, 3.09-6.18), had Medicaid (aHR, 3.77; 95% CI, 2.97-4.80), or had Medicare Advantage (aHR, 2.18; 95% CI, 1.44-3.31) compared with those with commercial insurance; those with comorbid substance use disorder (aHR, 2.74; 95% CI, 2.15-3.50), with depression (aHR, 1.26; 95% CI, 1.03-1.55), or with 1 to 2 comorbidities (aHR, 1.32; 95% CI, 1.08-1.62) or 3 or more comorbidities (aHR, 1.90; 95% CI, 1.42-2.53) compared with none. Patients were at an increased overdose risk if they filled oxycodone (aHR, 1.70; 95% CI, 1.04-2.77) or tramadol (aHR, 2.80; 95% CI, 1.34-5.84) compared with codeine; used benzodiazepines (aHR, 1.06; 95% CI, 1.01-1.11); used concurrent opioids and benzodiazepines (aHR, 2.11; 95% CI, 1.70-2.62); or filled opioids from 3 or more pharmacies over 6 months (aHR, 1.38; 95% CI, 1.09-1.75).

Conclusions and Relevance  This cohort study used a comprehensive data set to identify patient and prescription-related risk factors that were associated with opioid overdose. These findings may guide opioid counseling and monitoring, the development of clinical decision-making tools, and opioid prevention and treatment resources for individuals who are at greatest risk for opioid overdose.

Introduction

Opioid medications remain a mainstay of treatment of severe pain. In the setting of the modern opioid overdose and death epidemic, use of such medications has decreased, but there were still 168.9 million opioid prescriptions in the US in 2018.1 Each prescription of an opioid to a previously opioid-naive patient creates the potential for the development of chronic opioid use and opioid use disorder.2 For this reason, multiple entities and states have produced opioid prescribing guidelines, such as the Centers for Disease Control and Prevention Guideline for Prescribing Opioids for Chronic Pain in 2016 and the Oregon Acute Opioid Prescribing Guidelines in 2018.3,4 The surgical literature has declared opioid dependence (along with use disorder and overdose) to be a never-event5 and the most common surgical complication,6 affecting approximately 5% to 7% of patients who start a new episode of opioid use.7

An association exists between the characteristics of a patient’s first opioid prescription and long-term use. Shah et al8 discovered that the number of days’ supply of the initial prescription was directly associated with the development of long-term use. Deyo et al9 found that greater morphine milligram equivalents (MMEs) of the initial prescription were associated with increased likelihood of developing long-term use. Previous work with the Ohio prescription drug monitoring program (PDMP) database found that different prescriber specialties had different rates of long-term use by patients10 likely because the indications of opioid use and underlying patient factors create different risks for long-term use.

Although chronic opioid use is an undesirable outcome, the most substantial harms from a new episode of prescription opioid use are overdose and death. Unfortunately, the ability to link prescribing decisions to specific outcomes has been hampered by data source limitations. Prescription drug monitoring programs describe the prescription filling patterns of patients with long-term use but do not contain data about drug indications, comorbid conditions, the patient’s environment, or intervening outcomes, such as opioid use disorder or overdose. Administrative claim files are useful because they capture prescriptions that are covered by insurance but may be incomplete if the prescriptions are paid in cash or if patients change insurers. In addition, the most serious outcome of interest, opioid-related death, is often found not in PDMP or insurance data but in vital records.

To address data source limitations and provide a more comprehensive analysis of risk factors after the initiation of opioid therapy, we combined claims data with several public health data sets (including All Payer All Claims Data [APCD], vital records, PDMP, and hospital discharge data) to create the Oregon Comprehensive Opioid Risk Registry.11 Using the Comprehensive Opioid Risk Registry, we constructed a retrospective cohort of opioid-naive patients who received an initial opioid prescription to assess the patient factors and early time-varying prescription-related factors associated with opioid-related fatal or nonfatal overdose. We hypothesized that some patient factors (eg, insurance type and high disease burden) and prescription factors in the first 6 months after opioid initiation (eg, high doses, and opioid and benzodiazepine overlap) are associated with increased risk of opioid overdose.

READ ON

Factors Associated With Opioid Overdose After an Initial Opioid Prescription | Addiction Medicine | JAMA Network Open | JAMA Network


Most physicians paid by volume, despite push for quality and value


Study examines physicians in group practices owned by health systems


Peer-Reviewed Publication

RAND CORPORATION

Despite efforts by insurance companies and other payers to move toward compensating physicians based on the quality and value of care they provide, most physicians employed in group practices owned by health systems are paid primarily based on the volume of care they provide, according to a new study by RAND Corporation researchers.

 

Examining a wide range of medical practices owned by health systems, researchers found that volume-based compensation was the most common type of base pay for more than 80% of primary care physicians and for more than 90% of physician specialists.

 

While financial incentives for quality and cost performance were commonly used by health systems, the percentage of total physician compensation based on quality and cost was modest: 9% for primary care providers and 5% for specialists.

 

The findings were published in the journal JAMA Health Forum.

 

“Despite growth in value-based programs and the need to improve value in health care, physician compensation arrangements in health systems do not currently emphasize value,” said Rachel O. Reid, the study’s lead author and a physician policy researcher at RAND, a nonprofit research organization. “The payment systems that are most-often in place are designed to maximize health system revenue by incentivizing providers within the system to deliver more services.”

 

In recent years, both private and public payers have adopted payment reforms that seek to encourage health care providers to improve the quality of care delivered and slow spending growth in an effort to generate better value for patients. At the same time, the size of health systems and their employment of physicians has increased markedly.

 

To examine whether the compensation structure for physicians resembled the payment reforms focused on value, the study examined the physician payment structures used in 31 physician organizations affiliated with 22 health systems located in four states.

 

Researchers interviewed physician organization leaders, reviewed compensation documents, and surveyed the physician practices to characterize the compensation arrangements of primary care and specialist physicians.

 

Increasing the volume of services delivered was the most commonly reported action that physicians can take to increase their compensation, with 70% of the practices following such a plan. In these cases, volume-based incentives accounted for more than two-thirds of compensation.

 

Performance-based financial incentives for value-oriented goals, such as clinical quality, cost, patient experience and access to care, were commonly included in compensation. But those payments represented only a small fraction of total compensation for primary care physicians and specialists, and are thus likely to only marginally affect physician behavior.

 

Instead, 70% of physician organization leaders noted that increasing the volume of services delivered is the top action that primary care and specialist physicians could take to increase their compensation.

 

“For the U.S. health care system to truly realize the potential of value-based payment reform and deliver better value for patients, health systems and provider organizations will likely need to evolve the way that frontline physicians are paid to better align with value,” Reid said. 

 

The study was conducted through the RAND Center of Excellence for Health Care Performance with funding provided by the Agency for Healthcare Research and Quality.

 

Other authors of the study are Ashlyn K. Tom, Rachel M. Ross, Erin L. Duffy and Cheryl L. Damberg.

 

RAND Health Care promotes healthier societies by improving health care systems in the United States and other countries.

 

NASA planes fly into snowstorms to study snowfall


Business Announcement

NASA/GODDARD SPACE FLIGHT CENTER

IMAGE: On Jan. 4, 2022, the MODIS instrument aboard NASA’s Terra satellite captured this image of snowfall after a large storm dumped wet, heavy snow across the Mid-Atlantic region of the United States. Some areas accumulated over 14 inches, shutting down businesses, schools, and interstate highways.

CREDIT: NASA

Scientists repeatedly check the weather forecasts as they prepare aircraft for flight and perform last-minute checks on science instruments. There’s a large winter storm rolling in, but that’s exactly what these storm-chasing scientists are hoping for.

The team is tracking storms across the Midwest and Eastern United States with two NASA planes, one flying above the storm and one within the clouds. Each is equipped with a suite of scientific instruments to collect data about snow particles and the conditions in which they form, helping scientists understand the inner workings of winter storms as they develop. The experiments are part of the second deployment of NASA’s Investigation of Microphysics and Precipitation for Atlantic Coast-Threatening Storms (IMPACTS) mission, which began in January and is planned to wrap up at the end of February.

This data will help the team relate properties of the snow particles and their environment to large-scale processes – such as the structure of clouds and precipitation patterns – that can be seen with remote sensing instruments on aircraft and satellites. Ultimately, what the IMPACTS team learns about snowstorms will improve meteorological models and our ability to use satellite data to predict how much snow will fall and where.

Surveying a Variety of Storms

Storms often form narrow structures called snow bands, said Lynn McMurdie, principal investigator for IMPACTS and an atmospheric scientist at the University of Washington in Seattle. One of the main goals of IMPACTS is to understand how these structures form, why some storms don’t have snow bands, and how snow bands can be used to predict snowfall. To do this, the team hopes to sample a wide variety of storms throughout the three-year IMPACTS campaign.

During the 2020 IMPACTS campaign, the team sampled a variety of storms in the Midwest and East Coast, including warmer rainstorms and storms with strong cold fronts and convection. But McMurdie says the team didn’t see a Nor’easter, a storm with a strong low-pressure system that moves up the New England coast and mixes moisture from the Atlantic Ocean with cold air from Canada. 

Nor’easters come up the East Coast and can dump several feet of snow, effectively shutting down cities, said John Yorks, one of the deputy principal investigators for IMPACTS at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Being better able to predict where these storms will bring snow and how much could help cities better prepare for severe winter weather.

Above, Below and Into the Clouds

NASA and its partners have several satellites that measure precipitation from space, such as the Global Precipitation Measurement mission that observes rain and snow around most of the world every three hours. “But satellites can’t tell us a lot about the particles – the actual snowflakes – and where they form within the clouds,” said Gerry Heymsfield, one of the deputy principal investigators for IMPACTS at Goddard. IMPACTS is run out of NASA’s Wallops Flight Facility in Virginia, which is managed by Goddard.

Instead, IMPACTS is flying two aircraft outfitted with scientific instruments. The NASA Armstrong Flight Research Center’s ER-2, a high-altitude jet flying out of the Pope Army Airfield near Fayetteville, North Carolina, will fly at about 65,000 feet to get a top-down view from above the clouds. The instruments aboard the ER-2 are similar to those on satellites but with higher spatial resolution, additional measurement capabilities and more frequent sampling. Scientists on the ground are also measuring cloud properties from below using ground-based radars.

“A project like IMPACTS can really complement those spacecraft measurements with aircraft measurements that are higher resolution, higher accuracy, sample an event more frequently, and provide additional parameters such as Doppler measurements,” said Yorks.

The other aircraft, the P-3 Orion based out of Wallops, flies at altitudes up to 26,000 feet. Probes hanging off the P-3’s wings measure the size, shape and distribution of precipitation particles. Flying the P-3 at different altitudes allows the team to measure snow particles throughout the cloud, and the temperature, water vapor, and other conditions in which they form.

The P-3 also drops small instruments, called dropsondes, over the ocean. These instruments work like weather balloons in reverse, measuring temperature, wind and humidity in the atmosphere as they fall. The team is also launching weather balloons every few hours as the storm passes overhead from several sites that move depending on which storm the team is studying. The data collected by the dropsondes and weather balloons provide information about the atmospheric conditions before, during and after the storm.

“Snowstorms are really complicated storms, and we need every piece of data – models, aircraft instruments, meteorological soundings – to really figure out what’s going on within these storms,” said Heymsfield. 

The multi-year IMPACTS campaign is the first comprehensive study of snowstorms across the Eastern United States in 30 years. The science team includes researchers from NASA, several universities across the country, the National Center for Atmospheric Research, and NOAA, including partners at the National Weather Service.

To learn more about the mission, visit: https://espo.nasa.gov/impacts/content/IMPACTS

The future of US corn, soybean and wheat production depends on sustainable groundwater use


Peer-Reviewed Publication

DARTMOUTH COLLEGE

IMAGE: Spatial distribution of production loss by agricultural district from sustainable groundwater scenarios with varying recharge rates for corn (top map), soybean (middle map), and winter wheat (bottom map). Red, orange, yellow, and chartreuse dots show production lost by agricultural district for sustainable groundwater use scenarios based on 100% recharge, 75%–100% recharge, 50%–75% recharge, and 25%–50% recharge, respectively. Green dots show sustainable production.

CREDIT: Figure by J. Lopez et al. and J. Chipman.

In the U.S., 52% of irrigated land is used for corn, soybean and winter wheat production. Corn and soybean are two of the country’s most important crops, with 17% of corn production and 12% of soybean production coming from irrigated areas. However, the water used for this irrigation is often unsustainably pumped groundwater.  According to a recent Dartmouth-led study published in Earth’s Future, using groundwater sustainably for agriculture in the U.S. could dramatically reduce the production of corn, soybean and winter wheat. 

Irrigation relies on extracting groundwater from aquifers, which also serve as a source of drinking water and are essential to lakes, rivers and ecosystems. Aquifers are naturally recharged as rainfall, snowmelt and other water infiltrates the soil and collects in a porous layer underground. If groundwater use exceeds the aquifer’s recharge rate, however, less groundwater remains available in the aquifer, including for growing crops.

“Our findings underscore how corn, soybean and winter wheat production could be affected if we chose to stop depleting aquifers across the United States,” says co-lead author Jonathan Winter, an associate professor of geography and principal investigator of the Applied Hydroclimatology Group at Dartmouth. “However, future precipitation, which affects groundwater resources, is difficult to predict, and improved irrigation technology, more water-efficient crops, and better agricultural water management could reduce the production losses from a transition to sustainable groundwater use.”

To analyze the impacts of sustainable groundwater use for irrigated corn, soybean and winter wheat, researchers used a crop model to simulate irrigated agriculture from 2008 to 2012. The crop model uses information about daily weather, soil properties, farm management and crop varieties, and was compared to survey data from the U.S. Department of Agriculture to confirm its accuracy.

Crop production was simulated under four different groundwater use scenarios, ranging from most optimistic to pessimistic. The most optimistic scenario assumes that the maximum amount of recharge can be used for irrigation. The less optimistic scenarios, which are based on safe aquifer yield, assume that only a fraction of the recharge goes into the aquifer and just that restricted amount of water can be used for irrigation.  The less optimistic scenarios account for uncertainty in groundwater availability as well as preserving some water to maintain healthy ecosystems.  The four sustainable groundwater use scenarios are based on safe aquifer yields of 100%, 75%, 50%, and 25%.
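In schematic terms, the scenario logic amounts to capping irrigation withdrawals at a fraction of estimated recharge. The sketch below uses made-up numbers purely to illustrate that logic; the study itself used a full crop simulation model:

```python
# Illustrative safe-yield cap: irrigation is limited to a fraction of the
# aquifer's estimated annual recharge. All numbers are hypothetical.
recharge_mm = 100.0   # assumed annual recharge reaching the aquifer
demand_mm = 160.0     # assumed irrigation demand of the crop

for safe_yield in (1.00, 0.75, 0.50, 0.25):
    allowed = safe_yield * recharge_mm
    unmet = max(0.0, demand_mm - allowed)
    print(f"safe yield {safe_yield:.0%}: irrigation capped at {allowed:.0f} mm, "
          f"unmet demand {unmet:.0f} mm")
```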

Under the most optimistic sustainable groundwater use scenario, U.S. irrigated production of corn, soybean and winter wheat is reduced by 20%, 6% and 25%, respectively. Under the most pessimistic scenario, corn, soybean and winter wheat production is reduced by 45%, 37% and 36%, respectively.

The findings show that Nebraska, Kansas, and Texas, which rely on groundwater from the High Plains Aquifer (also known as the Ogallala Aquifer) to grow corn, soybeans and winter wheat, would experience some of the greatest production losses under sustainable groundwater use. This region is particularly vulnerable given its lack of rainfall, which limits both rainfed agriculture and groundwater recharge. Prior research found that the High Plains region extracts groundwater at three times the aquifer’s recharge rate.

Central California, which relies on the Central Valley Aquifer, would also see large production losses in corn and winter wheat from sustainable groundwater use, although the absolute losses there are limited by the dominance of specialty crops, such as almonds, grapes and lettuce, which leave relatively little land planted in corn and winter wheat.

In contrast, the Mississippi Valley, a significant corn and soybean region, would experience relatively few production losses, as groundwater extraction is typically less than recharge over the region. The Midwest would also experience minimal corn and soybean production losses because the region is humid and relies mainly on rain-fed rather than irrigated agriculture.

“Sustainable groundwater use is critical to maintaining irrigated agricultural production, especially in a global food system that is already taxed by climate change, population growth and shifting dietary demands,” says co-lead author Jose R. Lopez, a former postdoctoral researcher in geography at Dartmouth. “We need to expand the implementation of water conservation strategies and technologies we have now and develop more tools that can stabilize the nation's groundwater supply while preserving crop yields and farmer livelihoods.”

###

Winter is available for comment at: Jonathan.M.Winter@dartmouth.edu. In addition to Winter and Lopez, Joshua Elliott at the University of Chicago, Alex Ruane at NASA Goddard Institute for Space Studies, Cheryl Porter and Gerrit Hoogenboom at the University of Florida, Martha Anderson at the USDA ARS, and Christopher Hain at NASA Marshall Space Flight Center, also served as co-authors of the study.

Human disturbance is the most crucial factor for lynx in habitat selection


Peer-Reviewed Publication

UNIVERSITY OF FREIBURG

Habitat selection in wildlife is a process that occurs at different scales, balancing advantages, such as a high abundance of food, against disadvantages, such as human disturbance. Large predators, with their large spatial requirements, are particularly sensitive to these disturbances. A team led by conservation biologists Prof. Dr. Marco Heurich and Joseph Premier from the Faculty of Environment and Natural Resources at the University of Freiburg has studied this habitat selection process in the Eurasian lynx. Their results, published in Biological Conservation, provide important information for the conservation of this species in human-dominated landscapes. “Through this study, we can generalize the habitat selection behavior of a large carnivore species on a continental scale for the first time,” explains Heurich.

Large dataset with animals in several European areas

The researchers led by Heurich and Premier used a data set of tracking data on 125 lynx from nine study areas across Europe. They compared the locations available to and actually used by the predators at two scales: the landscape scale, which shows how lynx place their home range in the landscape, and the home range scale, which shows how lynx select habitats within their home range. For this comparison, the research team used the machine-learning method random forest, extended with a random effect so that variability within and between study areas could be accounted for.
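As a rough sketch of this used-versus-available setup (using a plain scikit-learn random forest, without the study’s random-effect extension; all column names and values below are hypothetical):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical design matrix: rows are lynx GPS fixes (used = 1) and random
# points drawn from the same landscape (used = 0), with habitat covariates.
df = pd.DataFrame({
    "dist_road_m":   [1200, 300, 2500, 150, 1800, 90],
    "dist_settle_m": [3000, 400, 5000, 250, 2200, 100],
    "forest_cover":  [0.8, 0.2, 0.9, 0.1, 0.7, 0.05],
    "used":          [1, 0, 1, 0, 1, 0],
})

features = ["dist_road_m", "dist_settle_m", "forest_cover"]
rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(df[features], df["used"])

# Feature importances hint at which covariates drive habitat selection.
for name, imp in zip(features, rf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```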

What the animals avoid and how they orient themselves

On the landscape scale the analysis revealed that lynx avoid roads and human settlements. On the level of their home range, the animals were oriented towards hiding places and the availability of prey. The researchers found only minor differences between female and male lynx in their choice of habitat.

Heurich and Premier found the greatest differences in lynx habitat selection at the landscape scale, with clear differences between the various study areas, for example between the Swiss Alps and the plains of Estonia. Within their home ranges, lynx behaved very similarly throughout Europe, preferring heterogeneous forest areas and areas that provide protection from human disturbance.

Original publication:
Ripari, L., Premier, J., et al., Heurich, M. (2022): Human disturbance is the most limiting factor driving habitat selection of a large carnivore throughout Continental Europe. Biological Conservation 266: 109446. DOI: 10.1016/j.biocon.2021.109446

Contact:
Prof. Dr. Marco Heurich
Chair for Wildlife Ecology and Management
University of Freiburg
Tel.: +49 (0)162/1301448
E-Mail: marco.heurich@wildlife.uni-freiburg.de

Annette Kollefrath-Persch
Office of University and Science Communications
University of Freiburg
Tel.: +49 (0)761 / 203-8909
E-Mail: annette.persch@pr.uni-freiburg.de

Data from thousands of cameras confirms protected areas promote mammal diversity


Peer-Reviewed Publication

UNIVERSITY OF BRITISH COLUMBIA

IMAGE: A black bear pictured near Fort McMurray in Alberta.

CREDIT: Photo by Cole Burton / UBC Faculty of Forestry

A new University of British Columbia study offers fresh evidence that protected areas are effective at conserving wildlife.

Researchers at UBC’s faculty of forestry analyzed data from a global data set drawing from 8,671 camera trap stations spanning four continents. They found more mammal diversity in survey areas where habitat had a protected designation—compared to forests and other wilderness areas that lacked that designation.
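A minimal sketch of the kind of comparison involved, assuming detection records in a pandas DataFrame (the station IDs, species names and protected flag below are invented for illustration):

```python
import pandas as pd

# Hypothetical camera-trap detections: one row per species seen at a station.
detections = pd.DataFrame({
    "station":   ["A1", "A1", "A2", "B1", "B1", "B2"],
    "protected": [True, True, True, False, False, False],
    "species":   ["black bear", "moose", "black bear", "coyote", "deer", "deer"],
})

# Species richness per station, then averaged by protection status.
richness = (
    detections.groupby(["protected", "station"])["species"].nunique()
    .groupby("protected")
    .mean()
)
print(richness)  # mean per-station richness, protected vs. unprotected
```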

This was true even when these protected areas experienced human disturbances such as recreational use and logging.

“This is not shocking news in itself, but it is exciting evidence of the critical role that parks and nature reserves play in wildlife conservation,” says Dr. Cole Burton (he/him), the study’s senior author and a conservation biologist who researches mammal populations and human-wildlife coexistence.

“As international discussions continue on new global targets for expanding protected areas, it’s important to be able to measure the benefits of the protections that do currently exist.”

This is the largest number of wildlife cameras ever analyzed in a single study, notes first author Cheng Chen (he/him), a forestry PhD student who relied on two international wildlife camera databases for his analysis.

“By drawing on such a wide dataset, we were able to synthesize 91 camera trap surveys from 23 countries to come up with a global picture of the effect of protected areas on mammal diversity,” said Chen.

The study analyzed the presence of a wide range of mammal species, from caribou in Canada to leopard cats in China.

Protected areas are the final strongholds of many endangered mammals, notes Burton, adding that mammals are a particularly challenging group to protect because they require large areas for habitat, and so tend to come into conflict with people.

“If we want to keep larger mammals around, along with the critical roles they play in ecosystems, we need to continue focusing on the growth of the protected area network,” said Burton. “In fact, under the Convention on Biological Diversity, the world is currently discussing new targets for how much of the earth’s surface should be covered by parks. We need to have better information to inform these policy discussions. Hopefully this study helps fill the gaps in our knowledge.”

An international team of collaborators contributed to the research, published this week in Conservation Letters.

Interview languages: English (Burton, Chen), Mandarin (Chen)
