Saturday, May 18, 2024

 

Can we revolutionise the chemical industry and create a circular economy? Yes, with the help of catalysts



GRIFFITH UNIVERSITY
IMAGE: Catalysis in the circular economy – sustainable resource management with emphasis on resource recovery and waste reduction. CREDIT: Elsevier




The chemical industry is a cornerstone of global development, driving innovation and providing essential products that support our modern way of life.

However, its reliance on unsustainable fossil resources has posed significant threats to global ecosystems through climate change and chemical pollution.  

A new commentary published in Cell Press’ One Earth, co-authored by Griffith University researchers, puts forth a transformative solution: using catalysis to leverage sustainable waste resources and usher the industry from a linear to a circular economy.

“If we look at recent statistics, the chemical industry contributes a staggering US$5.7 trillion to the global economy and sustains 150 million jobs worldwide, excluding refined fossil fuels,” said Professor Karen Wilson, one of the lead authors and Director of Griffith’s Centre for Catalysis and Clean Energy.

"But it remains the largest industrial energy consumer and the third-largest emitter of direct CO2 emissions globally.”  

In 2022, the industry emitted 935 million metric tons of CO2 during primary chemicals production. Moreover, its operations have led to significant water contamination and the release of toxic chemicals into the environment, perpetuating a cycle of ecological harm. 

Co-lead author Professor Adam Lee, also based at Griffith, said: “Catalytic processes could minimise reliance on finite fossil fuels and curb CO2 emissions significantly by harnessing agricultural, municipal, and plastic waste as feedstocks. 

“This feedstock transition not only mitigates environmental damage but also addresses vulnerabilities in the industry's supply chain, which are susceptible to geopolitical and natural disruptions.” 

Professor Wilson added: “Catalysis has historically played a key role in transforming fossil resources into essential fuels and products, and now offers a beacon of hope for revolutionising the chemical industry and promoting a circular economy.” 

However, the authors acknowledge that this vision demands concerted innovation in catalyst formulation and process integration.  

“Prioritising Earth-abundant elements over precious metals will unlock sustainable catalytic systems for the efficient conversion of organic waste into benign and recyclable products,” Professor Wilson said. 

“Already, pioneering initiatives such as the co-location of different industries in Kalundborg, Denmark, to foster symbiosis have demonstrated new collaborative models to improve resource efficiency and waste reduction.”

"Catalysis offers a pathway towards sustainability, enabling us to transform waste into valuable resources and pave the way for a circular economy," Professor Lee added. 

In the One Earth commentary, the team explored the role of catalysis in sustainable and circular chemical processes through the following lenses:

  • Catalysis to enable waste biomass utilisation 

  • Catalysis for circular polymers 

  • Catalysis to remediate chemical pollution 

The commentary ‘Catalysis at the Intersection of Sustainable Chemistry and a Circular Economy’ has been published in One Earth.

 

NCSA upgrades Granite to expand availability to ACCESS, Illinois researchers


Granite’s capacity will increase 4.5-fold and its performance will nearly double thanks to a $500,000 NSF grant



Business Announcement

NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS




As the capabilities of technology continue to expand, so does the need for data storage. Researchers across the country and on the University of Illinois Urbana-Champaign campus continue to generate large datasets that need to be stored and shared long-term. Innovations in artificial intelligence and processing algorithms also mean that training on and reprocessing of previously collected datasets are becoming more and more common.

The National Center for Supercomputing Applications understands these critical needs for researchers and, with a $500,000 grant from the U.S. National Science Foundation, will upgrade its primary active archive system, Granite.

Granite is a cutting-edge tape-based data subsystem that allows researchers to store and retrieve data in both traditional and newer, more flexible methods. It’s closely integrated with Taiga – NCSA’s global file system – to provide users with a place to store longer-term archive datasets.

The funding allows NCSA to upgrade Granite’s tape drives and media from the TS1140 tape drive to the latest available LTO-9. The new hardware will increase the library’s storage density 4.5-fold and almost double the system’s performance.

“Researchers continue to generate large volumes of data that need to be stored and shared for long periods of time,” said J.D. Maloney, Senior Research Storage Engineer at NCSA and principal investigator of the award. “These upgrades allow researchers to store data for future reprocessing using new algorithms and allow data to be made available to other researchers for reproducibility purposes and to gain further insight.”

“There is an urgent need to explore pragmatic data archiving strategies,” said University of Illinois Urbana-Champaign’s Research Data Service Director Heidi Imker. “In a world of infinite resources, all data would be just one click away. But that’s not feasible – or even necessary – for many datasets, especially large ones. This grant will allow us to explore alternative strategies and pilot what implementation would look like in practice.”

Imker and NCSA’s Associate Director of Integrated Cyberinfrastructure Tim Boerner are both co-principal investigators on the NSF funding for Granite, which will strengthen the ability of Illinois researchers and the national research community – through the NSF cyberinfrastructure program ACCESS – to store, share and utilize their data in the ways they need. Eighty percent of allocations on Granite will go to Illinois researchers and the remaining 20 percent will be allocated through ACCESS. Granite’s upgrades are expected to be completed by mid-summer 2024.


ABOUT NCSA

The National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign provides supercomputing, expertise and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use innovative resources to address research challenges for the benefit of science and society. NCSA has been assisting many of the world’s industry giants for over 35 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale.

ABOUT ACCESS

The Advanced Cyberinfrastructure Coordination Ecosystem: Services & Support (ACCESS) program is a U.S. National Science Foundation-funded virtual organization that facilitates cyberinfrastructure (CI) support for research. Through ACCESS, researchers can request time allocated from an extensive network of CI resources such as advanced computing, data resources and analysis, visualization, and storage. The ACCESS CI ecosystem is essential to computational and data-intensive research with a goal of providing seamless collaboration between resources and researchers.

 

STAR sees a magnetic imprint on deconfined nuclear matter


Data from heavy ion collisions give new insight into the electromagnetic properties of quark-gluon plasma “deconfined” from protons and neutrons




DOE/US DEPARTMENT OF ENERGY

IMAGE: Collisions of heavy ions generate an immensely strong magnetic field that in turn induces electromagnetic effects in the quark-gluon plasma, a “soup” of quarks and gluons liberated from the colliding protons and neutrons. CREDIT: Tiffany Bowman and Jen Abramowitz/Brookhaven National Laboratory




The Science

Scientists have the first direct evidence that the powerful magnetic fields created in off-center collisions of atomic nuclei induce an electric current in “deconfined” nuclear matter. This is a plasma “soup” of quarks and gluons that have been set free, or “deconfined,” from nuclear matter—protons and neutrons—in the particle collisions. The magnetic fields in deconfined nuclear matter are a billion times stronger than a typical refrigerator magnet, but their effects can be hard to detect. This new study’s evidence is from measuring the way particles with an electric charge are deflected when they emerge from the collisions. The study provides proof that the powerful magnetic fields exist. It also offers a new way to measure the electrical conductivity in the quark-gluon plasma (QGP).

The Impact

Scientists can infer the value of the QGP’s electrical conductivity from how much the electromagnetic field deflects charged particles such as electrons, quarks, and protons. The stronger a particular type of deflection is, the stronger the conductivity. Conductivity is an important property of matter, but scientists have not been able to measure it in QGP before. Understanding the electromagnetic properties of the QGP may help physicists unravel the mysteries of the phase transition between QGP and ordinary nuclear matter made of protons and neutrons. The work will also aid in explorations of other magnetic effects in the QGP.

Summary

Off-center collisions of atomic nuclei at the Relativistic Heavy Ion Collider (RHIC), a Department of Energy particle accelerator user facility at Brookhaven National Laboratory, should generate powerful magnetic fields. That’s because some of the non-colliding positively charged protons are set swirling as the nuclei sideswipe one another at close to the speed of light. The fields are expected to be stronger than those of neutron stars and much more powerful than Earth’s. But measuring magnetic fields in the QGP is challenging because this deconfined nuclear matter doesn’t last very long. So, instead, scientists measure the QGP’s properties indirectly, for example by using RHIC’s STAR detector to track the impact of the magnetic field on charged particles streaming from the collisions.

The STAR physicists saw a pattern of charged-particle deflection that could only be caused by an electromagnetic field and current induced in the QGP. This was clear evidence that the magnetic fields exist. The degree of deflection is directly related to the strength of the induced current. Scientists will now use this method to measure the conductivity of the QGP. That, in turn, may help them unravel mysteries of the phase transition between deconfined quarks and gluons and composite particles such as protons and neutrons.

 

Funding

This research was funded by the Department of Energy Office of Science, the National Science Foundation, and a range of international organizations and agencies listed in the scientific paper. The STAR team used computing resources at the Scientific Data and Computing Center at Brookhaven National Laboratory, the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, and the Open Science Grid consortium.

 

Leading the way for Southeastern dairies


UTIA’s Center for Dairy Advancement and Sustainability to be hub for information



UNIVERSITY OF TENNESSEE INSTITUTE OF AGRICULTURE

IMAGE: UTIA's new Center for Dairy Advancement and Sustainability (CDAS) is led by Liz Eckelkamp, associate professor of animal science and Extension dairy specialist. CREDIT: Photo by H. Harbin, courtesy UTIA.





The Southeast’s dairy industry has faced many challenges over the past several decades, and the number of dairy farms across the region has fallen to historic lows. Fewer than 2,000 are operating, with many teetering on the edge of profitability. In an effort to help local producers survive economically, as well as to keep their beloved dairy products available to consumers, the University of Tennessee Institute of Agriculture has formed the Center for Dairy Advancement and Sustainability (CDAS).

While the CDAS has been “open” for several months, now is the perfect time to highlight the center’s operations and goals as the nation begins celebrations for June Dairy Month.

The new center is led by Liz Eckelkamp, associate professor of animal science and Extension dairy specialist. Since 2020, Eckelkamp has directed the Southeast Dairy Business Innovation Initiative (SDBII), a USDA-funded effort to revitalize the regional dairy industry through value-added products. The new CDAS will help with that and more. It is expected to be a hub of research, Extension, and teaching with the goal of providing Real. Life. Solutions.™ to the Southeast dairy industry. Building on SDBII as well as additional funding, the CDAS is set to position UTIA as a leader in innovative dairy solutions for both large and small ruminants, for example dairy cows and goats.

Eckelkamp, who also coordinates and provides curricula for the Tennessee Master Dairy Program (utdairy.tennessee.edu/master-dairy), says dairy farmers in the Southeast face high costs of production. “On average they lose $6 for every hundred pounds of milk they sell commercially. In the face of these elevated costs and thin-to-negative profit margins, producers and processors must find alternative ways to remain in business and support their families, employees and communities,” she stresses. “The CDAS will focus on facing these challenges head on through alternative revenue sources such as farmstead creameries, alternative inputs such as non-traditional forages, technologies to reduce labor costs and access issues, marketing and outreach tools, and leadership training for existing dairy businesses.”

Center members consist of UTIA and other UT Knoxville faculty with the expertise to support the advancement and sustainability of the dairy industry through a dairy systems approach that includes topics under the areas of animal husbandry, precision dairy technologies, dairy foods, labor management, leadership, marketing, branding, and small business economics. Eckelkamp says, “We envision establishing UTIA as a leader in dairy precision technology and alternative income strategies addressing the needs of farmers, processors and producer/processors through Extension, research and teaching with University and industry synergistic partnerships.”

To raise awareness and grow the industry, CDAS will also provide opportunities to students through internships, conference travel, and research experiences. Center members will also have the opportunity to apply for seed grant funding through CDAS to answer pressing questions for the dairy industry.

Neal Schrick, professor and head of the UT Department of Animal Science, remarks, “Dr. Eckelkamp and her team have developed the Center for Dairy Advancement and Sustainability to provide focus and additional resources toward providing solutions for Southeast dairy producers to help them overcome the many obstacles in their path toward sustainability of the industry. With Dr. Eckelkamp at the helm of the center and with the team she has organized across the Southeast, I see great opportunities across many different facets of the dairy industry in the knowledge and initiatives that CDAS will provide.”

For more information, contact Eckelkamp at eeckelka@tennessee.edu or visit the UT Dairy website (utdairy.tennessee.edu) or the SDBII website (sdbii.tennessee.edu).

The University of Tennessee Institute of Agriculture comprises the Herbert College of Agriculture, UT College of Veterinary Medicine, UT AgResearch and UT Extension. Through its land-grant mission of teaching, research and outreach, the Institute touches lives and provides Real. Life. Solutions. to Tennesseans and beyond. utia.tennessee.edu.

 

 

NIH study shows chronic wasting disease unlikely to move from animals to people



Study of cerebral organoids reinforces evidence for substantial species barrier




NIH/NATIONAL INSTITUTE OF ALLERGY AND INFECTIOUS DISEASES

IMAGE: A researcher holds a flask containing human cerebral organoids similar to those used in the CWD study. CREDIT: NIAID




WHAT:
A new study of prion diseases, using a human cerebral organoid model, suggests there is a substantial species barrier preventing transmission of chronic wasting disease (CWD) from cervids—deer, elk and moose—to people. The findings, from National Institutes of Health scientists and published in Emerging Infectious Diseases, are consistent with decades of similar research in animal models at the NIH’s National Institute of Allergy and Infectious Diseases (NIAID).

Prion diseases are degenerative diseases found in some mammals. These diseases primarily involve deterioration of the brain but also can affect the eyes and other organs. Disease and death occur when prion proteins misfold, clump together, recruit other prion proteins to do the same, and eventually destroy the central nervous system. Currently, there are no preventive or therapeutic treatments for prion diseases.

CWD is a type of prion disease found in cervids, which are popular game animals. While CWD has never been found in people, a question about its transmission potential has lingered for decades: Can people who eat meat from CWD-infected cervids develop prion disease? The question is important because during the mid-1980s and mid-1990s a different prion disease – bovine spongiform encephalopathy (BSE), or mad cow disease – emerged in cattle in the United Kingdom (U.K.) and cases also were detected in cattle in other countries, including the United States. Over the next decade, 178 people in the U.K. who were thought to have eaten BSE-infected beef developed a new form of a human prion disease, variant Creutzfeldt-Jakob Disease, and died. Researchers later determined that the disease had spread among cattle through feed tainted with infectious prion protein. The disease transmission path from feed to cattle to people terrified U.K. residents and put the world on alert for other prion diseases transmitted from animals to people, including CWD. CWD is the most transmissible of the prion disease family, showing highly efficient transmission between cervids.

Historically, scientists have used mice, hamsters, squirrel monkeys and cynomolgus macaques to mimic prion diseases in people, sometimes monitoring animals for signs of CWD for more than a decade. In 2019, NIAID scientists at Rocky Mountain Laboratories in Hamilton, Montana, developed a human cerebral organoid model of Creutzfeldt-Jakob Disease to evaluate potential treatments and to study specific human prion diseases. 

Human cerebral organoids are small spheres of human brain cells ranging in size from a poppy seed to a pea. Scientists grow organoids in dishes from human skin cells. The organization, structure, and electrical signaling of cerebral organoids are similar to brain tissue. They are currently the closest available laboratory model to the human brain. Because organoids can survive in a controlled environment for months, scientists use them to study nervous system diseases over time. Cerebral organoids have been used as models to study other diseases, such as Zika virus infection, Alzheimer’s disease, and Down syndrome.

In the new CWD study, the bulk of which was done in 2022 and 2023, the research team validated the study model by successfully infecting human cerebral organoids with human CJD prions (positive control). Then, using the same laboratory conditions, they directly exposed healthy human cerebral organoids for seven days to high concentrations of CWD prions from white-tailed deer, mule deer and elk, and to normal brain matter (negative control). The researchers then observed the organoids for up to six months, and none became infected with CWD.

This indicates that even following direct exposure of human central nervous system tissues to CWD prions there is substantial resistance, or a barrier, to the propagation of infection, according to the researchers. The authors acknowledge the limitations of their research, including the possibility that a small number of people may have a genetic susceptibility that was not accounted for, and that the emergence of new strains with a lesser barrier to infection remains possible. Still, they are optimistic that these data indicate humans are extremely unlikely to contract a prion disease from inadvertently eating CWD-infected cervid meat.

ARTICLES:
B Groveman and K Williams et al. Lack of Transmission of Chronic Wasting Disease Prions to Human Cerebral Organoids. Emerging Infectious Diseases DOI: 10.3201/eid3006.231568 (2024).

B Groveman and NC Ferreira et al. Human Cerebral Organoids as a Therapeutic Drug Screening Model for Creutzfeldt-Jakob Disease. Scientific Reports DOI: 10.1038/s41598-021-84689-6 (2021).

B Race et al. Lack of Transmission of Chronic Wasting Disease to Cynomolgus Macaques. Journal of Virology DOI: 10.1128/JVI.00550-18 (2018).

B Race et al. Susceptibilities of Nonhuman Primates to Chronic Wasting Disease. Emerging Infectious Diseases DOI: 10.3201/eid1509.090253 (2009).


NIAID conducts and supports research—at NIH, throughout the United States, and worldwide—to study the causes of infectious and immune-mediated diseases, and to develop better means of preventing, diagnosing and treating these illnesses. News releases, fact sheets and other NIAID-related materials are available on the NIAID website.  

About the National Institutes of Health (NIH): NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov/.

NIH...Turning Discovery Into Health®


IMAGE: The top frames show the validated study model, where CJD brain matter (in red) infected the organoid. The bottom frames show the uninfected organoid after exposure to infectious CWD matter. CREDIT: NIAID

 

Scientists develop new geochemical ‘fingerprint’ to trace contaminants in fertilizer



Heavy metals pollution traced back to mineral-based fertilizers



Peer-Reviewed Publication

DUKE UNIVERSITY

IMAGE: A person's hands cup a sample of pelletized agricultural fertilizer and a piece of the phosphate rock from which it is created. CREDIT: Robert Hill, Duke University





DURHAM, N.C. – An international team of scientists has uncovered toxic metals in mineral phosphate fertilizers worldwide by using a new tool to identify the spread and impact of such contaminants on soil, water resources, and the food supply.

“While mineral phosphate fertilizers are critical to boost global sustainable agriculture and food security, we found high levels of toxic metals in many fertilizers worldwide,” said Avner Vengosh, chair of the Earth and Climate Sciences division at Duke University’s Nicholas School of the Environment. “Our study developed a new method to identify sources and impacts of these metals on the environment.” Those metals included cadmium, uranium, arsenic, vanadium, and chromium.

Use of mineral fertilizer – synthetic or naturally occurring substances with essential nutrients needed for plant growth – has helped boost sustainable crop yields worldwide. But until recently, its contamination with toxic metals had not been systematically evaluated. This new study analyzes global phosphate fertilizers from major phosphate-mining countries.

“We measured strontium isotopes in both phosphate rocks and fertilizers generated from those rocks to show how fertilizers’ isotope ‘fingerprint’ matches their original source,” said Robert Hill, the study’s lead author and a PhD student at Duke University.

Isotopes are variations of an element, in this case strontium. Chemical analysis of each fertilizer shows a unique isotope mix that matches phosphate rocks from where it was sourced.

“Given variations of strontium isotopes in global phosphate rocks, we have established a unique tool to detect fertilizers’ potential impact worldwide,” Hill said.
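To illustrate how such a fingerprint can be used, the minimal sketch below compares a fertilizer sample’s 87Sr/86Sr ratio against a small set of candidate source rocks and reports the closest match; the deposit names, ratios, and tolerance are hypothetical values chosen for illustration, not data or methods from the study.

```python
# Minimal sketch of isotope "fingerprint" matching, with hypothetical values.
# A fertilizer's 87Sr/86Sr ratio is compared against candidate phosphate-rock
# sources; the closest ratio within a tolerance is reported as the likely origin.

# Hypothetical 87Sr/86Sr ratios for candidate phosphate-rock deposits.
SOURCE_ROCKS = {
    "Deposit A": 0.7078,
    "Deposit B": 0.7086,
    "Deposit C": 0.7092,
}

def match_source(fertilizer_ratio: float, tolerance: float = 0.0005):
    """Return the candidate deposit whose ratio is closest to the sample,
    or None if no candidate falls within the tolerance."""
    best_name, best_diff = None, float("inf")
    for name, ratio in SOURCE_ROCKS.items():
        diff = abs(fertilizer_ratio - ratio)
        if diff < best_diff:
            best_name, best_diff = name, diff
    return best_name if best_diff <= tolerance else None

if __name__ == "__main__":
    sample_ratio = 0.7087  # hypothetical measured value for a fertilizer sample
    print(match_source(sample_ratio))  # -> "Deposit B"
```

In practice the matching relies on many samples and on combining the isotope ratio with trace-element concentrations, but the underlying idea is the same: the ratio travels with the material from rock to fertilizer to soil and water.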

To learn whether strontium isotopes are a reliable indicator of trace elements in fertilizer worldwide, researchers analyzed 76 phosphate rocks, the main source of phosphate fertilizers, and 40 fertilizers from major phosphate rock-producing regions including the western United States, China, India, North Africa and the Middle East. Researchers collected samples from mines, commercial sources, and Tidewater Research Station, an experimental field in North Carolina. The research team published its findings on May 9, 2024 in Environmental Science & Technology Letters.

Metals found in soil and groundwater come from both naturally occurring and human-made sources.

“Strontium isotopes essentially are a ‘fingerprint’ that can reveal contamination in groundwater and soil worldwide,” said Vengosh. His research team has also used strontium isotopes to trace environmental contamination in landfill leaching, coal mining, coal ash, fracking fluids, and groundwater that is pulled to the surface during oil and natural gas extraction.

“The isotope is a proxy to identify the source of contamination,” Vengosh said. “Without this tool, it is difficult to identify, contain, and remediate contamination linked to fertilizer.”

Fertilizers in the study showed different concentrations of trace elements, with higher levels observed in fertilizers from the U.S. and the Middle East compared to those from China and India. As a result, the researchers conclude that phosphate fertilizers from the U.S. and the Middle East will have a greater impact on soil quality due to their higher concentrations of uranium, cadmium and chromium, compared with fertilizers from China and India, which have higher concentrations of arsenic.

The National Science Foundation funded this study. (EAR-2305946)

CITATION: “Tracing the Environmental Effects of Mineral Fertilizer Application with Trace Elements and Strontium Isotope Variations,” Robert C. Hill, Gordon D. Z. Williams, Zhen Wang, Jun Hu, Tayel El-Hasan, Owen W. Duckworth, Ewald Schnug, Roland Bol, Anjali Singh, and Avner Vengosh. Environmental Science & Technology Letters, May 9, 2024. DOI: 10.1021/acs.estlett.4c00170

Online: https://doi.org/10.1021/acs.estlett.4c00170

 

Researchers confirm scale matters in determining vulnerability of freshwater fish to climate changes



Context matters when it comes to evaluating climate change sensitivity, Virginia Tech researchers found




VIRGINIA TECH

IMAGE: (From left) Samuel Silknetter and Meryl Mims published a paper in Ecosphere about the vulnerability of freshwater fish species to climate change. CREDIT: Photo by Felicia Spencer for Virginia Tech.




The silver chub isn’t considered sensitive to climate change on a national scale, but context matters. For example, if climate change sensitivity is evaluated in only one region of the United States, the freshwater fish appears quite a bit more susceptible. 

“Relative to other species we looked at in the gulf region of the U.S., the silver chub occupied a pretty small geographic area,” said Samuel Silknetter, a Ph.D. student in biological sciences. “If we didn’t look at the climate sensitivity across multiple spatial scales, a regional analysis alone may miss the bigger context of why a species appears sensitive to climate change at some scales but not others, especially compared to other species.”

Silknetter and Associate Professor Meryl Mims recently led a team that explored the influence the spatial extent of research – the geographical coverage of data collected – has on evaluating the sensitivity of different fish species to climate change. The findings were published in Ecosphere.

“The spatial extent can be really relevant for specific cases, especially when you’ve got a species that is widespread but might be identified as more vulnerable in one region than another due to differences in distribution,” said Silknetter, who is also an affiliate of the Global Change Center’s interfaces of global change graduate program.

Using open-source data from the Global Biodiversity Information Facility and the U.S. Geological Survey, the team created a rarity and climate sensitivity (RCS) index for 137 freshwater fish species in the United States and then compared national scores with regional scores for each species. They found the relative sensitivity for species changed depending on spatial scale, and some species appeared more or less sensitive to climate change than the national index score indicated.
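As a rough, hypothetical sketch of how a rarity-and-climate-sensitivity style index can be assembled (not the authors’ exact formulation), the code below scores each species on two normalized components derived from occurrence records: rarity, taken here as the inverse of the number of occupied grid cells, and climate sensitivity, taken as the inverse of the breadth of temperatures at which the species occurs. Re-running the same calculation on a regional subset of records is what shifts the relative rankings.

```python
# Illustrative sketch of a rarity-and-climate-sensitivity style index.
# Assumptions: each species has occurrence records of (grid_cell_id, temperature);
# rarity = inverse of occupied-cell count, climate sensitivity = inverse of the
# occupied temperature range. Components are min-max normalized across species and
# averaged. This is a conceptual stand-in, not the published RCS formulation.
from statistics import mean

def rcs_scores(occurrences: dict[str, list[tuple[str, float]]]) -> dict[str, float]:
    """occurrences maps species -> list of (grid_cell_id, temperature) records."""
    range_size = {sp: len({cell for cell, _ in recs}) for sp, recs in occurrences.items()}
    climate_breadth = {
        sp: max(t for _, t in recs) - min(t for _, t in recs)
        for sp, recs in occurrences.items()
    }

    def normalize(values: dict[str, float]) -> dict[str, float]:
        lo, hi = min(values.values()), max(values.values())
        span = (hi - lo) or 1.0
        return {sp: (v - lo) / span for sp, v in values.items()}

    norm_range = normalize(range_size)
    norm_breadth = normalize(climate_breadth)
    # Higher score = smaller range and narrower climate niche = more sensitive.
    return {sp: mean([1 - norm_range[sp], 1 - norm_breadth[sp]]) for sp in occurrences}

if __name__ == "__main__":
    demo = {  # hypothetical occurrence data for two species
        "silver chub": [("c1", 18.0), ("c2", 19.5), ("c3", 21.0)],
        "elegant madtom": [("c1", 20.0), ("c1", 20.5)],
    }
    print(rcs_scores(demo))  # the narrow-ranged species scores as more sensitive
```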

“Some species, like the elegant madtom, had high relative sensitivity across spatial extents yet had no state or federal conservation listings,” Silknetter said. “Our assessment identified some species with high relative sensitivity to climate change but no current protected status. These species can be targeted in future studies to determine whether they are truly at-risk species that have been previously overlooked.”

The research team sees relative climate change sensitivity rankings as another tool for conservation managers hoping to mitigate the effects of climate change.

“Some of the data we used dates back more than 100 years, providing information on historic as well as current distribution of freshwater fish species. But sometimes the data are few and far between, with only a few dozen documented occurrences for a species over that time period,” said Mims, an affiliate with the Global Change Center and the Fralin Life Sciences Institute. “The RCS index, which allows relative sensitivity rankings to be calculated from a range of data types, can enable direct comparisons of species that have wide-ranging data availability.”

To make the findings more actionable, the research team has made the data accessible to anyone through the U.S. Geological Survey’s ScienceBase repository.

“Ensuring that our methods follow best practices for open science is really important if we’re going to be transparent in what we’re doing,” Silknetter said. “We need to be proactive in trying to identify vulnerable species early because at some point there are fewer options for a species if the damage has been done.”

Going forward, the team hopes that its approach for assessing the vulnerability of multiple species can spur on additional conservation efforts.

“The increasing availability of public occurrence and trait data will improve our ability to identify species sensitive to climate change,” Silknetter said. “I think approaches like ours will play an important role in shaping how future assessments consider spatial extent.”

 

How heatwaves are affecting Arctic phytoplankton



Experiments conducted at the AWIPEV Station in Svalbard on this increasingly common phenomenon



ALFRED WEGENER INSTITUTE, HELMHOLTZ CENTRE FOR POLAR AND MARINE RESEARCH

IMAGE: Klara Wolf (left) samples Arctic phytoplankton in Kongsfjorden, Ny-Ålesund, Svalbard. CREDIT: Alfred-Wegener-Institut / Paolo Verzone





The basis of the marine food web in the Arctic, the phytoplankton, responds to heatwaves much differently than to constantly elevated temperatures. This has been found by the first targeted experiments on the topic, which were recently conducted at the Alfred Wegener Institute’s AWIPEV Station. The phytoplankton’s behaviour primarily depends on the cooling phases after or between heatwaves, as shown in a study just released in the journal Science Advances.

Heatwaves, which we’ve increasingly seen around the globe in recent years, are also becoming more and more common in the Arctic. During a heatwave, not only the air but also the ocean grows warmer – the temperature is substantially higher than the seasonal mean value for at least five consecutive days. But how these short-term temperature fluctuations affect polar organisms remains largely unclear. To shed light on this aspect, a team led by Dr Klara Wolf (Universities of Hamburg and Konstanz) and Dr Björn Rost from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) has now used experiments to investigate how single-cell algae, the phytoplankton, respond to these extreme events. Given the phytoplankton’s role as the basis of the marine food web, changes in it could resonate throughout the entire Arctic ecosystem.
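For readers who want to see the criterion above in concrete terms, here is a minimal sketch that flags heatwave events in a daily water-temperature series as runs of at least five consecutive days more than a chosen offset above the seasonal mean; the offset and example temperatures are hypothetical, not the study’s definitions or data.

```python
# Minimal sketch of heatwave detection in a daily temperature series.
# Criterion used here: at least five consecutive days with temperatures more than
# a chosen offset above the seasonal mean. Offset and data are hypothetical.

def find_heatwaves(daily_temps: list[float], seasonal_mean: float,
                   offset: float = 1.0, min_days: int = 5) -> list[tuple[int, int]]:
    """Return (start_index, end_index) pairs for runs of >= min_days above threshold."""
    threshold = seasonal_mean + offset
    events, run_start = [], None
    for i, t in enumerate(daily_temps + [float("-inf")]):  # sentinel ends the last run
        if t > threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_days:
                events.append((run_start, i - 1))
            run_start = None
    return events

if __name__ == "__main__":
    temps = [2.1, 2.0, 4.5, 4.8, 5.0, 4.6, 4.9, 2.2, 2.0]  # hypothetical fjord temps (°C)
    print(find_heatwaves(temps, seasonal_mean=2.0))  # -> [(2, 6)], one 5-day event
```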

In incubation experiments at the AWIPEV Station in Svalbard, the researchers allowed natural phytoplankton communities from nearby Kongsfjorden to grow for 20 days under various conditions – normal and increased but constant temperatures (2° C, 6° C, 9° C). For comparison, they subjected the phytoplankton to repeated heatwaves of varying intensity (6° C, 9° C), each lasting five days with a three-day cooling phase at the seasonal mean temperature (2° C) in between. Different types of samples were collected at defined intervals in order to characterise the physiological responses and any potential species shifts.

“Under stable temperatures, even an extreme increase of +7° C led to accelerated growth and higher productivity, with surprisingly small changes in the composition of species, even over weeks,” says Klara Wolf regarding the experiments’ outcomes. “In contrast, the effects of heatwaves are considerably more complex and don’t follow the same pattern. This implies that our knowledge about constant temperature increases cannot readily be applied to these short-term warm phases, which normally only last a few days.” One reason for the difference is apparently that not just the exposure to increased temperatures has a major impact on productivity, but also and especially the cooling phases after or between heatwaves – and very little is known about these effects.

“We’re only just starting to gain a mechanistic understanding of how heatwaves can impact the polar regions,” says AWI biologist Björn Rost. “Our study represents an important first step and shows which aspects of heatwaves and which phytoplankton-related processes we need to take a closer look at. In addition, our study shows that what we know about the processes and effects of constantly higher temperatures can’t simply be applied one-to-one.” In fact, scenarios involving fluctuating temperatures can produce a broad range of effects, which is why predicting their implications is more complicated than for continuous warming.

Accordingly, in order to develop better projections and models of how primary production and the Arctic ecosystem will change in response to climate change, it won’t suffice to investigate the effects of mean temperatures; the effects of temperature fluctuations need to receive more attention. While stable warming up to a certain temperature increases productivity, some heatwaves can decrease it, while others increase it. A better grasp of the effects of variable temperatures, especially the cooling phases, is therefore essential to improving forecasts of potential biodiversity changes. Investigations of phytoplankton are especially crucial here, since changes at the basis of the food web can impact all higher trophic levels, all the way up to fisheries.


IMAGE: Klara Wolf at the phytoplankton experiment in Ny-Ålesund, Svalbard. CREDIT: Alfred-Wegener-Institut / Rene Bürgi