Monday, October 17, 2022

 

Fighting cancer on earth and in space using high-energy protons

High-energy proton experiments optimize production of medical imaging isotopes while providing insight into how to protect astronauts from space radiation

DOE/US DEPARTMENT OF ENERGY


The Science

Scientists on Earth use high-energy protons to create isotopes to detect and treat cancer. In space, however, these same high-energy protons can pose a risk to spacecraft and the health of the astronauts traveling in them. These risks mean spacecraft must have protective shielding. Unfortunately, the risks posed by these high-energy protons remain highly uncertain. To learn more about these risks, and about using protons to produce isotopes, scientists measured the cross sections (probabilities) for high-energy proton reactions used to produce important new radiopharmaceuticals.

The Impact

Measuring these cross sections allows scientists to optimize the quantity and purity of medical isotopes needed in the fight against cancer. These include the isotopes in the radiopharmaceuticals arsenic-72-HBED and gallium-68-DOTATOC, which are produced using high-energy protons from particle accelerators. Doctors use these pharmaceuticals to show where tumors have spread in the body. However, high-energy protons are also a dangerous type of space radiation. Better understanding their behavior and reactions allows scientists to improve the design of shielding that protects astronauts and electronics in spacecraft. This will allow us to explore other planets in our solar system, while also helping us understand how nuclei absorb and release energy.

Summary

Researchers from Brookhaven National Laboratory, Los Alamos National Laboratory, and Lawrence Berkeley National Laboratory performed a series of experiments using proton beams from the Brookhaven Linac Isotope Producer, the Los Alamos Isotope Production Facility, and the Berkeley Laboratory 88-Inch Cyclotron. These experiments measured production rates for 78 isotopes from the proton bombardment of niobium and arsenic targets with energies up to 200 MeV, including two radionuclides used for Positron Emission Tomography (PET) and one used to monitor the dose from proton beams in accelerators.
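The link between a measured cross section and the amount of isotope produced can be sketched with the standard thin-target activation relation. The Python snippet below is a minimal illustration only, not the teams' analysis code; every number in it is a placeholder, not a value from the study.

# Minimal illustration (not the experiments' analysis code): how a measured
# cross section translates into an isotope production rate for a thin target.
# All numbers below are placeholders, not values from the study.

import math

PROTON_CHARGE = 1.602e-19        # C
AVOGADRO = 6.022e23              # atoms/mol

def thin_target_activity(sigma_mb, beam_current_uA, areal_density_g_cm2,
                         molar_mass_g_mol, half_life_s, irradiation_s):
    """Activity (Bq) of one product isotope at end of bombardment."""
    sigma_cm2 = sigma_mb * 1e-27                          # 1 mb = 1e-27 cm^2
    protons_per_s = beam_current_uA * 1e-6 / PROTON_CHARGE
    target_atoms_per_cm2 = areal_density_g_cm2 / molar_mass_g_mol * AVOGADRO
    production_rate = protons_per_s * target_atoms_per_cm2 * sigma_cm2   # atoms/s
    decay_const = math.log(2) / half_life_s
    # Saturation factor: production competes with decay during irradiation
    return production_rate * (1.0 - math.exp(-decay_const * irradiation_s))

# Example: hypothetical 50 mb cross section on an arsenic target,
# 100 uA beam, 0.1 g/cm^2 target, product with a ~26-hour half-life,
# 1-hour irradiation.
print(thin_target_activity(50, 100, 0.1, 74.92, 26 * 3600, 3600))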

The team compared this large body of production data to predictions made using state-of-the-art nuclear reaction modeling codes to explore how quickly the energy from an incoming proton spreads throughout the entire nucleus when the two collide. The results showed a significantly slower rate of energy dissipation in the nucleus compared to what scientists previously believed. This decrease, in turn, leads to an increase in the emission of additional “secondary” high-energy protons and neutrons, and also shows strikingly different rates when producing isotopes with different proton-to-neutron ratios. Taken together, these results provide guidance for the optimized production of the two PET radionuclides and will help in the design of shielding for spacecraft and their occupants from “secondary” particles caused by protons in cosmic radiation. 

 

Funding

This research was supported by the Department of Energy Isotope Program, managed by the Office of Science for Isotope R&D and Production, and was carried out by Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, and Brookhaven National Laboratory.

Bringing custom microbes to the business of recycling plastic

Peer-Reviewed Publication

DOE/OAK RIDGE NATIONAL LABORATORY

Adam Guss 

IMAGE: ORNL’S ADAM GUSS AND COLLEAGUES USED SYNTHETIC BIOLOGY TO DEVELOP A CUSTOM MICROBE CAPABLE OF CONVERTING DECONSTRUCTED MIXED PLASTIC WASTE INTO VALUABLE NEW MATERIALS.

CREDIT: CARLOS JONES, ORNL/U.S. DEPT. OF ENERGY

Scientists working on a solution for plastic waste have developed a two-step chemical and biological process to break down and upcycle mixed plastics into valuable bioproducts.

The project, which involves multiple institutions, draws on synthetic biology expertise at the Department of Energy’s Oak Ridge National Laboratory to engineer a microbe that converts deconstructed plastic waste into building blocks for next-generation materials.

The new process, described in the journal Science, would replace a system that now requires painstaking, costly sorting of materials, which has resulted in only about 5% of plastics being recycled in the United States.

The project is led by the National Renewable Energy Laboratory and also brings together scientists from  the Massachusetts Institute of Technology, the University of Wisconsin-Madison and ORNL under the Bio-Optimized Technologies to keep Thermoplastics out of Landfills and the Environment, or BOTTLE, Consortium.

Different plastics contain different polymers, each with unique chemical building blocks. The BOTTLE researchers developed a process to convert mixed plastics to a single chemical product, working toward a solution that would allow recyclers to skip sorting.

The first step in the process relies on oxygen and catalysts to break down large polymer molecules into their smaller chemical building blocks. The process was applied to a mixture of three common plastics: polystyrene, or PS, used in disposable coffee cups; polyethylene terephthalate, or PET, used in single-use beverage bottles, polyester clothing and carpets; and high-density polyethylene, or HDPE, used in many common consumer plastics and often associated with milk jugs.

“This is a potential entry point into processing plastics that cannot be recycled at all today,” said Gregg Beckham, a senior research fellow at NREL and head of BOTTLE.

The oxidation process breaks down these plastics into a complex mixture of chemical compounds — including benzoic acid, terephthalic acid and dicarboxylic acids — that would require advanced and costly separations to yield pure products. That is where biology comes into play.

BOTTLE colleagues engineered a soil microbe, Pseudomonas putida, to biologically convert or “funnel” the mixture of small-molecule intermediates to single products: either polyhydroxyalkanoates, or PHAs, which are an emerging form of biodegradable bioplastics; or beta-ketoadipate, which can be used to make new performance-advantaged nylon materials.

The experiment built on a process developed by ORNL’s Adam Guss and colleagues at NREL to engineer the bacterium with desired traits from other organisms. The process, outlined in the journal Metabolic Engineering, converted deconstructed PET into building blocks for a superior nylon product that is more water- and heat-resistant — ideal for applications such as automotive parts.

“We took a combinatorial approach to pathway assembly, basically finding the best combination of genes from different organisms that allowed us to get robust utilization of PET in Pseudomonas putida,” Guss said. “ORNL specializes in modifying nonmodel microbes to add traits useful for biotechnology, tapping our deep expertise in synthetic biology as well as transcriptomics and proteomics to discover new metabolic pathways.”

“Biological funneling simply means we’ve engineered the metabolic network of a microbe to direct the carbon from a large number of substrates to a single product,” said NREL’s Allison Werner, a co-author on the Science paper. “To do this, we take DNA from nature — usually other microbes — and paste it into Pseudomonas putida’s genome. The DNA is transcribed into RNA, which in turn is translated into proteins that perform diverse biochemical transformations, forming a new metabolic network and ultimately enabling us to capture more carbon and to tune where it goes.”

Guss and colleagues have spent years perfecting P. putida to convert the plant biopolymer lignin, derived from bioenergy crops, into advanced bioproducts as part of DOE’s Center for Bioenergy Innovation and Agile BioFoundry. In 2020, Guss led a team that announced it had engineered the microbe to simultaneously digest five of the most abundant compounds of lignocellulosic biomass.

In the next steps for BOTTLE, “we’re continuing to expand the range of molecules that P. putida can eat as we work to break down more types of plastics and also more real-world plastics that have additional additives,” Guss said.

“Plastics are major environmental pollutants and are largely made using fossil carbon,” he said. “This research sits at that intersection of breaking down today’s plastic waste and converting it into building blocks for the next generation of plastics that are both recyclable-by-design and biodegradable.”

Funding was provided by DOE’s Advanced Manufacturing Office and Bioenergy Technologies Office. The work was performed as part of the BOTTLE Consortium.

UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

Editor’s note: This article was adapted from an NREL announcement.

Digital transformation in construction industry requires more support, study shows

Researchers identify the main obstacles preventing efficient digital transformation in the engineering and construction industry

Peer-Reviewed Publication

XI'AN JIAOTONG-LIVERPOOL UNIVERSITY

Digital transformation in the engineering and construction industry. 

IMAGE: IN A RECENT STUDY, RESEARCHERS HAVE IDENTIFIED THE MAIN OBSTACLES PREVENTING EFFICIENT DIGITAL TRANSFORMATION IN THE ENGINEERING AND CONSTRUCTION INDUSTRY.

CREDIT: XJTLU

In recent years, the engineering and construction industry has been exploring the use of digital technologies to boost productivity and improve safety, quality, and sustainability. However, digital transformation in this industry has been slow compared to other sectors due to certain obstacles.

In a paper recently published in the journal Engineering, Construction, and Architectural Management, researchers from Xi’an Jiaotong-Liverpool University and the University of Lincoln identify, classify, and rank the main obstacles preventing digital transformation in this sector.

The researchers identified barriers through a systematic literature review and expert validation, then surveyed 192 construction professionals in China on the impact of the obstacles. They found that the three main problems are a lack of laws and regulations, a lack of support and leadership, and a lack of resources and professionals.

Changing legislation

The lack of laws and regulations is the most significant among the barriers identified, according to Dr Fangyu Guo, assistant professor at XJTLU’s Department of Civil Engineering and corresponding author of the paper.

“The lack of laws and regulations has a negative effect on digital transformation because construction companies need governmental regulations as guidelines to determine their strategies and adapt their organisational structure.

“Well-established standards and regulations are critical for directing an effective digital transformation and motivating stakeholders to invest more in various digital technologies and tools,” Guo says.

Leadership and resources

A lack of support and leadership also negatively influences digital transformation. Most employees may be unfamiliar with the new technologies, and insufficient support from senior management will reduce their willingness to adapt to the change.

“It is important for senior management in construction firms to give ongoing support and motivation to employees during the implementation of digital transformation,” says Dr Guo.

The lack of resources and professionals is another obstacle. There is currently a large digital skills gap in the industry, so developing the next generation of digital skills and relevant talents is crucial to digital transformation.

The researchers suggest firms must develop strategic plans and training programmes to attract and develop talent who can use digital technologies and build long-term business intelligence.

The study offers insights for governments and construction companies to improve their understanding of these barriers and their impacts. The findings will help to establish policies that can use digital transformation to enhance construction project management.

The research team consists of Kaiyang Wang, Dr Fangyu Guo, and Dr Cheng Zhang from XJTLU, and Professor Dirk Schaefer from the University of Lincoln.

 

Back to the future of photosynthesis

Resurrecting billion-year-old enzymes reveals how photosynthesis adapted to the rise of oxygen

Peer-Reviewed Publication

MAX-PLANCK-GESELLSCHAFT

Rubisco complexes 

IMAGE: CRYO-ELECTRON MICROSCOPE IMAGE OF TWO RUBISCO COMPLEXES INTERACTING WITH EACH OTHER. IF A SUBUNIT ESSENTIAL FOR SOLUBILITY IS MISSING, INDIVIDUAL ENZYME COMPLEXES CAN INTERACT WITH EACH OTHER IN THIS WAY AND FORM THREAD-LIKE STRUCTURES, SO-CALLED FIBRILS. UNDER NORMAL CONDITIONS, HOWEVER, RUBISCO DOES NOT FORM SUCH FIBRILS.

CREDIT: MPI F. TERRESTRIAL MICROBIOLOGY/ L. SCHULZ

The central biocatalyst in photosynthesis, Rubisco, is the most abundant enzyme on earth. By reconstructing billion-year-old enzymes, a team of Max Planck researchers has deciphered one of the key adaptations of early photosynthesis. Their results not only provide insights into the evolution of modern photosynthesis but also offer new impulses for improving it.

Present-day life fully depends on photosynthetic organisms like plants and algae that capture and convert CO2. At the heart of these processes lies an enzyme called Rubisco, which captures more than 400 billion tons of CO2 annually. Organisms alive today make staggering amounts of it: the mass of Rubisco on our planet outweighs that of all humans. To assume such a dominant role in the global carbon cycle, Rubisco had to adapt constantly to changing environmental conditions.

Using a combination of computational and synthetic approaches, a team from the Max Planck Institute for Terrestrial Microbiology in Marburg, Germany, in collaboration with Nanyang Technological University in Singapore, has now successfully resurrected and studied billion-year-old enzymes in the lab. In this process, which they describe as “molecular paleontology”, the researchers found that instead of direct mutations in the active center, an entirely new component prepared photosynthesis to adapt to rising oxygen levels.

Rubisco's early confusion

Rubisco is ancient: it emerged approximately four billion years ago in primordial metabolism, before oxygen was present on Earth. However, with the invention of oxygen-producing photosynthesis and the rise of oxygen in the atmosphere, the enzyme started catalyzing an undesired reaction in which it mistakes O2 for CO2 and produces metabolites that are toxic to the cell. This confused substrate scope still scars Rubiscos to this day and limits photosynthetic efficiency. Even though Rubiscos that evolved in oxygen-containing environments became more specific for CO2 over time, none of them could completely get rid of the oxygen-capturing reaction.

The molecular determinants of increased CO2 specificity in Rubisco remain largely unknown. However, they are of great interest to researchers aiming to improve photosynthesis. Interestingly, those Rubiscos that show increased CO2 specificity recruited a novel protein component of unknown function. This component was suspected to be involved in increasing CO2 specificity; however, the true reason for its emergence remained difficult to determine because it evolved billions of years ago.
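As a purely illustrative aside (not taken from the study), CO2 specificity is commonly summarized by a specificity factor S, defined from the catalytic efficiencies of the two competing reactions, and the ratio of productive carboxylation to wasteful oxygenation scales as S multiplied by the CO2/O2 concentration ratio. The short Python sketch below uses placeholder values to show that relationship.

# Illustrative sketch only (not from the study): Rubisco's CO2/O2 specificity
# factor S = (kcat_C / Km_CO2) / (kcat_O / Km_O2) determines how often
# carboxylation wins over the wasteful oxygenation reaction.
# All numbers are placeholders.

def carboxylation_to_oxygenation_ratio(specificity, co2_uM, o2_uM):
    """Ratio of carboxylation rate to oxygenation rate."""
    return specificity * co2_uM / o2_uM

# Order-of-magnitude values for air-equilibrated water: ~10 uM CO2, ~250 uM O2.
for s in (10, 50, 100):   # hypothetical specificity factors, low to high
    print(s, carboxylation_to_oxygenation_ratio(s, 10, 250))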

Studying evolution by resurrecting ancient proteins in the lab

To understand this key event in the evolution of more specific Rubiscos, collaborators at the Max Planck Institute for Terrestrial Microbiology in Marburg and Nanyang Technological University in Singapore used a statistical algorithm to recreate forms of Rubiscos that existed billions of years ago, before oxygen levels began to rise. The team led by Max Planck researchers Tobias Erb and Georg Hochberg resurrected these ancient proteins in the lab to study their properties. In particular, the scientists wondered whether Rubisco’s new component had anything to do with the evolution of higher specificity.

The answer was surprising, as doctoral researcher Luca Schulz explains: “We expected the new component to somehow directly exclude oxygen from Rubisco’s catalytic center. That is not what happened. Instead, this new subunit seems to act as a modulator for evolution: recruitment of the subunit changed the effect that subsequent mutations had on Rubisco’s catalytic subunit. Previously inconsequential mutations suddenly had a huge effect on specificity when this new component was present. It seems that having this new subunit completely changed Rubisco’s evolutionary potential.”

An enzyme's addiction to its new subunit

This function as an “evolutionary modulator” also explains another mysterious aspect of the new protein component: Rubiscos that incorporated it are completely dependent on it, even though other forms of Rubisco can function perfectly well without it. The same modulating effect explains why: when bound to this small protein component, Rubisco becomes tolerant to mutations that would otherwise be catastrophically detrimental. With the accumulation of such mutations, Rubisco effectively became addicted to its new subunit.

Altogether, the findings finally explain the reason why Rubisco kept this new protein component around ever since it encountered it. Max Planck Research Group Leader Georg Hochberg explains: “The fact that this connection was not understood until now highlights the importance of evolutionary analysis for understanding the biochemistry that drives life around us. The history of biomolecules like Rubisco can teach us so much about why they are the way they are today. And there are still so many biochemical phenomena whose evolutionary history we really have no idea about. So it’s a very exciting time to be an evolutionary biochemist: almost the entire molecular history of the cell is still waiting to be discovered.”

Scientific journeys back in time can provide invaluable insights for the future

The study also has important implications for how photosynthesis might be improved, says Max Planck Director Tobias Erb: “Our research taught us that traditional attempts to improve Rubisco might have been looking in the wrong place: for years, research focused solely on changing amino acids in Rubisco itself to improve it. Our work now suggests that adding entirely new protein components to the enzyme could be more productive and may open otherwise impossible evolutionary paths. This is uncharted land for enzyme engineering.”

Building the tools to make environmental data more accessible and forecasts more accurate

Virginia Tech ecological forecaster Quinn Thomas will lead the university’s efforts in a newly funded NSF project that will accelerate ongoing environmental research and benefit future researchers and scientific communities

Grant and Award Announcement

VIRGINIA TECH

Photo by Krista Timney for Virginia Tech. 

IMAGE: THE NATIONAL SCIENCE FOUNDATION-FUNDED DECODER PLATFORM WILL ACCELERATE WORK ON CURRENT VIRGINIA TECH FORECASTING PROJECTS RELATED TO CARBON STORAGE, WATER QUALITY, AND FALL COLORS, SUCH AS THIS LAKESHORE LANDSCAPE AT HUNGRY MOTHER STATE PARK IN MARION, VIRGINIA. PHOTO BY KRISTA TIMNEY FOR VIRGINIA TECH.

CREDIT: VIRGINIA TECH

Before you start your next Google search, stop for just a minute. You may not know it, but whether you’re looking for the latest Hokie football score or cheap airline tickets, you’re about to unleash a powerful data discovery, retrieval, and organizing process made possible by the agreed-upon rules for defining information that drive search engines.

Now pause again and imagine if every website used a different set of rules and search engines weren’t available. Given the mind-boggling amount of information on the internet that is at our fingertips, how would you ever find what you need to make decisions and plan your life? Take that query up a couple notches for scientists navigating a plethora of environmental data scattered across the web, and you’ll understand the impact of a new Virginia Tech research project.

Quinn Thomas, who holds dual appointments in the College of Natural Resources and Environment and the College of Science, is the principal investigator for Virginia Tech’s part of a $3.2 million research project funded by the National Science Foundation’s (NSF) Office of Advanced Cyberinfrastructure.

The Democratized Cyberinfrastructure for Open Discovery to Enable Research (DeCODER) project aims to standardize and facilitate how environmental data and model predictions are described and shared so that, ultimately, more individual researchers and scientific communities can utilize these resources.

Data is the key driver for the project and for the ecological forecasting research of Thomas, an associate professor in the Department of Forest Resource and Environmental Conservation, an affiliated faculty member of the Global Change Center, and the Data Science Faculty Fellow in the College of Science. He is a researcher with a bold objective: predicting the natural environment just like we predict the weather through the use of shared data tools and a computational infrastructure.

As the project lead for Virginia Tech, Thomas will put the university’s $535,000 share of the NSF grant to work to aid researchers interested in predicting environmental change. “My portion of the project is to advance the discoverability of ecological forecasts through the development of protocols and software to archive and document model predictions of ecological dynamics,” Thomas said. “Much like we use internet search engines (like Google) to find information, our work will help a researcher ask questions and initiate searches like ‘Find forecasts of algae in lakes across the U.S.’ in order to find current forecasts to help guide decision making and support environmental management.”  

This grant advances work already done on the EarthCube GeoCODES platform. EarthCube is an NSF-funded environment that improves access, sharing, and visualization of data. GeoCODES is a program specifically for researchers working in the field of geoscience that offers evolving methods for organizing data so it can be easily accessed, as well as a framework for new computational tools, a registry, and best practices for the user community.
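Discovery platforms of this kind generally depend on structured, machine-readable descriptions of each dataset. Purely as an illustration of that idea, and assuming schema.org-style "Dataset" descriptions of the sort such tooling typically harvests (the field values below are invented, not an actual DeCODER or GeoCODES record), an archived forecast might be described like this, shown here as a Python dictionary:

# Hypothetical example of a structured, schema.org-style description of an
# ecological forecast archive entry. Field names follow schema.org's Dataset
# vocabulary; the values are invented for illustration.
forecast_record = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "Daily chlorophyll-a forecast, Example Lake",
    "description": "7-day-ahead forecasts of algal (chlorophyll-a) concentration",
    "variableMeasured": "chlorophyll-a",
    "temporalCoverage": "2022-06-01/2022-09-30",
    "spatialCoverage": {"@type": "Place", "name": "Example Lake, USA"},
    "distribution": {"@type": "DataDownload",
                     "contentUrl": "https://example.org/forecasts/lake.csv"},
}

# A crawler that harvests such records can answer queries like
# "find forecasts of algae in lakes across the U.S." by matching these fields.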

The new DeCODER platform will build upon and leverage the work that has already been done as part of the EarthCube effort. Thomas will take the next steps to help researchers working specifically in ecological forecasting to more easily access data and create better models.

Again, considering the example of researchers needing to forecast algae growth across the U.S., the DeCODER platform will allow researchers not only to gather data and forecasts, but also to “then compare these predictions to actual measurements of algae to quantify the strengths and weaknesses of the forecasts that have been generated to date.”
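To make that comparison concrete, here is a minimal sketch with placeholder data (not DeCODER code): once archived forecasts and later observations can both be discovered, forecast skill can be quantified, for example by root-mean-square error measured against a naive baseline.

# Minimal sketch (placeholder data, not DeCODER code): quantify forecast skill
# by comparing archived forecasts with later observations and a naive baseline.
import math

forecast     = [4.2, 5.1, 6.0, 5.5]   # hypothetical predicted chlorophyll-a (ug/L)
observations = [4.0, 5.6, 5.2, 5.9]   # hypothetical later measurements
climatology  = [5.0, 5.0, 5.0, 5.0]   # naive long-term-mean baseline

def rmse(pred, obs):
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

skill = 1 - rmse(forecast, observations) / rmse(climatology, observations)
print(f"RMSE = {rmse(forecast, observations):.2f}, skill vs climatology = {skill:.2f}")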

In addition, Thomas said, “Rather than requiring all ecological forecasters to use a single archive location on the internet, the technology we are developing allows for many archiving locations to be used, thus democratizing the storage and discovery of the results of the forecasting expertise.”

This new platform is especially valuable to researchers, like Thomas, whose work involves utilizing data and modeling because it will allow them to more easily discover what has already been done in the field in order to improve models over time.

“Think about weather forecasts,” said Thomas. “They have been getting better over time. A 10-day forecast is as good as an eight- or nine-day forecast was a decade ago. We know this improvement has occurred by comparing past forecasts to data. Now we want to do this evaluation of other environmental forecasts, and we can’t do that if we can’t find all the historical data.”

He also said individual scientists are producing incredible amounts of data about the environment, but, unfortunately, it’s not all in one centralized place. This new technology will allow data to be discovered wherever it is and enable researchers to determine if they are getting better at forecasting environmental change.

Thomas will be working closely with Associate Professor Carl Boettiger of the University of California at Berkeley (UCB) on the application of the DeCODER platform to ecological forecasting. A primary focus will be developing the software and protocols that will allow people to discover needed data. “The DeCODER project will democratize research pipelines such as the production and assessment of ecological forecasts, helping to bridge scientific communities and better inform decision makers,” Boettiger said.

To meet this ambitious goal, the project involves a collaborative research effort between several teams with specific areas of expertise. In addition to the Virginia Tech and UCB focus on ecological forecasting data, the University of Illinois Urbana-Champaign (the lead institution) and the University of California San Diego (UCSD) are developing the cyberinfrastructure used by all the teams to tie the work together. Syracuse University and Texas A&M University are working on low-temperature geochemistry data, and the Scripps Institution of Oceanography, along with UCSD, is focusing on deep ocean science data.

According to Thomas, this newly-funded NSF grant project will both complement and advance the ongoing research agenda in ecological forecasting at Virginia Tech. The DeCODER platform will ultimately accelerate work on current forecasting projects related to water quality, forest carbon storage, fall colors, and environmental dynamics in the context of a changing environment.

“By focusing on a democratized approach to data and forecast discovery, the advances are designed to outlive the duration of the project. This places Virginia Tech’s ecological forecasting research at the vanguard of the field,” said Thomas.



Ecosystem-based fisheries management restores western Baltic fish stocks

First western Baltic Sea ecosystem model shows that ecosystem-based management increases catches of cod and herring as well as food web resilience to ocean warming

Peer-Reviewed Publication

HELMHOLTZ CENTRE FOR OCEAN RESEARCH KIEL (GEOMAR)

Decades of overfishing, together with nutrient pollution, rapid increase in hypoxia, ocean warming and acidification have put fish and harbour porpoise (Phocoena phocoena) in the western Baltic Sea at risk of collapse. But the commercially relevant stocks of cod (Gadus morhua), herring (Clupea harengus) and sprat (Sprattus sprattus) can be restored and prospects for marine mammals be improved, according to a team of marine scientists from GEOMAR Helmholtz Centre for Ocean Research Kiel (Germany), the German Federal Agency for Nature Conservation (Bundesamt für Naturschutz, BfN, Germany) and the Institute of Biosciences and Bioresources at the National Research Council (CNR) of Italy.

Using model simulations, the researchers tested five scenarios, ranging from no fishing to ecosystem-based fisheries management. This approach accounts for the roles of species within their ecosystem and adjusts catches accordingly to keep fish stocks in a healthy, productive and resilient condition. A study now published in the journal Frontiers in Marine Science concludes: ecosystem-based fisheries management would allow the endangered harbour porpoise population to recover and would increase catches of herring and cod significantly within a decade. The food web would become less susceptible to eutrophication and climate change and, in addition, more able to support carbon sequestration than in a business-as-usual scenario that assumes today’s fisheries practices continue.

The study benefits from years of data collection at GEOMAR. Building on a first prototype and a huge amount of data, the researchers have now developed the first model for the western Baltic Sea that includes top predators such as harbour porpoises and seals, various fish species and other marine animals, plankton, algae and seaweeds, as well as their interactions under different scenarios. “Looking at the big picture of the food web helps to identify management options that sustain important food resources and dependent businesses”, emphasises Dr. Marco Scotti, marine ecologist at GEOMAR and CNR, lead author of the recent publication.

Ecosystem-based fisheries management would mean ending catches of juvenile cod, reducing catches of herring and sprat to half of the maximum sustainable yield – the highest possible harvest per year that can be sustained over time – and limiting catches of adult cod and flatfish to 80 per cent of the maximum sustainable yield. This approach was compared to a business-as-usual scenario characterised by average fishing mortalities for all exploited stocks during the years 2015 to 2019.
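The logic of fishing below the maximum sustainable yield can be illustrated with a deliberately simple, single-stock surplus-production (Schaefer) sketch. This is not the study's food-web model, and every parameter below is an invented placeholder; it only shows why a depleted stock rebuilds when the harvest is held below MSY but can collapse when the full MSY is taken.

# Deliberately simple single-stock sketch (not the study's food-web model):
# a Schaefer surplus-production model showing how a depleted stock rebuilds
# when harvested below maximum sustainable yield (MSY). Placeholder values.

def project(biomass, r, K, harvest_fraction_of_msy, years):
    msy = r * K / 4                      # Schaefer MSY
    trajectory = [biomass]
    for _ in range(years):
        surplus = r * biomass * (1 - biomass / K)
        catch = min(harvest_fraction_of_msy * msy, biomass + surplus)
        biomass = max(biomass + surplus - catch, 0.0)
        trajectory.append(biomass)
    return trajectory

# Depleted stock at 20% of carrying capacity; compare fishing at MSY vs half MSY.
for frac in (1.0, 0.5):
    print(frac, [round(b) for b in project(0.2 * 1000, r=0.5, K=1000,
                                           harvest_fraction_of_msy=frac, years=10)])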

In the business-as-usual scenario, cod stocks decline slightly below the numbers of 2019 and herring stocks to almost half their 2019 sizes by 2050. Sprat and flatfish increase to some degree, suggesting a substantial regime shift. Ecosystem-based fisheries management, in contrast, would lead to an increase of almost 70 per cent in cod and 50 per cent in herring catches by 2050, compared to the period 2015 to 2019. Catches in flatfish would increase by almost 20 per cent, but with largely reduced fishing effort and costs. Potential for carbon sequestration would be more than three times greater under ecosystem-based fisheries management compared to business-as-usual.

“The Common Fisheries Policy of Europe demands an end to overfishing by 2020 and the rebuilding of healthy and resilient ecosystems thereafter”, says Dr. Rainer Froese, fisheries biologist at GEOMAR and co-author of the study. “Past and present overfishing – not climate change – was the main cause of the recent collapse of herring, cod, and profitable western Baltic fisheries in general. Continued business-as-usual would in addition push the highly endangered harbour porpoise to the brink of extinction. In contrast, ecosystem-based management would rebuild healthy stocks and fisheries and even help us to fight climate change. Saving the western Baltic requires stopping the fishing of cod and herring for a few years, until these stocks have recovered. During this time, fishers need to be compensated for their losses. Fishing for plaice and other flatfish can meanwhile continue.”

Original publications:

Scotti M., Opitz S., MacNeil L., Kreutle A., Pusch C. and Froese R. (2022): Ecosystem-based fisheries management increases catch and carbon sequestration through recovery of exploited stocks: The western Baltic Sea case study. Frontiers in Marine Science 9, 879998. https://doi.org/10.3389/fmars.2022.879998

Opitz, S. and Froese R. (2019): Ecosystem Based Fisheries Management for the Western Baltic Sea. Extended Report. https://www.fishbase.de/rfroese/WBS_ComFish_Report_Final.pdf

Project funding:

The work has been funded by the German Federal Ministry of Education and Research through the Adaptation of the Western Baltic Coastal Fishery to Climate Change (balt_ADAPT) project (grant number 03F0863), the EcoScope Project of the Horizon 2020 Research and Innovation Programme of the European Commission (grant agreement number 101000302), as well as by the Federal Ministry of the Environment, Nature Conservation and Nuclear Safety (FKZ 3521532201).

 

ICYMI

Study reveals new insights into how fast-moving glaciers may contribute to sea level rise

Peer-Reviewed Publication

UNIVERSITY OF OXFORD

Greenland glacier 

IMAGE: ASSOCIATE PROFESSOR LAURA STEVENS (RIGHT) AND CO-AUTHOR PROFESSOR MEREDITH NETTLES (LEFT, COLUMBIA UNIVERSITY) APPROACH A GREENLAND SUPRAGLACIAL LAKE VIA HELICOPTER. PHOTO BY MARIANNE OKAL (UNAVCO, INC.).

CREDIT: PHOTO BY MARIANNE OKAL (UNAVCO, INC.).

Climate change is resulting in sea level rise as ice on land melts and oceans expand. How much and how fast sea levels will rise in the near future will depend, in part, on the frequency of glacier calving events. These occur when large chunks of ice detach from glaciers that terminate in the ocean (known as tidewater glaciers), and fall into coastal fjords as icebergs. The faster these glaciers flow over the ground towards the ocean, the more ice enters the ocean, increasing the rate of sea level rise.

During the warmer summer months, the surface of Greenland’s glaciers can melt and form large lakes that may then drain through to the base of the glacier. Studies on the inland Greenland ice sheet have shown that this reduces friction between the ice and ground, causing the ice to slide faster for a few days. Up to now, however, it has been unclear whether such drainage events affect the flow speed of tidewater glaciers, and hence the rate of calving events.

To investigate this, a research team from Oxford University’s Earth Sciences department, the Oxford University Mathematical Institute, and Columbia University used Global Positioning System (GPS) observations of the flow speed of Helheim Glacier—the largest single-glacier contributor to sea level rise in Greenland. The GPS captured a near perfect natural experiment: high-temporal-resolution observations of the glacier’s flow response to lake drainage.

The results showed that Helheim Glacier behaved very differently from the inland ice sheet, which exhibits fast, downhill movement during lake drainage events. In contrast, Helheim Glacier showed a relatively small ‘pulse’ of movement: the glacier sped up for a short time and then moved more slowly, resulting in no net increase in movement.
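The "no net increase" result can be illustrated with a toy calculation (invented numbers, not the study's GPS data): a brief speed-up that is followed by a compensating slow-down leaves the total displacement over the week essentially unchanged.

# Toy numerical illustration (invented numbers, not the study's GPS data):
# a short speed-up followed by a slow-down can leave the glacier's total
# weekly displacement essentially unchanged.

background_speed = 20.0                  # metres/day, hypothetical steady flow
pulse  = [20, 20, 26, 24, 17, 16, 17]    # speed-up, then compensating slow-down
steady = [background_speed] * 7

print("displacement with pulse  :", sum(pulse), "m")
print("displacement steady flow :", sum(steady), "m")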

Using a numerical model of the subglacial drainage system, the researchers discovered that this observation was likely caused by Helheim Glacier having an efficient system of channels and cavities along its bed. This allows the draining waters to be quickly evacuated from the glacier bed without causing an increase in the total net movement.

Although this appears to be positive news in terms of sea level rise implications, the researchers suspected that a different effect might occur for glaciers without an efficient drainage system, where surface melt is currently low but will increase in future due to climate change (such as in Antarctica).

They ran a mathematical model based on the conditions of colder, Antarctic tidewater glaciers. The results indicated that lake drainages under these conditions would produce a net increase in glacier movement. This was largely due to the less efficient winter-time subglacial drainage system not being able to evacuate flood waters quickly. As of yet, however, there are no in situ observations of Antarctic tidewater glacier responses to lake drainage.

The study calls into question some common approaches for inferring glacial drainage systems based on glacier velocities recorded using satellite observations (which are currently used in sea level rise models).

Lead author Associate Professor Laura Stevens (Department of Earth Sciences, Oxford University) said: ‘What we've observed here at Helheim is that you can have a big input of meltwater into the drainage system during a lake drainage event, but that melt input doesn’t result in an appreciable change in glacier speed when you average over the week of the drainage event.’

With the highest temporal resolution of satellite-derived glacier speeds currently available being roughly one week, lake drainage events like the one captured in the Helheim GPS data usually go unnoticed.

‘These tidewater glaciers are tricky,’ Associate Professor Stevens added. ‘We have a lot more to learn about how meltwater drainage operates and modulates tidewater-glacier speeds before we can confidently model their future response to atmospheric and oceanic warming.’

Notes to editors:

The study ‘Tidewater-glacier response to supraglacial lake drainage’ is published in Nature Communications: https://www.nature.com/articles/s41467-022-33763-2.

For media inquiries contact Professor Laura Stevens, Earth Sciences Department, University of Oxford: laura.stevens@earth.ox.ac.uk.

About the University of Oxford

Oxford University has been placed number 1 in the Times Higher Education World University Rankings for the seventh year running, and number 2 in the QS World Rankings 2022. At the heart of this success are the twin pillars of our ground-breaking research and innovation and our distinctive educational offer.

Oxford is world-famous for research and teaching excellence and home to some of the most talented people from across the globe. Our work helps the lives of millions, solving real-world problems through a huge network of partnerships and collaborations. The breadth and interdisciplinary nature of our research alongside our personalised approach to teaching sparks imaginative and inventive insights and solutions.

Through its research commercialisation arm, Oxford University Innovation, Oxford is the highest university patent filer in the UK and is ranked first in the UK for university spinouts, having created more than 200 new companies since 1988. Over a third of these companies have been created in the past three years. The university is a catalyst for prosperity in Oxfordshire and the United Kingdom, contributing £15.7 billion to the UK economy in 2018/19, and supports more than 28,000 full time jobs.