Thursday, December 05, 2024

 

Researchers assess the sustainability of the Pacific walrus population over the next 75 years




Wiley




The Pacific walrus, a critically important resource for Alaska and Chukotka Native communities, is subject to rapid habitat loss associated with climate change and increasing human activity in the Arctic. New research published in the Journal of Wildlife Management assessed the sustainability of varying degrees of Pacific walrus harvest to the end of the 21st century under different climate and human disturbance scenarios.

These scenarios ranged from optimistic to pessimistic, based largely on sea ice projections from general circulation models.

The study’s results indicated that current rates of Pacific walrus harvest are within a sustainable range and will continue to be under all scenarios considered—provided that the population abundance is assessed at regular intervals and that changes in harvest levels match changes in population dynamics. An annually consistent harvest level that doesn’t consider these population dynamics would not be sustainable.
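The contrast between a fixed annual quota and a harvest rate that tracks assessed abundance can be illustrated with a toy projection. This is a minimal sketch only: the logistic model, the declining carrying capacity standing in for sea-ice loss, and every number below are hypothetical and are not taken from the study's modeling framework.

```python
# Toy sketch (hypothetical numbers): contrast a constant annual quota with a
# harvest rate indexed to assessed abundance, under declining habitat.

def project(pop, years, carrying_capacity, growth=0.08,
            quota=None, rate=None, k_decline=0.01):
    """Project abundance with logistic growth; carrying capacity
    declines each year to mimic habitat (sea ice) loss."""
    history = [pop]
    k = carrying_capacity
    for _ in range(years):
        k *= (1 - k_decline)                 # habitat shrinks
        pop += growth * pop * (1 - pop / k)  # logistic growth
        harvest = quota if quota is not None else rate * pop
        pop = max(pop - harvest, 0.0)        # remove harvest, floor at zero
        history.append(pop)
    return history

# Same starting population, 75-year horizon, two harvest policies:
fixed = project(200_000, 75, 250_000, quota=5_000)    # constant quota
indexed = project(200_000, 75, 250_000, rate=0.025)   # tracks abundance
print(f"after 75 yr: fixed quota {fixed[-1]:.0f}, indexed rate {indexed[-1]:.0f}")
```

In this sketch the fixed quota eventually exceeds what the shrinking habitat can replace, while the abundance-indexed rate settles near a sustainable fraction of the (declining) carrying capacity, echoing the study's point that harvest levels must follow regular population assessments.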

“This modeling framework can be used by managers and stakeholders to explore future scenarios and promote the continued sustainability of the Pacific walrus population,” said corresponding author Devin L. Johnson, PhD, of the U.S. Fish and Wildlife Service.

URL upon publication: https://onlinelibrary.wiley.com/doi/10.1002/jwmg.22686

 

Additional Information
NOTE:
 The information contained in this release is protected by copyright. Please include journal attribution in all coverage. For more information or to obtain a PDF of any study, please contact: Sara Henning-Stout, newsroom@wiley.com.

About the Journal
The Journal of Wildlife Management publishes original research contributing to fundamental wildlife science. Topics encompass biology and ecology of wildlife and their habitats with implications for conservation or management.

About Wiley     
Wiley is one of the world’s largest publishers and a trusted leader in research and learning. Our industry-leading content, services, platforms, and knowledge networks are tailored to meet the evolving needs of our customers and partners, including researchers, students, instructors, professionals, institutions, and corporations. We empower knowledge-seekers to transform today’s biggest obstacles into tomorrow’s brightest opportunities. For more than two centuries, Wiley has been delivering on its timeless mission to unlock human potential. Visit us at Wiley.com. Follow us on Facebook, X, LinkedIn and Instagram.

 

Climate-ready crop


RIPE team shows increase in food mass through photorespiratory bypass in elevated temperatures



Carl R. Woese Institute for Genomic Biology, University of Illinois at Urbana-Champaign

Katherine Meacham-Hensold 

Katherine Meacham-Hensold led research that showed engineering improvements in the photorespiratory bypass of potato led to increased tuber mass in heatwave conditions.

Credit: RIPE Project/Katherine Meacham-Hensold




A team from the University of Illinois has engineered potato to be more resilient to global warming, showing a 30% increase in tuber mass under heatwave conditions. This adaptation may provide greater food security for families dependent on potatoes, who often live in regions where the changing climate has already affected multiple crop seasons.

“We need to produce crops that can withstand more frequent and intense heatwave events if we are going to meet the population’s need for food in regions most at risk from reduced yields due to global warming,” said Katherine Meacham-Hensold, scientific project manager for the Realizing Increased Photosynthetic Efficiency (RIPE) project at Illinois. "The 30% increase in tuber mass observed in our field trials shows the promise of improving photosynthesis to enable climate-ready crops."

Meacham-Hensold led this work for RIPE, an international research project that aims to increase global access to food by developing food crops that turn the sun’s energy into food more efficiently. RIPE was supported from 2017-2023 by the Bill & Melinda Gates Foundation, the Foundation for Food & Agriculture Research, and the U.K. Foreign, Commonwealth & Development Office, and is currently supported by Bill & Melinda Gates Agricultural Innovations (Gates Ag One).

The Challenge
Photorespiration is a process linked to photosynthesis that has been shown to reduce the yield of soybean, rice, and vegetable crops by up to 40%. Photorespiration occurs when Rubisco reacts with an oxygen molecule rather than CO2, which happens around 25% of the time under ideal conditions and more frequently at high temperatures. Plants must then expend a large amount of energy to metabolize glycolate, the toxic byproduct of photorespiration, energy that could otherwise have been used for growth.

“Photorespiration is a large energy cost for the plant,” said Meacham-Hensold. “It takes away from food production as energy is diverted to metabolizing the toxin. Our goal was to reduce the amount of wasted energy by bypassing the plant’s original photorespiratory pathway.”

Previous RIPE team members had shown that by adding two new genes, encoding glycolate dehydrogenase and malate synthase, to model plants, they could improve photosynthetic efficiency. The introduced enzymes metabolize the toxin (glycolate) in the chloroplast, the leaf compartment responsible for photosynthesis, rather than requiring it to be shuttled through other regions of the cell.

The solution
These energy savings drove growth gains in the model crop, which the current team hoped would translate to increased mass in their food crop. Not only did they see a difference; the benefits, recently published in Global Change Biology, tripled under heatwave conditions, which are becoming more frequent and more intense as global warming progresses.

Three weeks into the 2022 field season, while the potatoes were still in their early vegetative growth phase, a heatwave kept temperatures above 95°F (35°C) for four straight days, breaking 100°F (38°C) twice. After a couple of days of reprieve the temperatures shot up near 100° again.

Rather than withering in the heat, the modified potatoes grew 30% more tuber mass than the control group potatoes, taking full advantage of the increased thermotolerance of their photosynthesis.

“Another important feature of this study was the demonstration that our genetic engineering of photosynthesis that produced these yield increases had no impact on the nutritional quality of the potato,” said Don Ort, Robert Emerson Professor of Plant Biology and Crop Sciences and Deputy Director of the RIPE project. “Food security is not just about the amount of calories that can be produced but we must also consider the quality of the food.”

Multi-location field trials are needed to confirm the team’s findings in varying environments, but encouraging results in potatoes could mean similar results could be achieved in other root tuber crops like cassava, a staple food in Sub-Saharan African countries expected to be heavily impacted by rising global temperatures.

Researchers weren't sure how the potatoes would respond to the high temperatures, but harvest showed the modified potatoes grew 30% more mass than the control potatoes.

The potatoes were sliced and freeze dried before being ground for nutrient analysis.

Credit

RIPE Project/Claire Benjamin

 

ERC grant to explore how excessive confidence in science and technology is blocking the deep sustainability turn



Estonian Research Council
Laur Kanger, Associate Professor of Technology Research at the University of Tartu 

Laur Kanger, Associate Professor of Technology Research at the University of Tartu

Credit: Andres Tennus




The European Research Council has awarded a Consolidator Grant to Laur Kanger, Associate Professor of Technology Research at the University of Tartu, to investigate why future technologies can aggravate environmental problems rather than solve them.

During the Industrial Revolution, numerous perceptions, beliefs, norms and practices emerged in society, significantly accelerating the advancement of science and technology. Collectively, these are referred to as industrial modernity. According to Kanger, the concern is that the above perceptions and beliefs are not adapted to the current situation characterised by the unprecedented technological capability of humanity and scope of environmental problems. For example, industrial societies have tended to patch up issues arising from the use of existing technologies with new and more sophisticated technologies. However, doing the same for recent technologies like solar geoengineering or artificial intelligence would be a very risky approach to tackling the climate crisis because of the speed, scale and uncertainty of resulting impacts.

Three steps to identify the deep sustainability transition

Kanger’s research group will explore the problematic nature of industrial modernity in three ways. First, the evolution of beliefs, norms and behaviours concerning the environment, science and technology over the past 125 years will be measured in the world’s 20 largest industrialised countries. As a novel method, digital newspaper archives will be analysed using large language models. In addition to media texts, various databases containing, among other things, legislation, scientific publications and patent information will be used for the analysis. “Our goal is to find out whether there have been major shifts in the perceptions about the environment, science and technology in industrial societies that could mark a rupture in industrial modernity,” Kanger explained.

Through case studies, the research team will then try to identify exactly how the conventional perceptions about science and technology are blocking sustainable solutions. “Over the past 40 years, for example, influential climate models have led to over-optimism about the future availability of nuclear power and carbon capture technology. This has allowed pressing climate policy decisions to be postponed further and further into the future,” Kanger said.

Thirdly, the researchers are trying to identify the countries least hindered by the legacy of industrial modernity and, therefore, having the greatest potential to enact a deep sustainability turn. Building on the team’s prior work, the project will develop an index for measuring this potential, combining existing databases with a novel methodological approach of analysing newspaper texts using large language models. This enables researchers to gain a faster and more comprehensive understanding of public opinion in different countries compared to other text mining techniques. Kanger is hopeful that the use of large language models, if successfully applied, will significantly speed up data analysis without sacrificing too much in accuracy.
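An index of this kind is typically built by normalising several country-level indicators onto a common scale and combining them with weights. The sketch below illustrates only that general construction; the indicator names, weights and scores are invented for illustration and are not the project's actual methodology.

```python
# Hypothetical sketch of a composite country index built from normalised
# indicators; names, weights and scores are invented for illustration.

def minmax(values):
    """Rescale a list of raw scores onto [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

countries = ["A", "B", "C"]
indicators = {                                   # raw scores per country
    "pro_environment_discourse": [0.2, 0.9, 0.5],  # e.g. LLM-coded news texts
    "policy_flexibility":        [30, 80, 55],
    "niche_innovation":          [5, 2, 9],
}
weights = {"pro_environment_discourse": 0.5,
           "policy_flexibility": 0.3,
           "niche_innovation": 0.2}

# Normalise each indicator, then take the weighted sum per country.
normalised = {k: minmax(v) for k, v in indicators.items()}
index = [sum(weights[k] * normalised[k][i] for k in indicators)
         for i in range(len(countries))]
ranking = sorted(zip(countries, index), key=lambda t: -t[1])
print(ranking)
```

The min-max step matters because indicators arrive in incompatible units; without it, whichever indicator has the largest raw numbers would dominate the composite score regardless of the chosen weights.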

19th-century methods cannot solve today’s problems

Based on the results obtained, a new theory describing the emergence, maturation, and crisis of industrial modernity will be devised. “The problem of today’s society is that we are trying to address 21st-century environmental problems with 19th and 20th-century understandings of science and technology. The new framework will help us recognise these perceptions, learn from history, and possibly take them into account when designing sustainability interventions in the future,” Kanger said.

In addition to Laur Kanger, researchers from the fields of digital humanities, computational social science, transition studies and history are involved in the Rise and Demise of Industrial Modernity project. The project budget is 1.99 million euros, and the research period is five years.

The European Research Council (ERC) supports pioneering research in any field of research across Europe. The Consolidator Grant is awarded to top-level independent researchers to support their career development. ERC received 2,313 proposals in this call for proposals and awarded grants to 328 researchers from 25 countries.

 

How non-toxic and efficient solar cells can be produced



Linköping University
Rui Zhang 

Rui Zhang, researcher at the Department of Physics, Chemistry and Biology at LiU.

Credit: Thor Balkhed




Large-scale production of organic solar cells with high efficiency and minimal environmental impact may now be possible thanks to a new design principle developed at Linköping University, Sweden. In the study, published in the journal Nature Energy, the researchers examined molecule shape and interaction in organic solar cells.

“With electrification and the development of AI, we will probably see a significant increase in the world’s energy needs. That electricity needs to come from environmentally sustainable sources if we are to slow down climate change at the same time,” says Feng Gao, professor of optoelectronics at LiU.

One green energy source in the focus of researchers globally is solar cells. As a complement to traditional silicon solar cells, several different alternative variants are being developed. One of the most promising technologies is based on electrically conductive plastics – organic electronics. 

The advantage of organic solar cells is that they are comparatively cheap and easy to manufacture. In addition, they are lightweight and flexible, which means that they could be placed on windows, indoors or on clothes to power personal electronics. Organic solar cells are already on the market today and their share is expected to increase.

The efficiency of organic solar cells is catching up with traditional solar cells; they can now convert about 20 percent of the energy in sunlight into electricity. The high efficiency is the result of several years of intensive materials research and studies of the interaction between the molecules in the material, the so-called morphology.

Organic solar cells are produced from a chemical solution that is coated onto a substrate, after which the solvent evaporates. However, the solution typically contains toxic and environmentally hazardous substances.

“To realise mass production of organic solar cells, with printed technologies for example, on a large scale, we need to find methods that don’t use toxins. Otherwise, it’s not good for the environment or for those working in the factories,” says Feng Gao.

His research team has now, together with colleagues in China and the United States, managed to crack the code for producing efficient organic solar cells with several different environmentally friendly solvents. 

“To choose the right solvent, it’s important to understand the entire solar cell manufacturing process. This includes knowing the initial structures of the solution, observing the dynamic processes during evaporation and checking the final structure of the solar cell film,” says Rui Zhang, researcher at the Department of Physics, Chemistry and Biology at LiU and lead author of the article published in Nature Energy.

What the Linköping researchers have done is map the molecular interaction between the materials transporting the electrons and the solvent itself by using a series of advanced synchrotron X-ray and neutron techniques. Thanks to this, the researchers were then able to develop a design principle that works for many different harmless solvents. In the long run, they hope that even water can act as a solvent.

According to the researchers, understanding the link between morphology and performance in organic solar cells is a major challenge, as they need to investigate the ultra-fast movement of electrons (the charge transport) from the material that releases electrons to the receiving material. Those processes occur within nanoscale structures and at molecular interfaces. According to Feng Gao, the road to environmentally sustainable organic solar cells is now open.

“Thanks to a toxin-free manufacturing method, we now have a much greater chance of commercialising the technology on a larger scale.”

The researchers from Linköping University have developed a new design principle for non-toxic solvents to be used in the manufacturing of organic solar cells.

Credit

Thor Balkhed

 

Research set to transform our understanding of how the ocean breathes



£2.5m project will provide unprecedented detail on ocean mixing and improve climate models



University of Southampton

Autonomous profiling float deployed in the ocean 

Autonomous profiling float deployed in the ocean

Credit: University of Southampton



A new £2.5 million project led by the University of Southampton and the National Oceanography Centre (NOC) is set to transform our understanding of how the ocean ‘breathes’, storing heat and greenhouse gases from the atmosphere. 

Ocean scientists will deploy sensors onboard high-tech floats to provide unprecedented detail on how the ocean breathes through mixing: tiny turbulent movements that pull water, heat, and chemicals from its surface down into the deep.

This ventilation helps to regulate the Earth's climate, buffering against the impacts of human-induced climate change.

Mixing also plays a key role in regulating ocean current systems, such as the Atlantic meridional overturning circulation (AMOC).

“Small-scale mixing plays a crucial role in how the ocean exchanges carbon and heat with the atmosphere and stores it below the surface,” says Dr Bieito Fernandez Castro, a Lecturer in Physical Oceanography at the University of Southampton leading the project.

“Yet, much about this crucial process remains a mystery, so there’s a higher degree of uncertainty in our estimates than we’d like. It happens on such small scales (ranging from centimetres to kilometres) that it has been hard to measure, meaning current ocean and climate models fail to capture the intricate dynamics at work.”

The REMIX-TUNE project has been awarded £2.5 million from the European Research Council to deploy a cutting-edge fleet of autonomous floats in key regions of deep-water formation where much of the heat and carbon sequestration takes place - namely the North Atlantic and Southern Ocean.

Equipped with turbulence sensors and new highly efficient onboard computers, the floats will pass through the water column from the surface down to depths of up to two thousand meters and back up again over several years, capturing detailed local data on how water mixes at both the mesoscale (large eddies) and microscale (tiny, chaotic swirls).

Dr Fernandez Castro says: “These profiling floats have been used since the 2000s to measure the temperature and salinity of the ocean, as well as other properties, to help with forecasting and modelling.

“But they were incapable of observing mixing until now, so it’s exciting to deploy them in significant numbers for this purpose.”

The data captured will generate the first comprehensive, observation-based global database measuring mixing’s role in ocean ventilation.

This detailed new understanding will feed into the next generation of ocean-climate models, improving their ability to simulate the ocean’s role in storing heat and greenhouse gases.

Dr Alex Megann at the National Oceanography Centre, a co-investigator on the project, says: “Combining the new data with existing hydrographic profiles from the global Argo programme, we can reconstruct mixing over the past 25 years over the global ocean to provide much more accurate mixing estimates.

“We'll then use a model called NEMO, which is the ocean component of the UK’s contribution to the IPCC, to use our improved estimates of mixing to give a much clearer picture of how ocean ventilation regulates our climate.”

Ends

Contact

Steve Williams, Media Manager, University of Southampton, press@soton.ac.uk or 023 8059 3212.

Notes for editors

  1. For interviews with Dr Bieito Fernandez Castro please contact Steve Williams, Media Manager, University of Southampton, press@soton.ac.uk or 023 8059 3212.
  2. Images available here: https://safesend.soton.ac.uk/pickup?claimID=bp3yWwN7qJFhhgPg&claimPasscode=xHhbPhpvCEydsV4B
    • IMG-20231002-WA0008 – Autonomous profiling float deployed in the ocean
    • IMG-20231002-WA0013 - Dr Bieito Fernandez Castro stood next to an autonomous profiling float

Additional information

The University of Southampton drives original thinking, turns knowledge into action and impact, and creates solutions to the world’s challenges. We are among the top 100 institutions globally (QS World University Rankings 2025). Our academics are leaders in their fields, forging links with high-profile international businesses and organisations, and inspiring a 22,000-strong community of exceptional students, from over 135 countries worldwide. Through our high-quality education, the University helps students on a journey of discovery to realise their potential and join our global network of over 200,000 alumni. www.southampton.ac.uk

www.southampton.ac.uk/news/contact-press-team.page

Follow us on X: https://twitter.com/UoSMedia

 

Massive asteroid impacts did not change Earth’s climate in the long term




University College London
Microspherules 

Microscope image of silica droplets, or microspherules, found in the rock.

Credit: Natalie Cheng / Bridget Wade




Two massive asteroids hit Earth around 35.65 million years ago, but did not lead to any lasting changes in the Earth’s climate, according to a new study by UCL researchers.

The rocks, both several miles wide, hit Earth about 25,000 years apart, leaving the 60-mile (100km) Popigai crater in Siberia, Russia, and the 25-55 mile (40-85km) Chesapeake Bay crater in the United States, the fourth and fifth largest known asteroid craters on Earth.

The new study, published in the journal Communications Earth & Environment, found no evidence of a lasting shift in climate in the 150,000 years that followed the impacts.

The researchers inferred the past climate by looking at isotopes (atom types) in the fossils of tiny, shelled organisms that lived in the sea or on the seafloor at the time. The pattern of isotopes reflects how warm the waters were when the organisms were alive.
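The link between shell chemistry and water temperature can be made concrete with a standard calcite-water oxygen-isotope calibration. The sketch below uses the widely cited Shackleton (1974) form as an illustration; the study's actual calibration may differ, and the delta values plugged in are hypothetical.

```python
# Illustrative only: a widely used oxygen-isotope palaeotemperature
# calibration (Shackleton 1974 form). Input delta values are hypothetical.

def paleo_temp(d18o_calcite, d18o_water):
    """Water temperature (deg C) from foraminiferal calcite and
    seawater d18O values (per mil)."""
    d = d18o_calcite - d18o_water
    return 16.9 - 4.38 * d + 0.10 * d * d

# A shift toward lighter (more negative) calcite d18O implies warmer water:
t_cool = paleo_temp(1.0, 0.0)   # isotopically heavier shell -> cooler water
t_warm = paleo_temp(0.5, 0.0)   # isotopically lighter shell -> warmer water
print(f"{t_cool:.2f} C vs {t_warm:.2f} C")
```

This is why the researchers "expected the isotopes to shift in one direction or another" after the impacts: a sustained temperature change of even a degree or two would leave a measurable offset in the fossil isotope record.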

Co-author Professor Bridget Wade (UCL Earth Sciences) said: “What is remarkable about our results is that there was no real change following the impacts. We expected the isotopes to shift in one direction or another, indicating warmer or cooler waters, but this did not happen. These large asteroid impacts occurred and, over the long term, our planet seemed to carry on as usual.

“However, our study would not have picked up shorter-term changes over tens or hundreds of years, as the samples were every 11,000 years. Over a human time scale, these asteroid impacts would be a disaster. They would create a massive shockwave and tsunami, there would be widespread fires, and large amounts of dust would be sent into the air, blocking out sunlight.

“Modelling studies of the larger Chicxulub impact, which killed off the dinosaurs, also suggest a shift in climate on a much smaller time scale of less than 25 years.

“So we still need to know what is coming and fund missions to prevent future collisions.”

The research team, including Professor Wade and MSc Geosciences student Natalie Cheng, analysed isotopes in over 1,500 fossils of single-celled organisms called foraminifera, both those that lived close to the surface of the ocean (planktonic foraminifera) and on the seafloor (benthic foraminifera).

These fossils ranged from 35.5 to 35.9 million years old and were found embedded within three metres of a rock core taken from underneath the Gulf of Mexico by the scientific Deep Sea Drilling Project.

The two major asteroids that hit during that time have been estimated to be 3-5 miles (5-8km) and 2-3 miles (3-5km) wide. The larger of the two, which created the Popigai crater, was about as wide as Everest is tall.

In addition to these two impacts, existing evidence suggests three smaller asteroids also hit Earth during this time – the late Eocene epoch – pointing to a disturbance in our solar system’s asteroid belt.

Previous investigations into the climate of the time had been inconclusive, the researchers noted, with some linking the asteroid impacts with accelerated cooling and others with episodes of warmer temperatures.

However, these studies were conducted at lower resolution, looking at samples at greater intervals than 11,000 years, and their analysis was more limited – for instance, only looking at species of benthic foraminifera that lived on the seafloor.

By using fossils that lived at different ocean depths, the new study provides a more complete picture of how the oceans responded to the impact events.

The researchers looked at carbon and oxygen isotopes in multiple species of planktonic and benthic foraminifera.

They found shifts in isotopes about 100,000 years prior to the two asteroid impacts, suggesting a warming of about 2 degrees C in the surface ocean and a 1 degree C cooling in deep water. But no shifts were found around the time of the impacts or afterwards.

Within the rock, the researchers also found evidence of the two major impacts in the form of thousands of tiny droplets of glass, or silica. These form when silica-containing rocks are vaporised by an asteroid impact. The vaporised silica ends up in the atmosphere, then solidifies into droplets as it cools.

Co-author and MSc Geosciences graduate Natalie Cheng said: “Given that the Chicxulub impact likely led to a major extinction event, we were curious to investigate whether what appeared as a series of sizeable asteroid impacts during the Eocene also caused long-lasting climate changes. We were surprised to discover that there were no significant climate responses to these impacts.

“It was fascinating to read Earth's climate history from the chemistry preserved in microfossils. It was especially interesting to work with our selection of foraminifera species and discover beautiful specimens of microspherules along the way.”

The study received funding from the UK’s Natural Environment Research Council (NERC).


Microscope image of silica droplets, or microspherules, found in the rock, this time cropped to be a landscape image and with a plain black background.

Credit

Natalie Cheng / Bridget Wade

 

On the trail of the 2011 mega earthquake




MARUM - Center for Marine Environmental Sciences, University of Bremen
The scientific drilling vessel CHIKYU 

The scientific drilling vessel CHIKYU.

Credit: Photo: JAMSTEC



From the beginning of September until December 20 of this year, a team of international researchers has been underway on the Japanese scientific drilling vessel CHIKYU, using deep-sea drilling to get to the bottom of the causes of the great Tōhoku earthquake of 2011.

It is the second expedition to this region. Just 13 months after the earthquake, during the IODP Expedition 343 “Japan Trench Fast Drilling Project” (JFAST) in 2012, the researchers drilled through the plate boundary. The recovered core showed a striking change at the plate boundary, a subduction zone where the Pacific Plate is subducting beneath the Eurasian Plate. An installed temperature observatory showed a signature of frictional heat from the earthquake.

Twelve years after the first IODP expedition to the Japan Trench, the aim of IODP Expedition 405 is now to determine the properties, processes and conditions within the subduction zone, and how these have changed since JFAST. These conditions promote strong sliding in the trench and can contribute to the formation of large tsunamis. During the expedition, physical data from the boreholes will be recorded, cores will be taken and analyzed on board the CHIKYU, and observatories will be installed.

The researchers are split into two teams, each living and working on the CHIKYU for about two months. Each scientist has a defined role on board in the so-called core flow and in the upcoming investigations and analyses. In the first half of the expedition, Dr. Matt Ikari from MARUM was on board as team leader for the physical properties specialists. His team is concerned with the geophysical measurement of drill core samples. This includes water content, porosity, thermal conductivity, elastic wave speed and shear strength.

MARUM scientist Dr. Junli Zhang is currently on board the CHIKYU on the second leg of the journey. He is part of the geochemistry team, which takes samples, carries out geochemical measurements, analyzes data and contributes to the reports, as do all the researchers on board. 

As is usual with IODP expeditions, the drill cores are analyzed and sampled according to defined standards, and the same applies to the data, which is then made available to all expedition participants and later to the entire scientific community.

In addition, the researchers pursue their own research questions, which they also want to answer using the samples and data obtained. “After the expedition, in collaboration with other science party members and a few external partners, I plan to make geotechnical laboratory measurements on the recovered cores, with an important aspect being the preservation of the condition of the recovered rock/sediment to be as it was in the earth.  That way, we can get a far more accurate view of how faults which produce huge earthquakes actually move,” explains Matt Ikari.

Junli Zhang will focus on pore pressure and fluid flow processes in the subduction zone of the Japan Trench, as well as interactions between water and rock, in his post-cruise research. Speaking about his experience on the CHIKYU so far, he shared, “The expedition has been going very smoothly. When I got on the ship, we were continuing with the drilling operations. In the first few days, I went through a quick handover and received intensive training on core flow, sampling methods, and geochemical measurements, among other things. This was very helpful and allowed me to quickly settle into my duties.”

 

Background information on the expedition:

The expedition is conducted by the Japan Agency for Marine-Earth Science and Technology (JAMSTEC). It involves 56 scientists from ten countries who will work on board the CHIKYU in two expedition phases.

The expedition aims to answer the following scientific questions:

(1) The current state of stress accumulation around the fault zone more than ten years after the earthquake.
(2) The structure of the fault that caused the earthquake, its physical characteristics, and the factors that control the slip behavior.
(3) The effects of fluids on the stress state around the fault zone.


Matt Ikari (physical properties, right) and Hiroki Sone (structural geology) working on whole-round core sampling.

Credit

Photo: JAMSTEC/IODP

More information:

  • Expedition website: https://www.jamstec.go.jp/chikyu/e/exp405/index.html and background: https://youtu.be/noEBFf2x7xU 
  • JAMSTEC press release: https://www.jamstec.go.jp/e/about/press_release/20240829_2/
  • Article on the expedition in “Japan Forward”: https://japan-forward.com/aboard-the-chikyu-searching-for-new-earthquake-clues-deep-in-the-japan-trench/
  • Information on the J-FAST expedition: https://www.jamstec.go.jp/chikyu/e/exp343/expedition.html
  • Press release of the MARUM expedition to the Japan Trench 2012: https://www.marum.de/en/Uncovering-the-Traces-of-the-Great-Tohoku-Earthquake.html

MARUM produces fundamental scientific knowledge about the role of the ocean and the seafloor in the total Earth system. The dynamics of the oceans and the seabed significantly impact the entire Earth system through the interaction of geological, physical, biological and chemical processes. These influence both the climate and the global carbon cycle, resulting in the creation of unique biological systems. MARUM is committed to fundamental and unbiased research in the interests of society, the marine environment, and in accordance with the sustainability goals of the United Nations. It publishes its quality-assured scientific data to make it publicly available. MARUM informs the public about new discoveries in the marine environment and provides practical knowledge through its dialogue with society. MARUM cooperation with companies and industrial partners is carried out in accordance with its goal of protecting the marine environment.