Wednesday, February 15, 2023

Before global warming, was the Earth cooling down or heating up?

Peer-Reviewed Publication

NORTHERN ARIZONA UNIVERSITY

Accurate climate models play a critical role in climate science and policy, helping to inform policy- and decision-makers throughout the world as they consider ways to slow the deadly effects of a warming planet and to adapt to changes already in progress.

To test their accuracy, models are programmed to simulate past climates, and the simulations are compared with the geologic evidence. Sometimes the simulations and the evidence conflict. How can we know which is correct?

A review article published today in Nature addresses this conflict between models and evidence, known as the Holocene global temperature conundrum. Lead author Darrell Kaufman, a Regents’ professor in NAU’s School of Earth and Sustainability, and co-author Ellie Broadman, now a postdoctoral researcher at the University of Arizona who worked on the study while earning her Ph.D. at NAU, analyzed a broad swath of available data from the last 12,000 years to break down the conundrum. The study builds on work by Kaufman that was included in the latest major climate report by the Intergovernmental Panel on Climate Change (IPCC). It asks whether the global average temperature 6,500 years ago was warmer, as indicated by proxy evidence from natural archives of past climate information, or colder, as simulated by models, compared with the late 19th century, when the Industrial Revolution led to a significant increase in human-caused warming.

This comprehensive assessment concludes that the global average temperature about 6,500 years ago was likely warmer and was followed by a multi-millennial cooling trend that ended in the 1800s. But, they cautioned, uncertainty still exists despite recent studies that claimed to have resolved the conundrum.

“Quantifying the average temperature of the earth during the past, when some places were warming while others were cooling, is challenging, and more research is needed to firmly resolve the conundrum,” Kaufman said. “But tracing changes in global average temperature is important because it’s the same metric used to gauge the march of human-caused warming and to identify internationally negotiated targets to limit it. In particular, our review revealed how surprisingly little we know about slow-moving climate variability, including forces now set into motion by humans that will play out as sea level rises and permafrost thaws over coming millennia.”

What we know

We know more about the climate of the Holocene, which began after the last major ice age ended 12,000 years ago, than any other multi-millennial period. There are published studies from a variety of natural archives that store information about historical changes that occurred in the atmosphere, oceans, cryosphere and on land; studies that look at the forces that drove past climate changes, such as Earth’s orbit, solar irradiance, volcanic eruptions and greenhouse gases; and climate model simulations that translate those forces into changing global temperatures. All these types of studies were included in this review.

The challenge up to now has been that the two main lines of evidence point in opposite directions. Paleo-environmental “proxy” data, which include evidence from oceans, lakes and other natural archives, point to a peak global average temperature about 6,500 years ago followed by a global cooling trend until humans started burning fossil fuels. Climate models, in contrast, generally show global average temperatures increasing over the last 6,500 years.
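To make the disagreement concrete, here is a minimal sketch, using entirely made-up numbers rather than real proxy or model data, of the comparison at the heart of the conundrum: whether the global mean temperature trend over the 6,500 years before industrialization is negative (as the proxies suggest) or positive (as the models simulate).

```python
# Illustrative only: synthetic stand-ins for a proxy-based global mean
# temperature history and a model-simulated one (not real data).
import numpy as np

years_before_1850 = np.arange(6500, 0, -100)        # 6,500 years ago up to ~1850 CE
proxy_gmst = 0.4 * (years_before_1850 / 6500.0)     # warm peak 6,500 years ago, then cooling
model_gmst = -0.3 * (years_before_1850 / 6500.0)    # cool start, gradual warming toward 1850

def trend_per_millennium(years_bp, temperature):
    """Least-squares slope of temperature against forward-running time, in degC per 1,000 years."""
    slope = np.polyfit(-years_bp, temperature, 1)[0]  # negate years-before-present so time runs forward
    return slope * 1000.0

print(f"proxy trend: {trend_per_millennium(years_before_1850, proxy_gmst):+.3f} degC/kyr")
print(f"model trend: {trend_per_millennium(years_before_1850, model_gmst):+.3f} degC/kyr")
# The conundrum in one line: the two slopes have opposite signs, even though
# both imply well under 1 degC of total change over the whole interval.
```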

If the proxy data are correct, that points to deficiencies in the models and specifically suggests that climate feedbacks that can amplify global warming are underrepresented. If the climate models are correct, then the tools for reconstructing paleotemperatures need to be sharpened.

We also know that, whether the numbers trend up or down, the change in global average temperature in the past 6,500 years has been gradual—probably less than 1 degree Celsius (1.8 degrees Fahrenheit). This is less than the warming already measured in the last 100 years, most of which humans have caused. However, because global temperature change of any magnitude is significant, especially in response to changing greenhouse gases, knowing whether temperatures were higher or lower 6,500 years ago is important to our knowledge of the climate system and improving forecasts of future climate.

What we don’t know

This study highlighted uncertainties in the climate models. If the authors’ preferred interpretation—that recent global warming was preceded by 6,500 years of global cooling—is correct, then scientists’ understanding of natural climate forcings and feedbacks, and how they are represented in models, needs improvement. If they’re incorrect, then scientists need to improve their understanding of the temperature signal in proxy records and further develop analytical tools to capture these trends on a global scale.

Attempting to resolve the Holocene global temperature conundrum has been a priority for climate scientists in the last decade; Broadman remembers reading the initial paper on this topic when she started her Ph.D. in 2016. All the studies since have added to the understanding of this issue, which gets scientists in the field closer to a comprehensive understanding. Recent studies on this topic have tried adjusting proxy data to account for their presumed weaknesses, inserting plausible forcings into climate models and blending proxy data with climate-model output, all arriving at different conclusions about the cause of the conundrum. This review takes a step back to revisit the issue with a comprehensive global-scale assessment, showing that we don’t yet know the solution to this conundrum.

Developing widely applicable methods of quantifying past temperature is a high priority for climate scientists already. For example, Kaufman’s lab is testing the use of chemical reactions involving amino acids preserved in lake sediment as a new method for studying past temperature changes. Combined with new radiocarbon dating technology from the Arizona Climate and Ecosystem lab at NAU, this technique could help determine whether global warming reversed a long-term cooling trend.

Why it matters

Broadman, whose work includes a focus on science communication, created the figures that accompany the research. This is a critical way of communicating hard-to-understand results to audiences—and in climate science, the audiences are diverse and include educators, policymakers, nonprofits and scientists throughout the world.

“One interesting takeaway is that our findings demonstrate the impact that regional changes can have on global average temperature. Environmental changes in some regions of the Earth, like declining Arctic sea ice or changing vegetation cover in what are now vast deserts, can cause feedbacks that influence the planet as a whole,” Broadman said. “With current global warming, we already see some regions changing very quickly. Our work highlights that some of those regional changes and feedbacks are really important to understand and capture in climate models.”

Additionally, Kaufman said, accurately reconstructing the details of past temperature change offers insights into climate’s response to various causes of both natural and anthropogenic climate change. The responses serve as benchmarks to test how well climate models simulate the Earth’s climate system.

“Climate models are the only source of detailed quantitative climate predictions, so their fidelity is critical for planning the most effective strategies to mitigate and adapt to climate change,” he said. “Our review suggests that climate models are underestimating important climate feedbacks that can amplify global warming.”

New compound that withstands extreme heat and electricity could lead to next-generation energy storage devices

Flexible polymers made with a new generation of the Nobel-winning “click chemistry” reaction find use in capacitors and other applications

Peer-Reviewed Publication

DOE/LAWRENCE BERKELEY NATIONAL LABORATORY

IMAGE: A new type of polysulfate compound can be used to make polymer film capacitors that store and discharge a high density of electrical energy while tolerating heat and electric fields beyond the limits of existing polymer film capacitors.

CREDIT: Yi Liu and He (Henry) Li/Berkeley Lab

– By Rachel Berkowitz

Society’s growing demand for high-voltage electrical technologies – including pulsed power systems, cars and electrified aircraft, and renewable energy applications – requires a new generation of capacitors that store and deliver large amounts of energy under intense thermal and electrical conditions. Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and Scripps Research have now developed a new polymer-based device that efficiently handles record amounts of energy while withstanding extreme temperatures and electric fields. The device is composed of materials synthesized via a next-generation version of the chemical reaction for which three scientists won the 2022 Nobel Prize in Chemistry.

Polymer film capacitors are electrical components that store and release energy within an electric field, using a thin plastic film as the insulating layer. They make up 50% of the global high-voltage capacitor market and offer advantages including light weight, low cost, mechanical flexibility and robust cyclability. But the performance of state-of-the-art polymer film capacitors drops dramatically as temperature and voltage increase. Developing new materials with improved tolerance for heat and electric fields is therefore paramount, and creating polymers with near-perfect chemistry offers a way to do so.

“Our work adds a new class of electrically robust polymers to the table. It opens many possibilities to the exploration of more robust, high performing materials,” said Yi Liu, a chemist at Berkeley Lab and senior author on the Joule study reporting the work. Liu is the Facility Director of Organic and Macromolecular Synthesis at the Molecular Foundry, a DOE Office of Science user facility at Berkeley Lab.

In addition to remaining stable at high temperatures, a capacitor’s insulating layer needs to be a strong “dielectric,” meaning it remains a good insulator when subjected to high voltages. However, few known material systems deliver both thermal stability and dielectric strength. This scarcity is due to a lack of reliable and convenient synthesis methods, as well as a lack of fundamental understanding of the relationship between polymer structure and properties. “Improving the thermal stability of existing films while retaining their electrical insulating strength is an ongoing materials challenge,” said Liu.

A long-term collaboration between researchers at the Molecular Foundry and Scripps Research Institute has now met that challenge. They used a simple and quick chemical reaction developed in 2014 that swaps out fluorine atoms in compounds that contain sulfur-fluoride bonds, to yield long polymer chains of sulfate molecules called polysulfates. This Sulfur-Fluoride Exchange (SuFEx) reaction is a next-generation version of the click chemistry reaction pioneered by K. Barry Sharpless, a chemist at Scripps Research and two-time Nobel laureate in Chemistry, along with Peng Wu, also a chemist at Scripps Research. The near-perfect yet easy-to-run reactions join separate molecular entities through strong chemical bonds that form between different reactive groups. Liu’s team had originally used a variety of thermal analysis tools to examine the basic thermal and mechanical properties of these new materials.

As part of a Berkeley Lab program to synthesize and identify novel materials that could be useful in energy storage, Liu and his colleagues now find that, surprisingly, the polysulfates have outstanding dielectric properties, especially at high electric fields and temperatures. “Several commercial and lab-generated polymers are known for their dielectric properties, but polysulfates had never been considered. The marriage between polysulfates and dielectrics is one of the novelties here,” said He Li, a postdoctoral researcher in the Molecular Foundry and in Berkeley Lab’s Materials Sciences Division, and lead author of the study. 

Inspired by the excellent baseline dielectric properties offered by polysulfates, the researchers deposited extremely thin layers of aluminum oxide (Al2O3) onto thin films of the material to engineer capacitor devices with enhanced energy storage performance. They discovered that the fabricated capacitors exhibited excellent mechanical flexibility, withstood electric fields of more than 750 million volts per meter, and performed efficiently at temperatures up to 150 degrees Celsius. In comparison, today’s benchmark commercial polymer capacitors function reliably only at temperatures below 120 degrees Celsius; above that, they can withstand electric fields of no more than 500 million volts per meter, and their energy efficiency drops by more than half.
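For context, and not as a figure taken from the Joule paper, the maximum energy a film capacitor can store per unit volume scales with the square of the electric field its dielectric can withstand, which is why the higher breakdown field matters so much. In the back-of-the-envelope estimate below, the relative permittivity is an assumed, typical value for such polymers, not a measured one.

```latex
U = \tfrac{1}{2}\,\varepsilon_0\,\varepsilon_r\,E^{2}
  \approx \tfrac{1}{2}\,(8.85\times10^{-12}\,\mathrm{F/m})\,(3.5)\,(7.5\times10^{8}\,\mathrm{V/m})^{2}
  \approx 9\ \mathrm{J/cm^{3}}
```

The relation assumes a linear, loss-free dielectric; real polymer films at such fields are neither, so this is an upper-bound illustration rather than a device rating.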

The work opens new possibilities for exploring robust, high performing materials for energy storage. “We have provided deep insight into the underlying mechanisms that contribute to the material’s excellent performance,” said Wu.

The polymer strikes a balance of electrical, thermal, and mechanical properties, likely due to the sulfate linkages introduced by the click chemistry reaction. Because modular chemistry accommodates extraordinary structural diversity and scalability, the same route could offer a viable path to new polymers with higher performance that meet even more demanding operational conditions.

The polysulfates are strong contenders to become new state-of-the-art polymer dielectrics. Once researchers overcome barriers in large-scale manufacturing processes for thin film materials, the devices could greatly improve the energy efficiency of integrated power systems in electric vehicles and enhance their operational reliability. 

“Who could have imagined that a wispy sulfate polymer film could fend off lightning and fire, two of the most destructive forces in the universe?!” said Sharpless. 

“We’re continuously pushing the envelope of thermal and electrical properties, and accelerating the lab-to-market transition,” Liu added.

The technology is now available for licensing by contacting ipo@lbl.gov.

The work received funding from the Department of Energy’s Office of Science, the National Science Foundation, and the National Institutes of Health. The work was carried out at the Molecular Foundry.

Polysulfates with excellent thermal properties are cast into flexible, free-standing films. High-temperature, high-voltage capacitors based on such films show state-of-the-art energy storage properties at 150 °C. Such power capacitors are promising for improving the energy efficiency and reliability of integrated power systems in demanding applications such as electrified transportation.

CREDIT

Yi Liu and He (Henry) Li/Berkeley Lab

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab’s facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

UCalgary researchers develop new imaging technique for clearer picture of “brain in the gut”


New view of gut’s nervous system will lead to better understanding of gastrointestinal disorders

Peer-Reviewed Publication

UNIVERSITY OF CALGARY


University of Calgary researchers designed a novel imaging and experimental preparation system, allowing them to record the activity of the enteric nervous system in mice. The new technique allows researchers to record what is sometimes referred to as the gut’s brain during the complex processes of digestion and waste elimination.

“This completely different way of conducting experiments allows us to better understand the complexity of the nerve interactions that are regulating and coordinating the responses by the gut’s nervous system,” says Dr. Wallace MacNaughton, PhD, co-principal investigator. “It opens up new avenues for us to understand what’s really going on, and that’s going to help us understand gastrointestinal diseases and disorders a lot better.”

Neurons, or nerve cells, embedded in the wall of the gut precisely control its movements. The team used mice carrying genetically encoded fluorescent labels, so that neurons in the gut’s nervous system would “light up,” glowing green under the microscope whenever they were activated. The images are already providing new insights.

 “This wave of excitation around the circumference of the gut, and the change in neuronal excitability, have never been seen before,” says Dr. Keith Sharkey, PhD, co-principal investigator. “When the gut is distended, the nerve circuits respond in ways that are totally different than when the gut is relaxed.”

The team’s study is the first to show, in an intact gut preparation, the role of the gut’s physical distention in controlling how the entire neural network in the gut is coordinated. The findings, published in the Journal of Physiology, include instructions on how to replicate the technique, which Sharkey describes as marrying technology with biology.

“We wanted all researchers to have access to this approach,” says Sharkey. “Gaining a better understanding of the physiology of the gut is fundamental to being able to understand what happens when it doesn’t work properly, and to developing effective treatments.”

The populations of neurons, the neural architecture and the way the gut is arranged are virtually identical in the mouse gut and the human gut. This makes it highly likely that similar processes occur in the human gut, the researchers say.

Gastrointestinal (GI) disorders, such as irritable bowel syndrome and Crohn’s disease, impact 10 to 20 per cent of North America’s population and cost billions of dollars in health care. Yet because GI disorders are poorly understood, current treatments work for only a fraction of patients, may lose their effectiveness over time, or cause serious side effects.

Sharkey and MacNaughton now plan to investigate how probiotics, inflammation and bacterial infection alter the control and coordination of the gut’s nervous system in mice.

“This is giving us a model that may help us test new approaches to treating gastrointestinal diseases in people at some point in the future,” MacNaughton says.

The co-first authors on the study, Dr. Jean-Baptiste Cavin, PhD, and Dr. Preedajit Wongkrasant, PhD, were fundamental in designing the experimental system and refining the techniques.

 

1st observational evidence linking black holes to dark energy

Peer-Reviewed Publication

UNIVERSITY OF HAWAII AT MANOA

Supermassive black hole

IMAGE: Artist’s impression of a supermassive black hole. Cosmological coupling allows black holes to grow in mass without consuming gas or stars.

CREDIT: UH Mānoa

Searching through existing data spanning 9 billion years, a team of researchers led by scientists at the University of Hawaiʻi at Mānoa has uncovered the first evidence of "cosmological coupling," a newly predicted phenomenon in Einstein's theory of gravity, possible only when black holes are placed inside an evolving universe.

Astrophysicists Duncan Farrah and Kevin Croker led this ambitious study, combining Hawaiʻi's expertise in galaxy evolution and gravity theory with the observation and analysis experience of researchers across nine countries to provide the first insight into what might exist inside real black holes.

"When LIGO heard the first pair of black holes merge in late 2015, everything changed," said Croker. "The signal was in excellent agreement with predictions on paper, but extending those predictions to millions, or billions of years?  Matching that model of black holes to our expanding universe? It wasn't at all clear how to do that."

The team has recently published two papers, one in The Astrophysical Journal and the other in The Astrophysical Journal Letters, that studied supermassive black holes at the hearts of ancient and dormant galaxies.

The first paper found that these black holes gain mass over billions of years in a way that can't easily be explained by standard galaxy and black hole processes, such as mergers or accretion of gas.

The second paper finds that the growth in mass of these black holes matches predictions for black holes that not only cosmologically couple, but also enclose vacuum energy—material that results from squeezing matter as much as possible without breaking Einstein's equations, thus avoiding a singularity.

With singularities absent, the paper then shows that the combined vacuum energy of black holes produced in the deaths of the universe's first stars agrees with the measured quantity of dark energy in our universe.

“We're really saying two things at once: that there's evidence the typical black hole solutions don't work for you on a long, long timescale, and we have the first proposed astrophysical source for dark energy,'' said Farrah, lead author of both papers.

“What that means, though, is not that other people haven't proposed sources for dark energy, but this is the first observational paper where we're not adding anything new to the universe as a source for dark energy: black holes in Einstein's theory of gravity are the dark energy.''

These new measurements, if supported by further evidence, will redefine our understanding of what a black hole is.

Nine billion years ago

In the first study, the team determined how to use existing measurements of black holes to search for cosmological coupling.

"My interest in this project was really born from a general interest in trying to determine observational evidence that supports a model for black holes that works regardless of how long you look at them," Farrah said. "That's a very, very difficult thing to do in general, because black holes are incredibly small, they're incredibly difficult to observe directly, and they are a long, long way away." 

Black holes are also hard to observe over long timescales. Observations can be made over a few seconds, or tens of years at most—not enough time to detect how a black hole might change throughout the lifetime of the universe. To see how black holes change over a scale of billions of years is a bigger task. 

"You would have to identify a population of black holes and identify their distribution of mass billions of years ago. Then you would have to see the same population, or an ancestrally connected population, at present day and again be able to measure their mass," said co-author Gregory Tarlé, a physicist at University of Michigan. "That's a really difficult thing to do." 

Because galaxies can have life spans of billions of years, and most galaxies contain a supermassive black hole, the team realized that galaxies held the key, but choosing the right types of galaxy was essential. 

"There were many different behaviors for black holes in galaxies measured in the literature, and there wasn't really any consensus," said study co-author Sara Petty, a galaxy expert at NorthWest Research Associates. "We decided that by focusing only on black holes in passively evolving elliptical galaxies, we could help to sort this thing out." 

Elliptical galaxies formed early and are fossils of galaxy assembly. Astronomers believe them to be the final result of galaxy collisions: enormous galaxies containing upwards of trillions of old stars.

By looking at only elliptical galaxies with no recent activity, the team could argue that any changes in the galaxies' black hole masses couldn't easily be caused by other known processes. Using these populations, the team then examined how the mass of their central black holes changed throughout the past 9 billion years. 

If mass growth of black holes occurred only through accretion or mergers, then the masses of these black holes would not be expected to change much at all. However, if black holes gain mass by coupling to the expanding universe, then these passively evolving elliptical galaxies might reveal this phenomenon.

The researchers found that the further back in time they looked, the smaller the black holes were in mass, relative to their masses today. These changes were big: The black holes were anywhere from 7 to 20 times larger today than they were 9 billion years ago—big enough that the researchers suspected cosmological coupling could be the culprit. 

Unlocking black holes 

In the second study, the team investigated whether the growth in black holes measured in the first study could be explained by cosmological coupling alone. 

"Here's a toy analogy. You can think of a coupled black hole like a rubber band, being stretched along with the universe as it expands," said Croker. "As it stretches, its energy increases. Einstein's E = mc2 tells you that mass and energy are proportional, so the black hole mass increases, too." 

How much the mass increases depends on the coupling strength, a variable the researchers call k.

"The stiffer the rubber band, the harder it is to stretch, so the more energy when stretched. In a nutshell, that's k," Croker said. 

Because mass growth of black holes from cosmological coupling depends on the size of the universe, and the universe was smaller in the past, the black holes in the first study must be less massive by the correct amount in order for the cosmological coupling explanation to work. 

The team examined five different black hole populations in three different collections of elliptical galaxies, taken from when the universe was roughly one half and one third of its present size. In each comparison, they measured that k was nearly positive 3. 
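As a rough illustration of the scaling the researchers describe (the arithmetic below is ours, with k treated as exactly 3, not a calculation from the papers), a cosmologically coupled black hole’s mass grows with the cosmic scale factor a as:

```latex
m(a) = m_0\left(\frac{a}{a_0}\right)^{k},\qquad k = 3
\;\Rightarrow\;
\frac{m_{\text{now}}}{m_{\text{then}}} = 2^{3} = 8 \ \text{(universe half its present size)},\quad
3^{3} = 27 \ \text{(one third)}
```

Growth factors of that order are broadly consistent with the 7- to 20-fold increase over 9 billion years inferred from the elliptical-galaxy samples.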

The first observational link

In 2019, Croker, then a graduate student, and Joel Weiner, a UH Mānoa mathematics professor, predicted this value for black holes that contain vacuum energy instead of a singularity.

The conclusion is profound: Croker and Weiner had already shown that if k is 3, then all black holes in the universe collectively contribute a nearly constant dark energy density, just like measurements of dark energy suggest. 
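The reasoning behind that claim, in our own shorthand rather than the papers’ derivation: as the universe expands, the number density of black holes dilutes with volume, while each coupled black hole’s mass grows at exactly the compensating rate, so their combined energy density stays constant, just as a cosmological-constant-like dark energy requires.

```latex
\rho_{\mathrm{BH}}(a) \;\propto\; n(a)\,m(a) \;\propto\; a^{-3}\cdot a^{+3} \;=\; \mathrm{constant}
```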

Black holes form from the deaths of large stars, so if you know how many large stars you are making, you can estimate how many black holes you are making and how much they grow as a result of cosmological coupling. The team used the latest measurements of the earliest rates of star formation provided by the James Webb Space Telescope and found that the numbers line up.

According to the researchers, their studies provide a framework for theoretical physicists and astronomers to test further, and for the current generation of dark energy experiments, such as the Dark Energy Spectroscopic Instrument and the Dark Energy Survey, to shed light on the idea.

"If confirmed this would be a remarkable result, pointing the way towards the next generation of black hole solutions," said Farrah.

Croker added, "This measurement, explaining why the universe is accelerating now, gives a beautiful glimpse into the real strength of Einstein's gravity. A chorus of tiny voices spread throughout the universe can work together to steer the entire cosmos. How cool is that?"

  

Researchers studied elliptical galaxies like Messier 59 to determine if the mass of their central black holes changed throughout the past 9 billion years. The smooth distribution of light is billions of stars.

CREDIT

ESA/Hubble & NASA, P. Cote


Caldwell 53 (NGC 3115) is most notable for the supermassive black hole that can be found at its center.

CREDIT

NASA, ESA, and J. Erwin (University of Alabama)

Measurement of coupling strength k by comparing black hole masses in 5 different collections of ancient elliptical galaxies to the black holes in elliptical galaxies today. Measurements cluster around k = 3, implying that black holes contain vacuum energy, instead of a singularity.

CREDIT

Farrah et al. 2023 (The Astrophysical Journal Letters)

 

Amazon mammals threatened by climate change

Study reveals impacts of savannization on Brazilian Amazon land animals

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - DAVIS

Jaguars in Amazon

IMAGE: Two jaguars, caught with a camera trap survey, walk through the Brazilian Amazon rainforest.

CREDIT: Daniel Rocha/UC Davis

From jaguars and ocelots to anteaters and capybara, most land-based mammals living in the Brazilian Amazon are threatened by climate change and the projected savannization of the region. That’s according to a study published in the journal Animal Conservation by the University of California, Davis.

The study found that even animals that use both forest and savanna habitats, such as pumas and giant armadillos, are vulnerable to such changes. It also illustrates how species and lands protected through local conservation efforts are not immune to global climate change.

“We’re losing Amazon forest as we speak,” said lead author Daniel Rocha, who conducted the research as a doctoral student in the UC Davis Department of Wildlife, Fish and Conservation Biology. “The Amazon’s biodiversity is very susceptible to climate change effects. It’s not just local; it’s a global phenomenon. We cannot stop this just by law enforcement, for example. These species are more susceptible than we realized, and even protected areas can’t protect them as much as we thought.”

What is ‘savannization’?

Pristine savanna is a unique biome that supports a diverse array of life. But “savannization” here refers to when lush rainforest gives way to a drier, open landscape that resembles savanna but is actually degraded forest. Local deforestation and global climate changes in temperature and precipitation favor this conversion along the southern and eastern edges of the Brazilian Amazon.

Arboreal species like monkeys clearly will be impacted by such changes. But the study’s authors wanted to better understand how land-based mammals are expected to fare, especially those that use both forest and savanna habitats when they have access to both.

Caught on camera

For the study, the researchers conducted camera trap surveys of land-based mammals in four protected areas of the southern Brazilian Amazon, which is a mixture of rainforest and natural Cerrado, or savanna. Using statistical models, they quantified how 31 species were affected by savanna habitat. They then looked for differences among species known to use mostly rainforest, savanna, or both habitats. 
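As a purely generic illustration of this kind of analysis, and not the occupancy modeling actually used in the Animal Conservation paper, a single species’ camera-trap detections can be related to the amount of savanna around each site with a simple logistic regression; all numbers below are invented.

```python
# Illustrative only: a toy single-species detection model with simulated data,
# not the statistical framework used in the published study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_sites = 120
savanna_fraction = rng.uniform(0.0, 1.0, n_sites)          # habitat mix around each camera (made up)
# Hypothetical forest-preferring species: detection probability falls as savanna cover rises.
p_detect = 1.0 / (1.0 + np.exp(-(1.0 - 3.0 * savanna_fraction)))
detected = rng.binomial(1, p_detect)                        # 1 = species photographed at the site

model = LogisticRegression().fit(savanna_fraction.reshape(-1, 1), detected)
print(f"savanna coefficient: {model.coef_[0][0]:+.2f}")     # negative => the species avoids savanna
```

A negative coefficient is the toy analogue of the paper’s finding that most species respond negatively to savanna habitat.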

The results showed that only a few species preferred savanna habitat. Rocha notes that the models were based on pristine — not degraded — savanna, so the negative effects of savannization among animals will likely be even stronger.

Riparian forests, which line the wet edges of rivers and streams, helped buffer the effects of savannization to some extent.

Winners and losers

“Unfortunately, there are more losers than winners,” said Rocha, who is currently an assistant professor at Southern Nazarene University in Oklahoma. “Most Amazon species, when they can choose between good forests and good savanna, they choose the forest. That’s true even for species considered ‘generalists,’ which use both habitats.  As we lose forests, they suffer, too.”

The results indicate that if climate-driven savannization causes species to lose access to their preferred habitat, it will reduce the ability of even protected areas to safeguard wildlife. The authors say that should be considered when assessing the potential climate-change effects on these species.

The study is co-authored by Rahel Sollmann, Rocha’s former advisor at UC Davis who is now at the Leibniz Institute for Zoo and Wildlife Research in Berlin, Germany.

The study was funded by CAPES, the National Geographic Society, Horodas Family Foundation for Conservation Research, The Explorers Club, Alongside Wildlife Foundation, and the Hellman Foundation. This study received logistical support from ICMBio.

Scientists examine jaguar tracks on a road in the Brazilian Amazon.

CREDIT

Fernanda Cavalcante/PCMC Brasil