It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K. Marx, Letter to F. Engels on the Indian Mutiny)
Friday, September 02, 2022
High plant diversity is often found in the smallest of areas
It might sound weird, but it's true: the steppes of Eastern Europe are home to a similar number of plant species as the regions of the Amazon rainforest. However, this is only apparent when species are counted in small sampling areas, rather than hectares of land. An international team of researchers led by the Martin Luther University Halle-Wittenberg (MLU) and the German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig has now shown how much estimates of plant diversity change when the sampling area ranges from a few square metres to hectares. Their results have been published in the journal Nature Communications and could be utilised in new, more tailored nature conservation concepts.
In their study, the team analysed a dataset of around 170,000 vegetation plots from all of the Earth’s climate zones. The data included information on all of the plant species found at a location and the coordinates of the respective area under study. The data was taken from the globally unique vegetation database "sPlot", which is located at iDiv.
"Most studies on global biodiversity are conducted on a relatively large scale, for example at a state or provincial scale. We wanted to find out how much results differ when smaller areas are examined," says Professor Helge Bruelheide from MLU. The team used artificial intelligence to investigate, among other things, the relationship between the number of plant species and the size of the area under study.
Their investigation showed that there are regions on Earth where focusing on large study areas provides only a limited understanding of the distribution of biodiversity: sometimes small areas can have a relatively high biodiversity, for example in the steppes of Eastern Europe, in Siberia and in the Alpine countries of Europe. At fine spatial scales, the large difference in biodiversity between the tropics, like the Amazon, and the temperate climate zones nearly disappears.
The same applies to the African tropics, which were previously considered an exception in the tropical plant world. "The tropics have always been among the most biodiverse areas in the world. We wondered why this shouldn’t also apply to Western Africa," explains Dr Francesco Maria Sabatini, who led the study at MLU and is now an assistant professor at the University of Bologna. In fact, the distribution of plant species varies greatly in the African tropics, says Sabatini. These species are distributed over very large distances, so they are not always recorded when a small sampling area is examined. "To correctly recognize the high biodiversity in Western Africa, many small sampling areas are required," adds Sabatini.
The study also shows that for other very biodiverse areas, such as the Cerrado savanna region in Brazil or regions in Southeast Asia, the spatial scale of the survey makes little difference. These results are also important when it comes to protecting species. "Ecosystems whose high biodiversity is spread out over a large area cannot be protected through the traditional patchwork of nature reserves. In contrast, ecosystems that have a high biodiversity within a small area could benefit well from several distinct protected zones," concludes Bruelheide.
The study was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation).
Study: Sabatini F. M. et al. Global patterns of vascular plant alpha diversity. Nature Communications (2022). doi: 10.1038/s41467-022-32063-z
Ferns are notorious for containing massive amounts of DNA and an excessively large number of chromosomes. Defying all expectations, a fern no larger than a dinner plate currently holds the title for highest chromosome count, with a whopping 720 pairs crammed into each of its nuclei. This penchant of ferns for hoarding DNA has stumped scientists, and the intractable size of their genomes has made it difficult to sequence, assemble and interpret them.
Now, two papers published in the journal Nature Plants are rewriting history with the first full-length genomes for homosporous ferns, a large group that contains 99% of all modern fern diversity.
“Every genome tells a different story,” said co-author Doug Soltis, a distinguished professor with the Florida Museum of Natural History. "Ferns are the closest living relatives of all seed plants, and they produce chemical deterrents to herbivores that may be useful for agricultural research. Yet until now, they've remained the last major lineage of green life without a genome sequence.”
Two teams of researchers separately unveiled the genome of Ceratopteris (Ceratopteris richardii) this Thursday and that of the flying spider monkey tree fern (Alsophila spinulosa) last month.
Analysis of the Ceratopteris genome provides hints for solving the long-standing mystery of why ferns, on average, retain more DNA than other plants. Comparisons to genomes from other groups also led to the surprise discovery that ferns stole the genes for several of their anti-herbivory toxins from bacteria.
The Ceratopteris genome bucks a decades-old theory, leaving more questions than answers
Since the 1960s, the most favored explanation for why ferns contain so much DNA invoked rampant whole-genome duplications, in which an extra set of chromosomes is accidentally passed on to an organism’s offspring. This can sometimes be beneficial, as all the extra genes can then be used as raw material for the evolution of new traits. In fact, whole-genome duplication has been implicated in the origin of nearly all crop plants.
Whole-genome duplication is common in plants and even some animals, but most organisms tend to jettison the extra genetic baggage over time, slimming back down to smaller genomes that are metabolically easier to maintain.
“This has been a major point of discussion for the last half-century and has led to all kinds of conflicting results,” said lead author Blaine Marchant, a postdoctoral scholar at Stanford University and former Florida Museum graduate student. “Trying to figure out the evolutionary process underlying this paradox is incredibly important.”
With the first fully assembled homosporous fern genomes, scientists were finally prepared to address this question, but getting there wasn’t easy. Sequencing the large, complex genome of Ceratopteris took over eight years of work and the combined effort of dozens of researchers from 28 institutions around the world, including the U.S. Department of Energy Joint Genome Institute. The final result was 7.46 gigabases of DNA, more than double the size of the human genome.
If Ceratopteris had bulked up on DNA through repeated genome duplication events, researchers expected large portions of its 39 chromosome pairs would be identical. What they found instead was a mixed bag of repetitive sequences and millions of short snippets called jumping genes, which accounted for 85% of the fern’s DNA. Rather than multiple genome copies, Ceratopteris mostly contains genetic debris accumulated over millions of years.
“The functional genes are separated by large amounts of repetitive DNA. And although we’re not yet sure how the Ceratopteris and other fern genomes got so big, it’s clear that the prevailing view of repeated episodes of genome duplication is not supported,” said co-author Pam Soltis, a Florida Museum curator and distinguished professor.
The authors note that it’s too early to make any firm conclusions, especially since this is the first analysis of its scope conducted in this group. Cross comparisons with additional fern genomes down the road will help paint a clearer picture of how these plants evolved.
Still, the results point to a clear difference in the way homosporous ferns manage their genetic content compared to almost all other plants, Marchant said.
“What we seem to be finding is that things like flowering plants, which on average have much smaller genomes than ferns, are just better at getting rid of junk DNA. They’re better at dropping spare chromosomes and even downsizing after small duplications.”
Ferns repeatedly stole toxins from bacteria
A closer look at the billions of DNA base pairs within Ceratopteris revealed multiple defense genes that code for a particularly sinister type of pore-forming toxin. These toxins bind to cells, where they become activated and form small, hollow rings that punch their way into the cellular membrane. Water floods into the cells through the resulting holes, causing them to rupture.
Pore-forming toxins have been intensively studied by scientists for their potential use in nanopore technology, Marchant explained. Most often, however, they’re found in bacteria.
“This is the first concrete evidence of these bacterial toxin-related genes within fern DNA,” Marchant said, noting that the similarity isn’t a coincidence.
Rather than evolving this toxin on its own, Ceratopteris appears to have obtained it directly from bacteria through a process called horizontal gene transfer. And given that there were multiple copies of the gene spread out among three separate chromosomes, it’s likely this happened more than once.
“What’s fascinating is that the many copies of these genes show up in different parts of the plant,” he said. “Some are highly expressed in the stem and roots, while other copies are expressed solely in the leaves, and others are generally expressed across all tissues. We cannot be sure of the exact function of these genes at this point, but their similarity to the toxin-forming genes in bacteria certainly suggests these genes are defense-related.”
This wouldn’t be the first time ferns have incorporated foreign DNA into their genomes. A 2014 study indicates ferns may have evolved their characteristic ability to grow in shady environments by borrowing genes from distantly related plants.
However, exactly how organisms separated by millions of years of evolution are able to swap fully functional genes remains unclear.
“The mechanisms behind horizontal gene transfer remain one of the least investigated areas of land plant evolution,” Doug Soltis explained. “Over evolutionary timescales, it’s a bit like winning the lottery. Any time a plant is wounded, its interior is susceptible to invasion from microbes, but for their DNA to be incorporated into the genome seems amazing.”
The authors say this is merely the first step in a long series of studies with practical applications ranging from the development of novel biopesticides to innovative new conservation strategies.
Several of the authors are involved in the current effort to sequence the genomes of all known eukaryotic organisms within a 10-year time frame. Called the Earth Biogenome Project, the endeavor will generate untold genomic resources that researchers will have their hands full analyzing for the foreseeable future.
A Policy Forum article published today in Science calls for a new approach to regulating genetically engineered (GE) crops, arguing that current approaches for triggering safety testing vary dramatically among countries and generally lack scientific merit – particularly as advances in crop breeding have blurred the lines between conventional breeding and genetic engineering.
Rather than focusing on the methods and processes behind the creation of a GE crop to determine if testing is needed, a more effective framework would examine the specific new characteristics of the crop itself by using so-called “-omics” approaches, the article asserts. In the same way that biomedical sciences can use genomic approaches to scan human genomes for problematic mutations, genomics can be used to scan new crop varieties for unexpected DNA changes.
Additional “-omics” methods such as transcriptomics, proteomics, epigenomics and metabolomics test for other changes to the molecular composition of plants. These measurements of thousands of molecular traits can be used like a fingerprint to determine whether the product from a new variety is “substantially equivalent” to products already being produced by existing varieties – whether, for example, a new peach variety has molecular characteristics that are already found in one or more existing commercial peach varieties.
If the new product has either no differences or understood differences with no expected health or environmental effects when compared with products of existing varieties, no safety testing would be recommended, the article suggests. If, however, the product has new characteristics that have the potential for health or environmental effects, or if the product has differences that cannot be interpreted, safety testing would be recommended.
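As a rough illustration of that decision rule, here is a minimal Python sketch of a "substantial equivalence" screen. Everything in it is hypothetical: the trait names, values, tolerance and function are invented for illustration, and real "-omics" comparisons involve thousands of molecular traits and formal statistical testing.

```python
# Hypothetical sketch of the screening logic described above: flag a new
# variety for safety testing if any molecular trait falls far outside the
# range spanned by existing commercial varieties.

def screen_new_variety(new_profile, reference_profiles, tolerance=3.0):
    flagged = []
    for trait, value in new_profile.items():
        ref_values = [p[trait] for p in reference_profiles if trait in p]
        if not ref_values:
            flagged.append(trait)            # trait unseen in any reference
            continue
        lo, hi = min(ref_values), max(ref_values)
        spread = (hi - lo) or 1.0
        if value < lo - tolerance * spread or value > hi + tolerance * spread:
            flagged.append(trait)            # far outside the familiar range
    return flagged  # empty list => "substantially equivalent"

# Invented metabolite levels: a new peach variety vs. two existing ones
new = {"sorbitol": 5.1, "amygdalin": 0.9}
refs = [{"sorbitol": 4.8, "amygdalin": 0.7}, {"sorbitol": 5.5, "amygdalin": 1.1}]
print(screen_new_variety(new, refs))  # [] -> no safety testing recommended
```

An empty result corresponds to the first case above (no testing recommended); any flagged or uninterpretable trait corresponds to the second.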
“The approaches used right now – which differ among governments – lack scientific rigor,” said Fred Gould, University Distinguished Professor at North Carolina State University, co-director of NC State’s Genetic Engineering and Society Center and the corresponding author of the article. “The size of the change made to a product and the origin of the DNA have little relationship with the results of that change; changing one base pair of DNA in a crop with 2.5 billion base pairs, like corn, can make a substantial difference.”
When dealing with varieties made using the powerful gene editing system known as CRISPR, for example, the European Union regulates all varieties while other governments base decisions on the size of the genetic change and the source of inserted genetic material. Meanwhile, in 2020 the U.S. Department of Agriculture established a rule that exempts from regulation conventionally bred crop varieties and GE crop varieties that could have been developed by methods other than genetic engineering.
The “-omics” approaches, if used appropriately, would not increase the cost of regulation, Gould said, adding that most new varieties would not trigger a need for regulation.
“The most important question is, ‘Does the new variety have unfamiliar characteristics?’” Gould said. The paper estimates that technological advances could bring the laboratory cost of a set of “-omics” tests down to about $5,000 within five to 10 years.
Establishing an international committee of crop breeders, chemists and molecular biologists to lay out the options and costs of “-omics” approaches for a variety of crops would start the process of developing this new regulatory framework. Workshops with these experts, as well as sociologists, policymakers, regulators and representatives of the general public, would enable trustworthy deliberations that could avoid some of the problems encountered when GE crops were rolled out in the 1990s. National and international governing bodies should sponsor these committees and workshops, as well as innovative research, to get the ball rolling and ensure that assessments are accessible and accurate, Gould said.
In 2016, Gould headed a 20-member National Academy of Sciences committee responsible for a report, Genetically Engineered Crops: Experiences and Prospects, which aimed to “assess the evidence for purported negative effects of GE crops and their accompanying technologies” and to “assess the evidence for purported benefits of GE crops and their accompanying technologies.” Most of that committee co-authored the policy article published this week.
Some of the ideas for this article emerged from discussions during deliberations of the US National Academies of Sciences, Engineering, and Medicine “Committee on Genetically Engineered Crops: Past Experiences and Future Prospects”. K.G. is a member of the Sustainable Sourcing Advisory Board of Unilever and of the Scientific Advisory Committee of the African Plant Nutrition Institute. R.D. holds patents in the field of metabolic engineering for improvement of forage crops; received funding for research on the above topic from GrasslaNZ Technologies, New Zealand, and Forage Genetics International, USA; was Chief Scientist, Beijing Advanced Innovation Center for Tree Breeding by Molecular Design, Beijing, China (2017-2021); is Chair of the Academic Advisory Committee for the Agricultural Biotechnology Research Center, Academia Sinica, Taipei, Taiwan; and is a member of the Departmental Evaluation Board, Flanders Institute of Biotechnology (VIB), Department of Plant Systems Biology, Gent, Belgium. C.N.S.Jr. holds patents in plant biotechnology.
A sustainable battery with a biodegradable electrolyte made from crab shells
Accelerating demand for renewable energy and electric vehicles is sparking a high demand for the batteries that store generated energy and power engines. But the batteries behind these sustainability solutions aren’t always sustainable themselves. In a paper publishing September 1 in the journal Matter, scientists create a zinc battery with a biodegradable electrolyte from an unexpected source—crab shells.
“Vast quantities of batteries are being produced and consumed, raising the possibility of environmental problems,” says lead author Liangbing Hu, director of the University of Maryland’s Center for Materials Innovation. “For example, polypropylene and polycarbonate separators, which are widely used in lithium-ion batteries, take hundreds or thousands of years to degrade and add to environmental burden.”
Batteries use an electrolyte to shuttle ions back and forth between positively and negatively charged terminals. An electrolyte can be a liquid, paste, or gel, and many batteries use flammable or corrosive chemicals for this function. This new battery, which could store power from large-scale wind and solar sources, uses a gel electrolyte made from a biological material called chitosan.
“Chitosan is a derivative product of chitin. Chitin has a lot of sources, including the cell walls of fungi, the exoskeletons of crustaceans, and squid pens,” says Hu. “The most abundant source of chitosan is the exoskeletons of crustaceans, including crabs, shrimps and lobsters, which can be easily obtained from seafood waste. You can find it on your table.”
A biodegradable electrolyte means that about two thirds of the battery could be broken down by microbes—this chitosan electrolyte broke down completely within five months. This leaves behind the metal component, in this case zinc, rather than lead or lithium, which could be recycled.
“Zinc is more abundant in earth’s crust than lithium,” says Hu. “Generally speaking, well-developed zinc batteries are cheaper and safer.” This zinc and chitosan battery has an energy efficiency of 99.7% after 1000 battery cycles, making it a viable option for storing energy generated by wind and solar for transfer to power grids.
Hu and his team hope to continue working on making batteries even more environmentally friendly, including the manufacturing process. “In the future, I hope all components in batteries are biodegradable,” says Hu. “Not only the material itself but also the fabrication process of biomaterials.”
This work was supported by the Research Corporation for Science Advancement, Facebook Reality Labs Research, the University of Maryland A. James Clark School of Engineering and Maryland Nanocenter, and AIMLab.
Study: A sustainable chitosan-zinc electrolyte for high-rate zinc metal batteries. Matter (2022). Published 1 September 2022.
Social cost of carbon more than triple the current federal estimate, new study finds
A multi-year study of the social cost of carbon, a critical input for climate policy analysis, finds that every additional ton of carbon dioxide emitted into the atmosphere costs society $185—far higher than the current federal estimate of $51 per ton.
After years of robust modeling and analysis, a multi-institutional team led by researchers from Resources for the Future (RFF) and the University of California, Berkeley (UC Berkeley), has released an updated social cost of carbon estimate that reflects new methodologies and key scientific advancements. The study, published today in the journal Nature, finds that each additional ton of carbon dioxide emitted into the atmosphere costs society $185—3.6 times the current US federal estimate of $51 per ton.
The social cost of carbon is a critical metric that measures the economic damages, in dollars, that result from the emission of one additional ton of carbon dioxide into the atmosphere. A high social cost of carbon can motivate more stringent climate policies, as it increases the estimated benefits of reducing greenhouse gases.
“Our estimate, which draws on recent advances in the scientific and economic literature, shows that we are vastly underestimating the harm of each additional ton of carbon dioxide that we release into the atmosphere,” said RFF President and CEO Richard G. Newell, who coauthored the peer-reviewed paper. “The implication is that the benefits of government policies and other actions that reduce global warming pollution are greater than has been assumed.”
The study, led by UC Berkeley Associate Professor David Anthoff and RFF Fellow Kevin Rennert, brought together leading researchers from institutions across the United States to develop important updates to social cost of carbon modeling. These advances include consideration of the probability of different socioeconomic and emissions trajectories far into the future; the incorporation of a modern representation of the climate system; and state-of-the-art scientific methodologies for assessing the effects of climate change on agriculture, temperature-related deaths, energy expenditures, and sea-level rise. The estimate also takes into account an updated approach to evaluating future climate risks through ‘discounting’ that is linked to future economic uncertainty. The $185-per-ton value is the central estimate of a range of estimates that reflects the inherent uncertainty in these trajectories.
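To make the basic recipe concrete, here is a deliberately stylized Monte Carlo sketch: uncertain annual damages from one extra ton of CO2, discounted to the present and averaged over many draws. Every number in it is invented for illustration; this is not the GIVE model, which couples socioeconomic projections, a climate module and sector-specific damage functions.

```python
import random

# Stylized sketch of a social-cost-of-carbon calculation. All parameters
# below (damage level, uncertainty, horizon, discount rate) are invented.

def scc_one_draw(horizon=300, discount_rate=0.02):
    """Present value ($/ton) of damages from one extra ton of CO2, one draw."""
    annual_damage = 2.0 * random.lognormvariate(0, 0.5)  # $/ton/year, invented
    return sum(annual_damage / (1 + discount_rate) ** t
               for t in range(1, horizon + 1))

random.seed(0)
draws = sorted(scc_one_draw() for _ in range(10_000))
print(f"central estimate: ${sum(draws) / len(draws):.0f} per ton")
print(f"90% range: ${draws[500]:.0f}-${draws[9500]:.0f} per ton")
```

The point of the sketch is the structure: the headline number is the central value of a distribution, and both the discount rate and the damage uncertainty drive how large it is.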
Notably, the new Nature study is fully responsive to the methodological recommendations of a seminal 2017 National Academies report co-chaired by Newell and RFF’s Maureen Cropper. A federal interagency working group on the social costs of greenhouse gases, disbanded during the previous administration but reestablished by an executive order from President Biden, is also updating its social cost of carbon estimate using the 2017 recommendations.
“We hope that our research helps inform the anticipated updated social cost of carbon from the government’s interagency working group,” said Brian C. Prest, coauthor and director of RFF’s Social Cost of Carbon Initiative. “Decisions are only as strong as the science behind them. And our study finds that carbon dioxide emissions are more costly to society than many people likely realize.”
Aside from the estimate itself, a major output of the study is the Greenhouse Gas Impact Value Estimator (GIVE) model, an open-source software platform that allows users to replicate the team’s methodology or compute their own social cost of carbon estimates. Also released today is a new data tool, the Social Cost of Carbon Explorer, which demonstrates the working mechanics of the GIVE model and allows users to explore the data in detail.
“Our hope is that the freely available, open-source GIVE model we’re introducing today forms the foundation for continuous improvement of the estimates by an expanded community of scientists worldwide,” Rennert said. “A completely transparent methodology has been a guiding principle for our work, which is also directly relevant to other greenhouse gases, such as methane and nitrous oxide.”
Anthoff emphasized that the diverse expertise of the paper’s authors stems from the multi-faceted nature of the research. “Estimating the social cost of carbon requires inputs from many academic disciplines,” he said. “When we started this project, we knew that we would only succeed by assembling a team of leading researchers in each discipline to contribute their expertise. I am especially proud of the all-star group of researchers across so many leading institutions that jointly worked on this paper.”
For more, read the new paper, “Comprehensive Evidence Implies a Higher Social Cost of CO₂,” by Kevin Rennert (RFF), Frank Errickson (Princeton University), Brian C. Prest (RFF), Lisa Rennels (UC Berkeley), Richard G. Newell (RFF), Billy Pizer (RFF), Cora Kingdon (UC Berkeley), Jordan Wingenroth (RFF), Roger Cooke (RFF), Bryan Parthum (US Environmental Protection Agency), David Smith (US Environmental Protection Agency), Kevin Cromar (New York University), Delavane Diaz (EPRI), Frances C. Moore (University of California, Davis), Ulrich K. Müller (Princeton University), Richard J. Plevin, Adrian E. Raftery (University of Washington), Hana Ševčíková (University of Washington), Hannah Sheets (Rochester Institute of Technology), James H. Stock (Harvard University), Tammy Tan (US Environmental Protection Agency), Mark Watson (Princeton University), Tony E. Wong (Rochester Institute of Technology), and David Anthoff (UC Berkeley).
A market-led approach could be key to guiding policy, research and business decisions about future climate risks, a new study outlines.
Published in the journal Nature Climate Change, the paper from academics at the Universities of Lancaster and Exeter details how expert ‘prediction markets’ could improve the climate-risk forecasts that guide key business and regulatory decisions.
Organisations now appreciate that they have to consider climate risks within their strategic plans – whether that relates to physical risks to buildings and sites, or risks associated with transitioning to achieve net zero.
However, the forward-looking information needed to inform these strategic decisions is limited, the researchers say.
Dr Kim Kaivanto, a co-author from Lancaster University’s Department of Economics, said: “The institutional arrangements under which climate-risk information is currently provided mirror the incentive problems and conflicts of interest that prevailed in the credit-rating industry prior to the 2007/8 financial crisis.
“In order to make sense of emissions scenarios and to support planning and decision-making, organisations have a pressing need for this type of forward-looking expert risk information.
“Understanding climate risks requires diverse and complementary expertise from political science, economics and policy, as well as country-specific knowledge on the major emitters. Prediction markets incentivise and reward participants with distinct expertise and information to come forward – and they offer a level playing field for experts from these complementary fields of expertise.”
Mark Roulston, one of the Exeter University co-authors, said: “If providers of climate forecasts are paid upfront irrespective of accuracy, you don’t need to be an economist to spot the problem with that arrangement.”
In their paper, ‘Prediction-market innovations can improve climate-risk forecasts’, the authors detail how expert ‘prediction markets’ can help overcome the structural problems and shortfalls in the provision of forward-looking climate-risk information – something that will become more vital as demand for long-range climate information increases.
Prediction markets are designed to incentivise those with important information to come forward, and facilitate the aggregation of information through the buying and selling of contracts that yield a fixed payoff if the specified event occurs. An outcome of interest – such as average CO2 concentration in the year 2040, for example – is partitioned into intervals. Expert participants compare the results of their own modelling with the prices of these intervals, and purchase or sell claims on these intervals if their model suggests the price is too low or too high.
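A toy example may help. In the sketch below, claim prices act as the market's implied probabilities for each interval, and an expert trades whenever their own model disagrees with a price. Every interval, price and belief shown is hypothetical.

```python
# Toy interval market of the kind described above. Each claim pays $1 if the
# outcome lands in its interval, so prices behave like implied probabilities.
# All values are invented for illustration.

intervals = ["<420 ppm", "420-450 ppm", "450-480 ppm", ">480 ppm"]  # CO2 in 2040
market_price = [0.10, 0.40, 0.35, 0.15]   # current claim prices, sum to 1
expert_belief = [0.05, 0.30, 0.45, 0.20]  # expert's model-based probabilities

for name, price, belief in zip(intervals, market_price, expert_belief):
    if belief > price:
        action = "BUY"   # expected payoff per claim (belief - price) > 0
    elif belief < price:
        action = "SELL"
    else:
        action = "HOLD"
    print(f"{name:>12}: price {price:.2f}, belief {belief:.2f} -> {action}")
```

As trades move prices toward traders' beliefs, the price vector aggregates the participants' dispersed information into a consensus forecast.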
These kinds of long-range markets have not been established to date due, in part, to regulatory obstacles. However, the researchers believe the markets can be designed to overcome these obstacles by avoiding the ‘pay-to-play’ aspect of existing prediction markets in which the losses of less-well-informed individuals fund the winnings of better-informed individuals. Instead, markets can be structured as vehicles for distributing research funding to experts and modellers in a manner that is consistent with the principles of effective altruism: an initial stake provided by a sponsor is distributed to participants in accordance with the quality and quantity of information they bring into the market through their trading activity.
They add that participation in the markets would need selection criteria ensuring a diversity of views and a range of expertise, so that the markets can aggregate information from diverse sources.
The paper’s authors are Kim Kaivanto of Lancaster University, and Mark Roulston, Todd Kaplan and Brett Day of the University of Exeter.
Much of the world’s efforts to mitigate the effects of climate change hinge on the success of the landmark 2015 Paris Agreement. A new Nature Climate Change study is the first to provide scientific evidence assessing how effective governments will be at implementing their commitments under the agreement to reduce the CO2 emissions that cause climate change.
The research reveals that the countries with the boldest pledges are also the most likely to achieve their goals. Europe takes the lead with the strongest commitments that are also the most credible; however, findings suggest the U.S., despite having a less ambitious commitment under Paris, is not expected to meet its pledges.
The study from the University of California San Diego’s School of Global Policy and Strategy integrates a novel sample of registrants of the Conference of the Parties (COP), consisting of more than 800 diplomatic and scientific experts who, for decades, have participated in climate policy debates. This expert group was important to survey because they are the people “in the room” when key policy decisions are made and therefore in a unique position to evaluate what their countries and other countries are likely to achieve.
They were asked to rate member nations—their own country included—to gauge pledge ambition, which is how much each country has pledged to do to mitigate global warming, in comparison to what they feasibly could do, given their economic strength, to avert a climate crisis. They also were asked to evaluate the degree to which nations have pledges that are credible.
“The pledges outlined in the accords are legally non-binding, thus the success of the agreement centers around confidence in the system—that when governments make promises, they are going to live up to those promises,” said the study’s lead author David Victor, professor of industrial innovation at UC San Diego’s School of Global Policy and Strategy and co-director of the Deep Decarbonization Initiative.
Victor added, “Our results indicate that the framework of the agreement is working pretty well. The Paris Agreement is getting countries to make ambitious pledges; last year nearly all countries updated those pledges and made them even more ambitious. What’s needed next is better systems for checking to see whether countries are actually delivering what they promise.”
A subset of survey responses from eight countries plus the EU were selected for being most relevant to climate mitigation policy. They rate Europe’s goals as the most ambitious and credible. Europe is followed by China, Australia, South Africa and India. The U.S. and Brazil come in last place in the credibility category and second to last, after Saudi Arabia, in terms of ambition.
Surveys where respondents were asked to rate their home country were categorized by continent to elicit the most candid responses possible. In this analysis, experts from North American countries were the most pessimistic about their pledges, both in their drive and ability to achieve climate goals in the agreement.
Study data incorporates judgement, intuition and experience of climate policy experts
“The benefit of this data set is that diplomatic and scientific experts have the best working knowledge about political and administrative realities of their home country,” Victor said. “It is difficult to get empirical information on national laws and regulations and climate change policy in particular is highly complex. To truly gauge the success of the Paris Agreement, you need to incorporate the judgement, intuition and expertise from those with real-world experience negotiating these policies.”
He added, “From all the responses, it’s clear the U.S. is in trouble—even with the recent Inflation Reduction Act being signed into law, which happened after our study ended. While the legislation is a big step in the right direction, it doesn’t deliver the same investment many other countries have already committed. I think the major questions our study raises are ‘how does the U.S. boost its credibility’ and ‘why is credibility a problem.’”
Victor, also a nonresident senior fellow at the Brookings Institution, and co-authors did a statistical analysis of the data set and found nations with more stable governments are more likely to have bold pledges that are highly credible.
The authors find China and other non-democracies are expected to comply with their pledges not simply because many of them have less ambitious pledges, but because they also have administrative and political systems that make it easier to implement complex national policies needed to align their countries with international commitments. In addition, China is on track to achieve its goals due to the country’s economic downturn.
The rationale that leading policy experts cite for why their countries are making and honoring their pledges varies a lot. For the wealthier countries, the key rationale is climate change. For most of the rest of the world—including the developing countries that are most vulnerable to climate change—experts cite the need to address air pollution and opportunities to grow their economies through climate action as a major driver.
The UC San Diego contribution to this study is part of the university’s Deep Decarbonization Initiative. The other authors on the paper are Marcel Lumkowsky and Astrid Dannenberg, both of the University of Kassel. Dannenberg is also affiliated with the University of Gothenburg.
The study, “Determining the credibility of commitments in international climate policy,” was published in Nature Climate Change.