Friday, September 02, 2022

Ferns finally get a genome, revealing a history of DNA hoarding and kleptomania

Peer-Reviewed Publication

FLORIDA MUSEUM OF NATURAL HISTORY

Image: Ceratopteris. Credit: David Randall, Western Sydney University

Ferns are notorious for containing massive amounts of DNA and an excessively large number of chromosomes. Defying all expectations, a fern no larger than a dinner plate currently holds the title for highest chromosome count, with a whopping 720 pairs crammed into each of its nuclei. This penchant of ferns for hoarding DNA has stumped scientists, and the intractable size of their genomes has made it difficult to sequence, assemble and interpret them.

Now, two papers published in the journal Nature Plants are rewriting history with the first full-length genomes for homosporous ferns, a large group that contains 99% of all modern fern diversity.

“Every genome tells a different story,” said co-author Doug Soltis, a distinguished professor with the Florida Museum of Natural History. “Ferns are the closest living relatives of all seed plants, and they produce chemical deterrents to herbivores that may be useful for agricultural research. Yet until now, they’ve remained the last major lineage of green life without a genome sequence.”

Two teams of researchers separately unveiled the genome of Ceratopteris (Ceratopteris richardii) this Thursday and that of the flying spider monkey tree fern (Alsophila spinulosa) last month.

Analysis of the Ceratopteris genome provides hints for solving the long-standing mystery of why ferns, on average, retain more DNA than other plants. Comparisons to genomes from other groups also led to the surprise discovery that ferns stole the genes for several of their anti-herbivory toxins from bacteria.

The Ceratopteris genome bucks a decades-old theory, leaving more questions than answers

Since the 1960s, the most favored explanation for why ferns contain so much DNA invoked rampant whole-genome duplications, in which an extra set of chromosomes is accidentally passed on to an organism’s offspring. This can sometimes be beneficial, as all the extra genes can then be used as raw material for the evolution of new traits. In fact, whole-genome duplication has been implicated in the origin of nearly all crop plants. 

Whole-genome duplication is common in plants and even some animals, but most organisms tend to jettison the extra genetic baggage over time, slimming back down to smaller genomes that are metabolically easier to maintain.

“This has been a major point of discussion for the last half-century and has led to all kinds of conflicting results,” said lead author Blaine Marchant, a postdoctoral scholar at Stanford University and former Florida Museum graduate student. “Trying to figure out the evolutionary process underlying this paradox is incredibly important.”

With the first fully assembled homosporous fern genomes, scientists were finally prepared to address this question, but getting there wasn’t easy. Sequencing the large, complex genome of Ceratopteris took over eight years of work and the combined effort of dozens of researchers from 28 institutions around the world, including the U.S. Department of Energy Joint Genome Institute. The final result was 7.46 gigabases of DNA, more than double the size of the human genome.

If Ceratopteris had bulked up on DNA through repeated genome duplication events, researchers expected large portions of its 39 chromosome pairs would be identical. What they found instead was a mixed bag of repetitive sequences and millions of short snippets called jumping genes, which accounted for 85% of the fern’s DNA. Rather than multiple genome copies, Ceratopteris mostly contains genetic debris accumulated over millions of years. 

“The functional genes are separated by large amounts of repetitive DNA. And although we’re not yet sure how the Ceratopteris and other fern genomes got so big, it’s clear that the prevailing view of repeated episodes of genome duplication is not supported,” said co-author Pam Soltis, a Florida Museum curator and distinguished professor. 

The authors note that it’s too early to make any firm conclusions, especially since this is the first analysis of its scope conducted in this group. Cross comparisons with additional fern genomes down the road will help paint a clearer picture of how these plants evolved.

Still, the results point to a clear difference in the way homosporous ferns manage their genetic content compared to almost all other plants, Marchant said.

“What we seem to be finding is that things like flowering plants, which on average have much smaller genomes than ferns, are just better at getting rid of junk DNA. They’re better at dropping spare chromosomes and even downsizing after small duplications.”

Ferns repeatedly stole toxins from bacteria

A closer look at the billions of DNA base pairs within Ceratopteris revealed multiple defense genes that code for a particularly sinister type of pore-forming toxin. These toxins bind to cells, where they become activated and form small, hollow rings that punch their way into the cellular membrane. Water floods into the cells through the resulting holes, causing them to rupture.

Pore-forming toxins have been intensively studied by scientists for their potential use in nanopore technology, Marchant explained. Most often, however, they’re found in bacteria.

“This is the first concrete evidence of these bacterial toxin-related genes within fern DNA,” Marchant said, noting that the similarity isn’t a coincidence.

Rather than evolving this toxin on its own, Ceratopteris appears to have obtained it directly from bacteria through a process called horizontal gene transfer. And given that there were multiple copies of the gene spread out among three separate chromosomes, it’s likely this happened more than once. 

“What’s fascinating is that the many copies of these genes show up in different parts of the plant,” he said. “Some are highly expressed in the stem and roots, while other copies are expressed solely in the leaves, and others are generally expressed across all tissues. We cannot be sure of the exact function of these genes at this point, but their similarity to the toxin-forming genes in bacteria certainly suggests these genes are defense-related.” 

This wouldn’t be the first time ferns have incorporated foreign DNA into their genomes. A 2014 study indicates ferns may have evolved their characteristic ability to grow in shady environments by borrowing genes from distantly related plants.

However, exactly how organisms separated by millions of years of evolution are able to swap fully functional genes remains unclear. 

“The mechanisms behind horizontal gene transfer remain one of the least investigated areas of land plant evolution,” Doug Soltis explained. “Over evolutionary timescales, it’s a bit like winning the lottery. Any time a plant is wounded, its interior is susceptible to invasion from microbes, but for their DNA to be incorporated into the genome seems amazing.”

The authors say this is merely the first step in a long series of studies with practical applications ranging from the development of novel biopesticides to innovative new conservation strategies. 

Several of the authors are involved in the current effort to sequence the genomes of all known eukaryotic organisms within a 10-year time frame. Called the Earth Biogenome Project, the endeavor will generate untold genomic resources that researchers will have their hands full analyzing for the foreseeable future.  

Researchers propose new framework for regulating engineered crops

Peer-Reviewed Publication

NORTH CAROLINA STATE UNIVERSITY


Image: Safety testing would be recommended for products with new characteristics that have the potential for health or environmental effects, or for products with differences that cannot be interpreted; most new varieties would not trigger a need for regulation. Credit: NC State University

Thursday, Sept. 1

A Policy Forum article published today in Science calls for a new approach to regulating genetically engineered (GE) crops, arguing that current approaches for triggering safety testing vary dramatically among countries and generally lack scientific merit – particularly as advances in crop breeding have blurred the lines between conventional breeding and genetic engineering.

Rather than focusing on the methods and processes behind the creation of a GE crop to determine if testing is needed, a more effective framework would examine the specific new characteristics of the crop itself by using so-called “-omics” approaches, the article asserts. In the same way that biomedical sciences can use genomic approaches to scan human genomes for problematic mutations, genomics can be used to scan new crop varieties for unexpected DNA changes. 

Additional “-omics” methods such as transcriptomics, proteomics, epigenomics and metabolomics test for other changes to the molecular composition of plants. These measurements of thousands of molecular traits can be used like a fingerprint to determine whether the product from a new variety is “substantially equivalent” to products already being produced by existing varieties – whether, for example, a new peach variety has molecular characteristics that are already found in one or more existing commercial peach varieties. 
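
To make the fingerprint idea concrete, the sketch below screens a new variety’s molecular measurements against the range observed across existing commercial varieties. This is a hypothetical Python illustration, not a method from the article; the trait count, reference data and z-score rule are invented for the example.

```python
import numpy as np

def substantially_equivalent(new_profile, reference_profiles, z_threshold=3.0):
    """Flag molecular traits in a new variety that fall outside the range
    observed across existing varieties. An empty result is consistent with
    "substantial equivalence"; flagged traits would warrant interpretation."""
    mean = reference_profiles.mean(axis=0)
    std = reference_profiles.std(axis=0) + 1e-9  # guard against zero variance
    z_scores = np.abs((new_profile - mean) / std)
    return np.where(z_scores > z_threshold)[0]

# Toy data: 1,000 molecular traits measured in 20 existing peach varieties.
rng = np.random.default_rng(0)
reference = rng.normal(loc=10.0, scale=2.0, size=(20, 1000))
candidate = reference.mean(axis=0).copy()
candidate[42] += 25.0  # one trait pushed far outside the reference range

print(substantially_equivalent(candidate, reference))  # -> [42]
```

Under a rule like this, a variety whose fingerprint stays within the envelope of existing varieties would not trigger testing, in line with the decision logic the article describes next.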

If the new product has either no differences or understood differences with no expected health or environmental effects when compared with products of existing varieties, no safety testing would be recommended, the article suggests. If, however, the product has new characteristics that have the potential for health or environmental effects, or if the product has differences that cannot be interpreted, safety testing would be recommended.

“The approaches used right now – which differ among governments – lack scientific rigor,” said Fred Gould, University Distinguished Professor at North Carolina State University, co-director of NC State’s Genetic Engineering and Society Center and the corresponding author of the article. “The size of the change made to a product and the origin of the DNA have little relationship with the results of that change; changing one base pair of DNA in a crop with 2.5 billion base pairs, like corn, can make a substantial difference.”

When dealing with varieties made using the powerful gene editing system known as CRISPR, for example, the European Union regulates all varieties while other governments base decisions on the size of the genetic change and the source of inserted genetic material. Meanwhile, in 2020 the U.S. Department of Agriculture established a rule that exempts from regulation conventionally bred crop varieties and GE crop varieties that could have been developed by methods other than genetic engineering.

The “-omics” approaches, if used appropriately, would not increase the cost of regulation, Gould said, adding that most new varieties would not trigger a need for regulation. 

“The most important question is, ‘Does the new variety have unfamiliar characteristics?’” Gould said. The paper estimates that technological advances could bring the laboratory cost for a set of “-omics” tests down to about $5,000 within five to 10 years.

Convening an international committee of crop breeders, chemists and molecular biologists to establish the options and costs of “-omics” approaches for a variety of crops would start the process of developing this new regulatory framework. Workshops with these experts as well as sociologists, policymakers, regulators and representatives of the general public would enable trustworthy deliberations that could avoid some of the problems encountered when GE crops rolled out in the 1990s. National and international governing bodies should sponsor these committees and workshops, as well as innovative research, to get the ball rolling and ensure that assessments are accessible and accurate, Gould said.

In 2016, Gould headed a 20-member National Academy of Sciences committee responsible for a report, Genetically Engineered Crops: Experiences and Prospects, which aimed to “assess the evidence for purported negative effects of GE crops and their accompanying technologies” and to “assess the evidence for purported benefits of GE crops and their accompanying technologies.” Most of that committee co-authored the policy article published this week.

- kulikowski -



A sustainable battery with a biodegradable electrolyte made from crab shells

Peer-Reviewed Publication

CELL PRESS

Image: Crab and shrimp shells are an abundant source of chitin. Credit: Liangbing Hu

Accelerating demand for renewable energy and electric vehicles is sparking a high demand for the batteries that store generated energy and power engines. But the batteries behind these sustainability solutions aren’t always sustainable themselves. In a paper publishing September 1 in the journal Matter, scientists create a zinc battery with a biodegradable electrolyte from an unexpected source—crab shells.

“Vast quantities of batteries are being produced and consumed, raising the possibility of environmental problems,” says lead author Liangbing Hu, director of the University of Maryland’s Center for Materials Innovation. “For example, polypropylene and polycarbonate separators, which are widely used in lithium-ion batteries, take hundreds or thousands of years to degrade and add to environmental burden.”

Batteries use an electrolyte to shuttle ions back and forth between positively and negatively charged terminals. An electrolyte can be a liquid, paste, or gel, and many batteries use flammable or corrosive chemicals for this function. This new battery, which could store power from large-scale wind and solar sources, uses a gel electrolyte made from a biological material called chitosan.

“Chitosan is a derivative product of chitin. Chitin has a lot of sources, including the cell walls of fungi, the exoskeletons of crustaceans, and squid pens,” says Hu. “The most abundant source of chitosan is the exoskeletons of crustaceans, including crabs, shrimps and lobsters, which can be easily obtained from seafood waste. You can find it on your table.”

A biodegradable electrolyte means that about two thirds of the battery could be broken down by microbes; this chitosan electrolyte broke down completely within five months. That leaves behind the metal component, zinc in this case rather than lead or lithium, which can be recycled.

“Zinc is more abundant in earth’s crust than lithium,” says Hu. “Generally speaking, well-developed zinc batteries are cheaper and safer.” This zinc and chitosan battery has an energy efficiency of 99.7% after 1000 battery cycles, making it a viable option for storing energy generated by wind and solar for transfer to power grids.

Hu and his team hope to continue working on making batteries even more environmentally friendly, including the manufacturing process. “In the future, I hope all components in batteries are biodegradable,” says Hu. “Not only the material itself but also the fabrication process of biomaterials.”

###

This work was supported by the Research Corporation for Science Advancement, Facebook Reality Labs Research, the University of Maryland A. James Clark School of Engineering and Maryland Nanocenter, and AIMLab.

Matter, Hu et al. “A sustainable chitosan-zinc electrolyte for high-rate zinc metal batteries” https://www.cell.com/matter/fulltext/S2590-2385(22)00414-3

Matter (@Matter_CP), published by Cell Press, is a new journal for multi-disciplinary, transformative materials sciences research. Papers explore scientific advancements across the spectrum of materials development—from fundamentals to application, from nano to macro. Visit: https://www.cell.com/matter. To receive Cell Press media alerts, please contact press@cell.com.


Social cost of carbon more than triple the current federal estimate, new study finds

A multi-year study of the social cost of carbon, a critical input for climate policy analysis, finds that every additional ton of carbon dioxide emitted into the atmosphere costs society $185—far higher than the current federal estimate of $51 per ton.


Peer-Reviewed Publication

RESOURCES FOR THE FUTURE (RFF)

After years of robust modeling and analysis, a multi-institutional team led by researchers from Resources for the Future (RFF) and the University of California, Berkeley (UC Berkeley), has released an updated social cost of carbon estimate that reflects new methodologies and key scientific advancements. The study, published today in the journal Nature, finds that each additional ton of carbon dioxide emitted into the atmosphere costs society $185 per ton—3.6 times the current US federal estimate of $51 per ton.

The social cost of carbon is a critical metric that measures the economic damages, in dollars, that result from the emission of one additional ton of carbon dioxide into the atmosphere. A high social cost of carbon can motivate more stringent climate policies, as it increases the estimated benefits of reducing greenhouse gases.

“Our estimate, which draws on recent advances in the scientific and economic literature, shows that we are vastly underestimating the harm of each additional ton of carbon dioxide that we release into the atmosphere,” said RFF President and CEO Richard G. Newell, who coauthored the peer-reviewed paper. “The implication is that the benefits of government policies and other actions that reduce global warming pollution are greater than has been assumed.”

The study, led by UC Berkeley Associate Professor David Anthoff and RFF Fellow Kevin Rennert, brought together leading researchers from institutions across the United States to develop important updates to social cost of carbon modeling. These advances include consideration of the probability of different socioeconomic and emissions trajectories far into the future; the incorporation of a modern representation of the climate system; and state-of-the-art scientific methodologies for assessing the effects of climate change on agriculture, temperature-related deaths, energy expenditures, and sea-level rise. The estimate also takes into account an updated approach to evaluating future climate risks through ‘discounting’ that is linked to future economic uncertainty. The $185-per-ton value is the central estimate of a range that reflects the inherent uncertainty in these trajectories.
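
As a stylized illustration of how damages and discounting combine into a single per-ton figure, the toy Python calculation below sums the discounted extra damages from one additional ton of CO2 over a long horizon, sampling uncertain damage and discount-rate draws. All numbers are invented; this is not the study’s GIVE model.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(2020, 2301)
t = years - years[0]  # years since the one-ton emission pulse

n_draws = 10_000
scc_draws = np.empty(n_draws)
for k in range(n_draws):
    # Uncertain marginal damages per year (dollars per ton, made up) and an
    # uncertain discount rate standing in for uncertain economic growth.
    damages = rng.normal(loc=1.0, scale=0.3, size=t.size).clip(min=0.0)
    rate = rng.uniform(0.015, 0.030)
    scc_draws[k] = np.sum(damages / (1.0 + rate) ** t)

print(f"central (mean) estimate: ${scc_draws.mean():,.0f} per ton")
```

Averaging discounted damages across uncertain rate draws is one reason the discounting treatment matters: low-rate draws dominate the far future, so uncertainty about growth effectively lowers the long-run discount rate and raises the central estimate.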

Notably, the new Nature study is fully responsive to the methodological recommendations of a seminal 2017 National Academies report co-chaired by Newell and RFF’s Maureen Cropper. A federal interagency working group on the social costs of greenhouse gases, disbanded during the previous administration but reestablished by an executive order from President Biden, is also updating its social cost of carbon estimate using the 2017 recommendations.

“We hope that our research helps inform the anticipated updated social cost of carbon from the government’s interagency working group,” said Brian C. Prest, coauthor and director of RFF’s Social Cost of Carbon Initiative. “Decisions are only as strong as the science behind them. And our study finds that carbon dioxide emissions are more costly to society than many people likely realize.”

Aside from the estimate itself, a major output of the study is the Greenhouse Gas Impact Value Estimator (GIVE) model, an open-source software platform that allows users to replicate the team’s methodology or compute their own social cost of carbon estimates. Also released today is a new data tool, the Social Cost of Carbon Explorer, which demonstrates the working mechanics of the GIVE model and allows users to explore the data in detail.

“Our hope is that the freely available, open-source GIVE model we’re introducing today forms the foundation for continuous improvement of the estimates by an expanded community of scientists worldwide,” Rennert said. “A completely transparent methodology has been a guiding principle for our work, which is also directly relevant to other greenhouse gases, such as methane and nitrous oxide.”

Anthoff emphasized that the diverse expertise of the paper’s authors stems from the multi-faceted nature of the research. “Estimating the social cost of carbon requires inputs from many academic disciplines,” he said. “When we started this project, we knew that we would only succeed by assembling a team of leading researchers in each discipline to contribute their expertise. I am especially proud of the all-star group of researchers across so many leading institutions that jointly worked on this paper.”

For more, read the new paper, “Comprehensive Evidence Implies a Higher Social Cost of CO₂,” by Kevin Rennert (RFF), Frank Errickson (Princeton University), Brian C. Prest (RFF), Lisa Rennels (UC Berkeley), Richard G. Newell (RFF), Billy Pizer (RFF), Cora Kingdon (UC Berkeley), Jordan Wingenroth (RFF), Roger Cooke (RFF), Bryan Parthum (US Environmental Protection Agency), David Smith (US Environmental Protection Agency), Kevin Cromar (New York University), Delavane Diaz (EPRI), Frances C. Moore (University of California, Davis), Ulrich K. Müller (Princeton University), Richard J. Plevin, Adrian E. Raftery (University of Washington), Hana Ševčíková (University of Washington), Hannah Sheets (Rochester Institute of Technology), James H. Stock (Harvard University), Tammy Tan (US Environmental Protection Agency), Mark Watson (Princeton University), Tony E. Wong (Rochester Institute of Technology), and David Anthoff (UC Berkeley).

For more information on the paper, read the related blog post, “The Social Cost of Carbon: Reaching a New Estimate.”

How ‘prediction markets’ could improve climate risk policies and investment decisions


A market-led approach could be key to guiding policy, research and business decisions about future climate risks, a new study outlines


Peer-Reviewed Publication

LANCASTER UNIVERSITY


Published in the journal Nature Climate Change, the paper from academics at the Universities of Lancaster and Exeter details how expert ‘prediction markets’ could improve the climate-risk forecasts that guide key business and regulatory decisions.

Organisations now appreciate that they have to consider climate risks within their strategic plans – whether that relates to physical risks to buildings and sites, or risks associated with transitioning to achieve net zero.

However, the forward-looking information needed to inform these strategic decisions is limited, the researchers say.

Dr Kim Kaivanto, a co-author from Lancaster University’s Department of Economics, said: “The institutional arrangements under which climate-risk information is currently provided mirrors the incentive problems and conflicts of interest that prevailed in the credit-rating industry prior to the 2007/8 financial crisis.

“In order to make sense of emissions scenarios and to support planning and decision-making, organisations have a pressing need for this type of forward-looking expert risk information.

“Understanding climate risks requires diverse and complementary expertise from political science, economics and policy, as well as country-specific knowledge on the major emitters. Prediction markets incentivise and reward participants with distinct expertise and information to come forward – and they offer a level playing field for experts from these complementary fields of expertise.”

Mark Roulston, one of the University of Exeter co-authors, said: “If providers of climate forecasts are paid upfront irrespective of accuracy, you don’t need to be an economist to spot the problem with that arrangement.”

In their paper, ‘Prediction-market innovations can improve climate-risk forecasts’, the authors detail how expert ‘prediction markets’ can help overcome the structural problems and shortfalls in the provision of forward-looking climate-risk information – something that will become more vital as the demand for long-range climate information increases.

Prediction markets are designed to incentivise those with important information to come forward, and facilitate the aggregation of information through the buying and selling of contracts that yield a fixed payoff if the specified event occurs. An outcome of interest – such as average CO2 concentration in the year 2040, for example – is partitioned into intervals. Expert participants compare the results of their own modelling with the prices of these intervals, and purchase or sell claims on these intervals if their model suggests the price is too low or too high.

With a well-designed market such as Lancaster University’s AGORA prediction-market platform, the price of a contract can be interpreted as the market-based probability of the event happening.
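
For a sense of the mechanics, here is a minimal Python sketch of one standard market-maker design, Hanson’s logarithmic market scoring rule (LMSR), under which interval prices always sum to one and can be read as probabilities. It is a generic textbook mechanism offered for illustration, not a description of how the AGORA platform is implemented.

```python
import math

class LMSRMarket:
    """Prediction market over mutually exclusive outcome intervals using the
    logarithmic market scoring rule. Prices sum to 1, so each interval's
    price behaves like a market-implied probability."""

    def __init__(self, intervals, b=100.0):
        self.intervals = intervals       # e.g., CO2-concentration bins for 2040
        self.b = b                       # liquidity parameter
        self.q = [0.0] * len(intervals)  # shares outstanding per interval

    def _cost(self, q):
        return self.b * math.log(sum(math.exp(x / self.b) for x in q))

    def prices(self):
        z = sum(math.exp(x / self.b) for x in self.q)
        return [math.exp(x / self.b) / z for x in self.q]

    def buy(self, i, shares):
        """Buy claims on interval i; each pays 1 unit if the outcome lands
        there. Returns the cost charged to the trader."""
        before = self._cost(self.q)
        self.q[i] += shares
        return self._cost(self.q) - before

market = LMSRMarket(["<420 ppm", "420-440 ppm", "440-460 ppm", ">460 ppm"])
print([f"{p:.2f}" for p in market.prices()])  # uniform start: 0.25 each
cost = market.buy(2, 50)                      # an expert backs 440-460 ppm
print(f"cost: {cost:.2f}")
print([f"{p:.2f}" for p in market.prices()])  # that interval's price rises
```

An expert whose model assigns an interval more probability than its current price expects to profit by buying it, which is precisely the incentive to reveal information that the authors describe.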

These kinds of long-range markets have not been established to date due, in part, to regulatory obstacles. However, the researchers believe the markets can be designed to overcome these obstacles by avoiding the ‘pay-to-play’ aspect of existing prediction markets in which the losses of less-well-informed individuals fund the winnings of better-informed individuals. Instead, markets can be structured as vehicles for distributing research funding to experts and modellers in a manner that is consistent with the principles of effective altruism: an initial stake provided by a sponsor is distributed to participants in accordance with the quality and quantity of information they bring into the market through their trading activity.

They add that participation in the markets would need selection criteria to ensure a diversity of views and a range of expertise, so that the markets are able to aggregate information from many different sources.

The paper’s authors are Kim Kaivanto of Lancaster University, and Mark Roulston, Todd Kaplan and Brett Day of the University of Exeter.

Will Paris succeed? Research assesses if governments will make pledges a reality

UC San Diego School of Global Policy and Strategy finds that American credibility on climate change is lagging behind other regions, especially Europe

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - SAN DIEGO

Figure 1: Experts rated their home countries, grouped by continent to elicit candid responses; North American experts were the most pessimistic about their countries’ pledges. Credit: UC San Diego School of Global Policy and Strategy

Much of the world’s efforts to mitigate the effects of climate change hinge on the success of the landmark 2015 Paris Agreement. A new Nature Climate Change study is the first to provide scientific evidence assessing how effective governments will be at implementing their commitments to the agreement that will reduce CO2 emissions causing climate change.  

The research reveals that the countries with the boldest pledges are also the most likely to achieve their goals. Europe takes the lead with the strongest commitments that are also the most credible; however, findings suggest the U.S., despite having a less ambitious commitment under Paris, is not expected to meet its pledges. 

The study from the University of California San Diego’s School of Global Policy and Strategy draws on a novel sample of registrants of the Conference of the Parties (COP), consisting of more than 800 diplomatic and scientific experts who, for decades, have participated in climate policy debates. This expert group was important to survey because they are the people “in the room” when key policy decisions are made and therefore in a unique position to evaluate what their countries and other countries are likely to achieve.

They were asked to rate member nations—their own country included—to gauge pledge ambition, which is how much each country has pledged to do to mitigate global warming, in comparison to what they feasibly could do, given their economic strength, to avert a climate crisis. They also were asked to evaluate the degree to which nations have pledges that are credible.  

“The pledges outlined in the accords are legally non-binding, thus the success of the agreement centers on confidence in the system that when governments make promises, they are going to live up to those promises,” said the study’s lead author David Victor, professor of industrial innovation at UC San Diego’s School of Global Policy and Strategy and co-director of the Deep Decarbonization Initiative.

Victor added, “Our results indicate that the framework of the agreement is working pretty well. The Paris Agreement is getting countries to make ambitious pledges; last year nearly all countries updated those pledges and made them even more ambitious. What’s needed next is better systems for checking to see whether countries are actually delivering what they promise.”

A subset of survey responses from eight countries plus the EU were selected for being most relevant to climate mitigation policy. They rate Europe’s goals as the most ambitious and credible. Europe is followed by China, Australia, South Africa and India. The U.S. and Brazil come in last place in the credibility category and second to last, after Saudi Arabia, in terms of ambition.

Surveys where respondents were asked to rate their home country were categorized by continent to elicit the most candid responses possible. In this analysis, experts from North American countries were the most pessimistic about their pledges, both in their drive and ability to achieve climate goals in the agreement.




Study data incorporates judgement, intuition and experience of climate policy experts

“The benefit of this data set is that diplomatic and scientific experts have the best working knowledge about political and administrative realities of their home country,” Victor said. “It is difficult to get empirical information on national laws and regulations and climate change policy in particular is highly complex. To truly gauge the success of the Paris Agreement, you need to incorporate the judgement, intuition and expertise from those with real-world experience negotiating these policies.”

He added, “From all the responses, it’s clear the U.S. is in trouble, even with the recent Inflation Reduction Act being signed into law, which happened after our study ended. While the legislation is a big step in the right direction, it doesn’t deliver the same investment many other countries have already committed. I think the major questions our study raises are ‘how does the U.S. boost its credibility’ and ‘why is credibility a problem.’”

Victor, also a nonresident senior fellow at the Brookings Institution, and co-authors did a statistical analysis of the data set and found nations with more stable governments are more likely to have bold pledges that are highly credible.  

The authors find China and other non-democracies are expected to comply with their pledges not simply because many of them have less ambitious pledges, but because they also have administrative and political systems that make it easier to implement complex national policies needed to align their countries with international commitments. In addition, China is on track to achieve its goals due to the country’s economic downturn.

The rationale that leading policy experts cite for why their countries are making and honoring their pledges varies a lot. For the wealthier countries, the key rationale is climate change. For most of the rest of the world—including the developing countries that are most vulnerable to climate change—experts cite the need to address air pollution and opportunities to grow their economies through climate action as a major driver. 

The UC San Diego contribution to this study is part of the university’s Deep Decarbonization Initiative. The other authors on the paper are Marcel Lumkowsky and Astrid Dannenberg, both of the University of Kassel. Dannenberg is also affiliated with the University of Gothenburg.

The study, “Determining the credibility of commitments in international climate policy,” is published in Nature Climate Change.

New approach predicts disease transmission among wildlife and humans

Using machine learning, researchers can forecast outbreaks of pathogens such as coronavirus and monkeypox

Peer-Reviewed Publication

UNIVERSITY OF SOUTH FLORIDA

TAMPA, Fla. (Sept. 1, 2022) – The rate at which emerging wildlife diseases infect humans has steadily increased over the last three decades. Events such as the global coronavirus pandemic and the recent monkeypox outbreak have heightened the urgent need for disease ecology tools that forecast when and where disease outbreaks are likely.

A University of South Florida assistant professor helped develop a methodology that does just that: it predicts disease transmission from wildlife to humans and from one wildlife species to another, and determines who is at risk of infection.

The methodology is a machine-learning approach that identifies the influence of variables, such as location and climate, on known pathogens. Using only small amounts of information, the system is able to identify community hot spots at risk of infection on both global and local scales.

“Our main goal is to develop this tool for preventive measures,” said co-principal investigator Diego Santiago-Alarcon, a USF assistant professor of integrative biology. “It’s difficult to have an all-purpose methodology that can be used to predict infections across all the diverse parasite systems, but with this research, we contribute to achieving that goal.”

With help from researchers at the Universidad Veracruzana and Instituto de Ecologia, located in Mexico, Santiago-Alarcon examined three host-pathogen systems – avian malaria, birds with West Nile virus and bats with coronavirus – to test the reliability and accuracy of the models generated by the methodology.

The team found that for the three systems, the species most frequently infected was not necessarily the most susceptible to the disease. To better pinpoint hosts with higher risk of infection, it was important to identify relevant factors, such as climate and evolutionary relationships.

By integrating geographic, environmental and evolutionary development variables, the researchers identified host species that have previously not been recorded as infected by the parasite under study, providing a way to identify susceptible species and eventually mitigate pathogen risk.
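
The following is a minimal, hypothetical Python sketch of that general workflow, not the published pipeline: fit a model on host species with known infection status, then rank species that have never been sampled. The features, labels and model choice are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)

# Toy features per host species: mean temperature of its range, latitude,
# and phylogenetic distance to a known reservoir (all in made-up units).
X_known = rng.normal(size=(200, 3))
# Made-up labeling rule: hosts that are climatically similar and
# evolutionarily close to the reservoir are more often infected.
y_known = ((X_known[:, 0] > 0) & (X_known[:, 2] < 0.5)).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_known, y_known)

# Score species with no infection records and surface the riskiest ones,
# suggesting where limited surveillance effort might be directed.
X_unsampled = rng.normal(size=(50, 3))
risk = model.predict_proba(X_unsampled)[:, 1]
print("highest-risk unsampled hosts:", np.argsort(risk)[::-1][:5])
```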

“We feel confident that the methodology is successful, and it can be applied widely to many host-pathogen systems,” Santiago-Alarcon said. “We now enter into a phase of improvement and refinement.”

The results, published in the Proceedings of the National Academy of Sciences, show the methodology is able to provide reliable global predictions for the studied host-pathogen systems, even when using a small amount of information. This new approach will help direct infectious disease surveillance and field efforts, providing a cost-effective strategy to better determine where to invest limited resources.

Predicting what kind of pathogen will produce the next medical or veterinary infection is challenging, but necessary. As the rate of human impact on natural environments increases, opportunity for novel diseases will continue to rise.

“Humanity, and indeed biodiversity in general, are experiencing more and more infectious disease challenges as a result of our incursion and destruction of the natural order worldwide through things like deforestation, global trade and climate change,” said Andrés Lira-Noriega, research fellow at the Instituto de Ecologia. “This imposes the need of having tools like the one we are publishing to help us predict where new threats in terms of new pathogens and their reservoirs may occur or arise.”

The team plans to continue their research to further test the methodology on additional host-pathogen systems and extend the study of disease transmission to predict future outbreaks. The goal is to make the tool easily accessible through an app for the scientific community by the end of 2022.

About the University of South Florida

The University of South Florida, a high-impact global research university dedicated to student success, generates an annual economic impact of more than $6 billion. Over the past 10 years, no other public university in the country has risen faster in U.S. News and World Report’s national university rankings than USF. Serving more than 50,000 students on campuses in Tampa, St. Petersburg and Sarasota-Manatee, USF is designated as a Preeminent State Research University by the Florida Board of Governors, placing it in the most elite category among the state’s 12 public universities. USF has earned widespread national recognition for its success graduating under-represented minority and limited-income students at rates equal to or higher than white and higher income students. USF is a member of the American Athletic Conference. Learn more at www.usf.edu.