Friday, September 02, 2022

Ant queens control insulin to boost lifespan and reproduction

Peer-Reviewed Publication

UNIVERSITY OF FLORIDA

Image: A worker Harpegnathos saltator ant. Workers can become pseudo-queens and extend their lifespan by years.

Credit: Hua Yan

In most of the animal world, there’s a sad trade-off to make: The more babies you have, the shorter you live.

But ants buck the system. The queens — the only individuals in a nest that reproduce — also live five, 10, or even 30 times longer than their genetically identical worker sisters. How do they pull off what the rest of animalkind cannot?

A new study from University of Florida biologist Hua Yan and his colleagues at New York University finds that ant queens implement a dual-control system for insulin, the metabolism-controlling hormone that explains much of the trade-off between reproduction and lifespan. Queens massively boost their insulin production, which promotes egg development. But their ovaries also produce an insulin blocker that slows down the aging process.

“Hopefully this finding allows us to better understand the aging process in many animals,” said Yan, an assistant professor of biology at UF who also studies how ants communicate with pheromones to organize their society.

Whether mammals, including humans, could ever benefit from partially blocking the insulin pathway remains an open question. Calorie restriction, which decreases insulin production, can increase lifespan in mammals but hurts reproduction.

The research team published their work on Sept. 1 in Science. Yan and NYU researchers Comzit Opachaloemphan and Francisco Carmona-Aldana led the study, which was supervised by NYU professors Claude Desplan and Danny Reinberg.

They studied Harpegnathos saltator ants, also known as Indian jumping ants, because of a helpful transformation the ants undergo. When a queen dies, the remaining workers duel to decide which ants will become new pseudo-queens capable of laying eggs. The pseudo-queens acquire longer lifespans, but the process is also reversible if they encounter a true queen. That gave Yan and the research team the perfect system to study how lifespan extension can be switched on and off.

They discovered that pseudo-queens produce much more insulin, which they expected. Insulin helps convert food into energy, and reproduction is an energy-intensive process.

“It’s straightforward, the pseudo-queen is reproductive, so they need insulin. But insulin normally shortens lifespan, yet they have much longer lifespan – why?” Yan pondered. “There must be something in the insulin signaling of the ants that differentially regulates reproduction and longevity.”

The research team found this extra layer of control in the form of an insulin blocker, called Imp-L2, which is produced by the newly active ovaries of the pseudo-queen. This insulin blocker slows down the part of the insulin pathway normally responsible for accelerating the aging process, but leaves the reproduction-boosting side of insulin signaling intact.

In a sense, the ants get to have their cake and eat it too, coupling egg-laying with a long life.

High plant diversity is often found in the smallest of areas

Peer-Reviewed Publication

MARTIN-LUTHER-UNIVERSITÄT HALLE-WITTENBERG

Image: This meadow in Romania is one of the most species-rich regions on Earth; in 2009, a research team found 98 plant species here.

Credit: Jürgen Dengler

It might sound weird, but it's true: the steppes of Eastern Europe are home to a similar number of plant species as the regions of the Amazon rainforest. However, this is only apparent when species are counted in small sampling areas, rather than hectares of land. An international team of researchers led by the Martin Luther University Halle-Wittenberg (MLU) and the German Centre for Integrative Biodiversity Research (iDiv) Halle-Jena-Leipzig has now shown how much estimates of plant diversity change when the sampling area ranges from a few square metres to hectares. Their results have been published in the journal Nature Communications and could be utilised in new, more tailored nature conservation concepts.

In their study, the team analysed a dataset of around 170,000 vegetation plots from all of the Earth’s climate zones. The data included information on all of the plant species found at a location and the coordinates of the respective area under study. The data was taken from the globally unique vegetation database "sPlot", which is located at iDiv.

"Most studies on global biodiversity are conducted on a relatively large scale, for example at a state or provincial scale. We wanted to find out how much results differ when smaller areas are examined," says Professor Helge Bruelheide from MLU. The team used artificial intelligence to investigate, among other things, the relationship between the number of plant species and the size of the area under study. 

Their investigation showed that there are regions on Earth where focusing on large study areas provides only a limited understanding of the distribution of biodiversity: sometimes small areas can have a relatively high biodiversity, for example in the steppes of Eastern Europe, in Siberia and in the Alpine countries of Europe. At fine spatial scales, the large difference in biodiversity between the tropics, like the Amazon, and the temperate climate zones nearly disappears. 

The same applies to the African tropics, which were previously considered an exception in the tropical plant world. "The tropics have always been among the most biodiverse areas in the world. We wondered why this shouldn’t also apply to Western Africa," explains Dr Francesco Maria Sabatini, who led the study at MLU and is now an assistant professor at the University of Bologna. In fact, the distribution of plant species varies greatly in the African tropics, says Sabatini. These species are distributed over very large distances, so that they are not always recorded when a small sampling area is examined. "To correctly recognize the high biodiversity in Western Africa, many small sampling areas are required," adds Sabatini. 

The study also shows that, for other very biodiverse areas, such as the Cerrado savanna region in Brazil or regions in Southeast Asia, the spatial scale of examination is irrelevant. These results are also important when it comes to protecting species. "Ecosystems whose high biodiversity is spread out over a large area cannot be protected through the traditional patchwork of nature reserves. In contrast, ecosystems that have a high biodiversity within a small area could benefit from several distinct protected zones," concludes Bruelheide.  

The study was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation).

Study: Sabatini F. M. et al. Global patterns of vascular plant alpha diversity. Nature Communications (2022). doi: 10.1038/s41467-022-32063-z

Ferns finally get a genome, revealing a history of DNA hoarding and kleptomania

Peer-Reviewed Publication

FLORIDA MUSEUM OF NATURAL HISTORY

Image: Analysis of the Ceratopteris genome provides hints for solving the long-standing mystery of why ferns, on average, retain more DNA than other plants. Comparisons to genomes from other groups also led to the surprise discovery that ferns stole the genes for several of their anti-herbivory toxins from bacteria.

Credit: Photo by David Randall, Western Sydney University

Ferns are notorious for containing massive amounts of DNA and an excessively large number of chromosomes. Defying all expectations, a fern no larger than a dinner plate currently holds the title for highest chromosome count, with a whopping 720 pairs crammed into each of its nuclei. This penchant of ferns for hoarding DNA has stumped scientists, and the intractable size of their genomes has made it difficult to sequence, assemble and interpret them.

Now, two papers published in the journal Nature Plants are rewriting history with the first full-length genomes for homosporous ferns, a large group that contains 99% of all modern fern diversity.

“Every genome tells a different story,” said co-author Doug Soltis, a distinguished professor with the Florida Museum of Natural History. "Ferns are the closest living relatives of all seed plants, and they produce chemical deterrents to herbivores that may be useful for agricultural research. Yet until now, they've remained the last major lineage of green life without a genome sequence.” 

Two teams of researchers separately unveiled the genome of Ceratopteris (Ceratopteris richardii) this Thursday and that of the flying spider monkey tree fern (Alsophila spinulosa) last month.

Analysis of the Ceratopteris genome provides hints for solving the long-standing mystery of why ferns, on average, retain more DNA than other plants. Comparisons to genomes from other groups also led to the surprise discovery that ferns stole the genes for several of their anti-herbivory toxins from bacteria.

The Ceratopteris genome bucks a decades-old theory, leaving more questions than answers

Since the 1960s, the most favored explanation for why ferns contain so much DNA invoked rampant whole-genome duplications, in which an extra set of chromosomes is accidentally passed on to an organism’s offspring. This can sometimes be beneficial, as all the extra genes can then be used as raw material for the evolution of new traits. In fact, whole-genome duplication has been implicated in the origin of nearly all crop plants. 

Whole-genome duplication is common in plants and even some animals, but most organisms tend to jettison the extra genetic baggage over time, slimming back down to smaller genomes that are metabolically easier to maintain.

“This has been a major point of discussion for the last half-century and has led to all kinds of conflicting results,” said lead author Blaine Marchant, a postdoctoral scholar at Stanford University and former Florida Museum graduate student. “Trying to figure out the evolutionary process underlying this paradox is incredibly important.”

With the first fully assembled homosporous fern genomes, scientists were finally prepared to address this question, but getting there wasn’t easy. Sequencing the large, complex genome of Ceratopteris took over eight years of work and the combined effort of dozens of researchers from 28 institutions around the world, including the U.S. Department of Energy Joint Genome Institute. The final result was 7.46 gigabases of DNA, more than double the size of the human genome.

If Ceratopteris had bulked up on DNA through repeated genome duplication events, researchers expected large portions of its 39 chromosome pairs would be identical. What they found instead was a mixed bag of repetitive sequences and millions of short snippets called jumping genes, which accounted for 85% of the fern’s DNA. Rather than multiple genome copies, Ceratopteris mostly contains genetic debris accumulated over millions of years. 

“The functional genes are separated by large amounts of repetitive DNA. And although we’re not yet sure how the Ceratopteris and other fern genomes got so big, it’s clear that the prevailing view of repeated episodes of genome duplication is not supported,” said co-author Pam Soltis, a Florida Museum curator and distinguished professor. 

The authors note that it’s too early to make any firm conclusions, especially since this is the first analysis of its scope conducted in this group. Cross comparisons with additional fern genomes down the road will help paint a clearer picture of how these plants evolved.

Still, the results point to a clear difference in the way homosporous ferns manage their genetic content compared to almost all other plants, Marchant said.

“What we seem to be finding is that things like flowering plants, which on average have much smaller genomes than ferns, are just better at getting rid of junk DNA. They’re better at dropping spare chromosomes and even downsizing after small duplications.”

Ferns repeatedly stole toxins from bacteria

A closer look at the billions of DNA base pairs within Ceratopteris revealed multiple defense genes that code for a particularly sinister type of pore-forming toxin. These toxins bind to cells, where they become activated and form small, hollow rings that punch their way into the cellular membrane. Water floods into the cells through the resulting holes, causing them to rupture.

Pore-forming toxins have been intensively studied by scientists for their potential use in nanopore technology, Marchant explained. Most often, however, they’re found in bacteria.

“This is the first concrete evidence of these bacterial toxin-related genes within fern DNA,” Marchant said, noting that the similarity isn’t a coincidence.

Rather than evolving this toxin on its own, Ceratopteris appears to have obtained it directly from bacteria through a process called horizontal gene transfer. And given that there were multiple copies of the gene spread out among three separate chromosomes, it’s likely this happened more than once. 

“What’s fascinating is that the many copies of these genes show up in different parts of the plant,” he said. “Some are highly expressed in the stem and roots, while other copies are expressed solely in the leaves, and others are generally expressed across all tissues. We cannot be sure of the exact function of these genes at this point, but their similarity to the toxin-forming genes in bacteria certainly suggests these genes are defense-related.” 

This wouldn’t be the first time ferns have incorporated foreign DNA into their genomes. A 2014 study indicates ferns may have evolved their characteristic ability to grow in shady environments by borrowing genes from distantly related plants.

However, exactly how organisms separated by millions of years of evolution are able to swap fully functional genes remains unclear. 

“The mechanisms behind horizontal gene transfer remain one of the least investigated areas of land plant evolution,” Doug Soltis explained. “Over evolutionary timescales, it’s a bit like winning the lottery. Any time a plant is wounded, its interior is susceptible to invasion from microbes, but for their DNA to be incorporated into the genome seems amazing.”

The authors say this is merely the first step in a long series of studies with practical applications ranging from the development of novel biopesticides to innovative new conservation strategies. 

Several of the authors are involved in the current effort to sequence the genomes of all known eukaryotic organisms within a 10-year time frame. Called the Earth Biogenome Project, the endeavor will generate untold genomic resources that researchers will have their hands full analyzing for the foreseeable future.  

Researchers propose new framework for regulating engineered crops

Peer-Reviewed Publication

NORTH CAROLINA STATE UNIVERSITY


Image: Safety testing would be recommended for products with new characteristics that have the potential for health or environmental effects, or for products with differences that cannot be interpreted. Most new varieties would not trigger a need for regulation.

Credit: NC State University

 THURSDAY, SEPT. 1

A Policy Forum article published today in Science calls for a new approach to regulating genetically engineered (GE) crops, arguing that current approaches for triggering safety testing vary dramatically among countries and generally lack scientific merit – particularly as advances in crop breeding have blurred the lines between conventional breeding and genetic engineering.

Rather than focusing on the methods and processes behind the creation of a GE crop to determine if testing is needed, a more effective framework would examine the specific new characteristics of the crop itself by using so-called “-omics” approaches, the article asserts. In the same way that biomedical sciences can use genomic approaches to scan human genomes for problematic mutations, genomics can be used to scan new crop varieties for unexpected DNA changes. 

Additional “-omics” methods such as transcriptomics, proteomics, epigenomics and metabolomics test for other changes to the molecular composition of plants. These measurements of thousands of molecular traits can be used like a fingerprint to determine whether the product from a new variety is “substantially equivalent” to products already being produced by existing varieties – whether, for example, a new peach variety has molecular characteristics that are already found in one or more existing commercial peach varieties. 
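The screening logic described above can be sketched in code. The function and data below are a hypothetical illustration only (the trait names, values and the simple range-based test are invented, not taken from the article): each variety is a vector of molecular trait measurements, and a new variety is flagged wherever a trait falls outside the range observed across existing commercial varieties.

```python
# Hypothetical sketch of a "substantial equivalence" screen.
# Varieties are vectors of molecular trait measurements (e.g. metabolite
# or transcript levels); all numbers here are invented for illustration.

def novel_traits(new_variety, existing_varieties):
    """Return indices of traits in the new variety that fall outside the
    range observed across existing commercial varieties."""
    flagged = []
    for i, value in enumerate(new_variety):
        observed = [v[i] for v in existing_varieties]
        if not (min(observed) <= value <= max(observed)):
            flagged.append(i)
    return flagged

# Three existing varieties, each measured on four molecular traits
existing = [
    [1.0, 0.8, 5.2, 0.1],
    [1.2, 0.7, 4.9, 0.2],
    [0.9, 0.9, 5.5, 0.1],
]
new = [1.1, 0.8, 9.0, 0.15]  # trait 2 lies far outside the observed range

print(novel_traits(new, existing))  # -> [2]: only trait 2 would trigger review
```

Under the proposed framework, a variety with no flagged traits (or only well-understood differences) would not trigger safety testing.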

If the new product has either no differences or understood differences with no expected health or environmental effects when compared with products of existing varieties, no safety testing would be recommended, the article suggests. If, however, the product has new characteristics that have the potential for health or environmental effects, or if the product has differences that cannot be interpreted, safety testing would be recommended.

“The approaches used right now – which differ among governments – lack scientific rigor,” said Fred Gould, University Distinguished Professor at North Carolina State University, co-director of NC State’s Genetic Engineering and Society Center and the corresponding author of the article. “The size of the change made to a product and the origin of the DNA have little relationship with the results of that change; changing one base pair of DNA in a crop with 2.5 billion base pairs, like corn, can make a substantial difference.”

When dealing with varieties made using the powerful gene editing system known as CRISPR, for example, the European Union regulates all varieties while other governments base decisions on the size of the genetic change and the source of inserted genetic material. Meanwhile, in 2020 the U.S. Department of Agriculture established a rule that exempts from regulation conventionally bred crop varieties and GE crop varieties that could have been developed by methods other than genetic engineering.

The “-omics” approaches, if used appropriately, would not increase the cost of regulation, Gould said, adding that most new varieties would not trigger a need for regulation. 

“The most important question is, ‘Does the new variety have unfamiliar characteristics?’” Gould said. The paper estimates that technological advances could make the laboratory cost for a set of “-omics” tests decrease to about $5,000 within five to 10 years. 

Establishing an international committee composed of crop breeders, chemists and molecular biologists to establish the options and costs of “-omics” approaches for a variety of crops would start the process of developing this new regulatory framework. Workshops with these experts as well as sociologists, policymakers, regulators and representatives of the general public would enable trustworthy deliberations that could avoid some of the problems encountered when GE rolled out in the 1990s. National and international governing bodies should sponsor these committees and workshops as well as innovative research to get the ball rolling and ensure that assessments are accessible and accurate, Gould said.

In 2016, Gould headed a 20-member National Academy of Sciences committee responsible for a report, Genetically Engineered Crops: Experiences and Prospects, which aimed to “assess the evidence for purported negative effects of GE crops and their accompanying technologies” and to “assess the evidence for purported benefits of GE crops and their accompanying technologies.” Most of that committee co-authored the policy article published this week.

- kulikowski -



A sustainable battery with a biodegradable electrolyte made from crab shells

Peer-Reviewed Publication

CELL PRESS

Image: Crab and shrimp shells are an abundant source of chitin.

Credit: Liangbing Hu

Accelerating demand for renewable energy and electric vehicles is sparking a high demand for the batteries that store generated energy and power engines. But the batteries behind these sustainability solutions aren’t always sustainable themselves. In a paper publishing September 1 in the journal Matter, scientists create a zinc battery with a biodegradable electrolyte from an unexpected source—crab shells.

“Vast quantities of batteries are being produced and consumed, raising the possibility of environmental problems,” says lead author Liangbing Hu, director of the University of Maryland’s Center for Materials Innovation. “For example, polypropylene and polycarbonate separators, which are widely used in lithium-ion batteries, take hundreds or thousands of years to degrade and add to environmental burden.”

Batteries use an electrolyte to shuttle ions back and forth between positively and negatively charged terminals. An electrolyte can be a liquid, paste, or gel, and many batteries use flammable or corrosive chemicals for this function. This new battery, which could store power from large-scale wind and solar sources, uses a gel electrolyte made from a biological material called chitosan.

“Chitosan is a derivative product of chitin. Chitin has a lot of sources, including the cell walls of fungi, the exoskeletons of crustaceans, and squid pens,” says Hu. “The most abundant source of chitosan is the exoskeletons of crustaceans, including crabs, shrimps and lobsters, which can be easily obtained from seafood waste. You can find it on your table.”

A biodegradable electrolyte means that about two-thirds of the battery could be broken down by microbes; this chitosan electrolyte broke down completely within five months, leaving behind the metal component (in this case zinc, rather than lead or lithium), which could be recycled.

“Zinc is more abundant in earth’s crust than lithium,” says Hu. “Generally speaking, well-developed zinc batteries are cheaper and safer.” This zinc and chitosan battery has an energy efficiency of 99.7% after 1000 battery cycles, making it a viable option for storing energy generated by wind and solar for transfer to power grids.

Hu and his team hope to continue working on making batteries even more environmentally friendly, including the manufacturing process. “In the future, I hope all components in batteries are biodegradable,” says Hu. “Not only the material itself but also the fabrication process of biomaterials.”

###

This work was supported by the Research Corporation for Science Advancement, Facebook Reality Labs Research, the University of Maryland A. James Clark School of Engineering and Maryland Nanocenter, and AIMLab.

Matter, Hu et al. “A sustainable chitosan-zinc electrolyte for high-rate zinc metal batteries” https://www.cell.com/matter/fulltext/S2590-2385(22)00414-3

Matter (@Matter_CP), published by Cell Press, is a new journal for multi-disciplinary, transformative materials sciences research. Papers explore scientific advancements across the spectrum of materials development—from fundamentals to application, from nano to macro. Visit: https://www.cell.com/matter.


Social cost of carbon more than triple the current federal estimate, new study finds

A multi-year study of the social cost of carbon, a critical input for climate policy analysis, finds that every additional ton of carbon dioxide emitted into the atmosphere costs society $185—far higher than the current federal estimate of $51 per ton.


Peer-Reviewed Publication

RESOURCES FOR THE FUTURE (RFF)

After years of robust modeling and analysis, a multi-institutional team led by researchers from Resources for the Future (RFF) and the University of California, Berkeley (UC Berkeley), has released an updated social cost of carbon estimate that reflects new methodologies and key scientific advancements. The study, published today in the journal Nature, finds that each additional ton of carbon dioxide emitted into the atmosphere costs society $185—3.6 times the current US federal estimate of $51 per ton.

The social cost of carbon is a critical metric that measures the economic damages, in dollars, that result from the emission of one additional ton of carbon dioxide into the atmosphere. A high social cost of carbon can motivate more stringent climate policies, as it increases the estimated benefits of reducing greenhouse gases.
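In principle, a social cost of carbon is the present value of the stream of future damages caused by one extra ton of CO₂, with future damages "discounted" back to today. The sketch below is a toy illustration with invented numbers; the actual study uses far richer climate and damage modelling, plus explicit uncertainty over socioeconomic trajectories and discount rates.

```python
# Toy illustration (all numbers invented) of the social cost of carbon as
# the discounted sum of future marginal damages from one extra ton of CO2.

def social_cost_of_carbon(marginal_damages, discount_rate):
    """Present value ($/ton) of a yearly damage stream from one extra ton."""
    return sum(d / (1 + discount_rate) ** t
               for t, d in enumerate(marginal_damages))

# Assume one extra ton causes $2/year of damages for 50 years (illustrative)
damages = [2.0] * 50
print(round(social_cost_of_carbon(damages, 0.02), 2))  # roughly $64 here
```

Note how sensitive the result is to the discount rate: lowering it weights far-future damages more heavily and raises the estimate, which is why the study's treatment of discounting under economic uncertainty matters so much.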

“Our estimate, which draws on recent advances in the scientific and economic literature, shows that we are vastly underestimating the harm of each additional ton of carbon dioxide that we release into the atmosphere,” said RFF President and CEO Richard G. Newell, who coauthored the peer-reviewed paper. “The implication is that the benefits of government policies and other actions that reduce global warming pollution are greater than has been assumed.”

The study, led by UC Berkeley Associate Professor David Anthoff and RFF Fellow Kevin Rennert, brought together leading researchers from institutions across the United States to develop important updates to social cost of carbon modeling. These advances include consideration of the probability of different socioeconomic and emissions trajectories far into the future; the incorporation of a modern representation of the climate system; and state-of-the-art scientific methodologies for assessing the effects of climate change on agriculture, temperature-related deaths, energy expenditures, and sea-level rise. The estimate also takes into account an updated approach to evaluating future climate risks through ‘discounting’ that is linked to future economic uncertainty. The $185-per-ton figure is the central value of a range of estimates that reflects the inherent uncertainty in these trajectories.

Notably, the new Nature study is fully responsive to the methodological recommendations of a seminal 2017 National Academies report co-chaired by Newell and RFF’s Maureen Cropper. A federal interagency working group on the social costs of greenhouse gases, disbanded during the previous administration but reestablished by an executive order from President Biden, is also updating its social cost of carbon estimate using the 2017 recommendations.

“We hope that our research helps inform the anticipated updated social cost of carbon from the government’s interagency working group,” said Brian C. Prest, coauthor and director of RFF’s Social Cost of Carbon Initiative. “Decisions are only as strong as the science behind them. And our study finds that carbon dioxide emissions are more costly to society than many people likely realize.”

Aside from the estimate itself, a major output of the study is the Greenhouse Gas Impact Value Estimator (GIVE) model, an open-source software platform that allows users to replicate the team’s methodology or compute their own social cost of carbon estimates. Also released today is a new data tool, the Social Cost of Carbon Explorer, which demonstrates the working mechanics of the GIVE model and allows users to explore the data in detail.

“Our hope is that the freely available, open-source GIVE model we’re introducing today forms the foundation for continuous improvement of the estimates by an expanded community of scientists worldwide,” Rennert said. “A completely transparent methodology has been a guiding principle for our work, which is also directly relevant to other greenhouse gases, such as methane and nitrous oxides.”

Anthoff emphasized that the diverse expertise of the paper’s authors stems from the multi-faceted nature of the research. “Estimating the social cost of carbon requires inputs from many academic disciplines,” he said. “When we started this project, we knew that we would only succeed by assembling a team of leading researchers in each discipline to contribute their expertise. I am especially proud of the all-star group of researchers across so many leading institutions that jointly worked on this paper.”

For more, read the new paper, “Comprehensive Evidence Implies a Higher Social Cost of CO₂,” by Kevin Rennert (RFF), Frank Errickson (Princeton University), Brian C. Prest (RFF), Lisa Rennels (UC Berkeley), Richard G. Newell (RFF), Billy Pizer (RFF), Cora Kingdon (UC Berkeley), Jordan Wingenroth (RFF), Roger Cooke (RFF), Bryan Parthum (US Environmental Protection Agency), David Smith (US Environmental Protection Agency), Kevin Cromar (New York University), Delavane Diaz (EPRI), Frances C. Moore (University of California, Davis), Ulrich K. Müller (Princeton University), Richard J. Plevin, Adrian E. Raftery (University of Washington), Hana Ševčíková (University of Washington), Hannah Sheets (Rochester Institute of Technology), James H. Stock (Harvard University), Tammy Tan (US Environmental Protection Agency), Mark Watson (Princeton University), Tony E. Wong (Rochester Institute of Technology), and David Anthoff (UC Berkeley).

For more information on the paper, read the related blog post, “The Social Cost of Carbon: Reaching a New Estimate.”

How ‘prediction markets’ could improve climate risk policies and investment decisions


A market-led approach could be key to guiding policy, research and business decisions about future climate risks, a new study outlines


Peer-Reviewed Publication

LANCASTER UNIVERSITY


Published in the journal Nature Climate Change, the paper from academics at the Universities of Lancaster and Exeter details how expert ‘prediction markets’ could improve the climate-risk forecasts that guide key business and regulatory decisions.

Organisations now appreciate that they have to consider climate risks within their strategic plans – whether that relates to physical risks to buildings and sites, or risks associated with transitioning to achieve net zero.

However, the forward-looking information needed to inform these strategic decisions is limited, the researchers say.

Dr Kim Kaivanto, a co-author from Lancaster University’s Department of Economics, said: “The institutional arrangements under which climate-risk information is currently provided mirrors the incentive problems and conflicts of interest that prevailed in the credit-rating industry prior to the 2007/8 financial crisis.

“In order to make sense of emissions scenarios and to support planning and decision-making, organisations have a pressing need for this type of forward-looking expert risk information.

“Understanding climate risks requires diverse and complementary expertise from political science, economics and policy, as well as country-specific knowledge on the major emitters. Prediction markets incentivise and reward participants with distinct expertise and information to come forward – and they offer a level playing field for experts from these complementary fields of expertise.”

Mark Roulston, one of the Exeter University co-authors, said: “If providers of climate forecasts are paid upfront irrespective of accuracy, you don’t need to be an economist to spot the problem with that arrangement.”

In their paper, ‘Prediction-market innovations can improve climate-risk forecasts’, the authors detail how expert ‘prediction markets’ can help overcome the structural problems and shortfalls in the provision of forward-looking climate-risk information – something that will become more vital as the demand for long-range climate information increases.

Prediction markets are designed to incentivise those with important information to come forward, and facilitate the aggregation of information through the buying and selling of contracts that yield a fixed payoff if the specified event occurs. An outcome of interest – such as average CO2 concentration in the year 2040, for example – is partitioned into intervals. Expert participants compare the results of their own modelling with the prices of these intervals, and purchase or sell claims on these intervals if their model suggests the price is too low or too high.

With a well-designed market such as Lancaster University’s AGORA prediction-market platform, the price of a contract can be interpreted as the market-based probability of the event happening.
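The price-to-probability reading described above can be illustrated with a small sketch. The intervals and prices below are invented for illustration (they are not forecasts from the paper or the AGORA platform): contracts paying one unit if the outcome lands in a given interval are priced by the market, and normalising those prices yields a market-implied probability distribution over the intervals.

```python
# Illustrative sketch (invented numbers): reading prediction-market
# contract prices as a probability distribution over outcome intervals,
# e.g. for average CO2 concentration in 2040.

# Price (per 1-unit payoff) of a contract paying out if the outcome
# lands in each interval; the intervals partition the outcome space.
prices = {
    "< 420 ppm": 0.05,
    "420-440 ppm": 0.30,
    "440-460 ppm": 0.45,
    "> 460 ppm": 0.20,
}

# For an exhaustive set of contracts the prices should sum to the payoff
# (here 1.0); normalising guards against small deviations and yields the
# market-implied probability of each interval.
total = sum(prices.values())
probabilities = {interval: p / total for interval, p in prices.items()}

for interval, prob in probabilities.items():
    print(f"{interval}: {prob:.2f}")
```

An expert whose own model puts, say, 60% probability on the 440–460 ppm interval would see the 0.45 price as too low and buy that contract, moving the market price toward their information.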

These kinds of long-range markets have not been established to date due, in part, to regulatory obstacles. However, the researchers believe the markets can be designed to overcome these obstacles by avoiding the ‘pay-to-play’ aspect of existing prediction markets in which the losses of less-well-informed individuals fund the winnings of better-informed individuals. Instead, markets can be structured as vehicles for distributing research funding to experts and modellers in a manner that is consistent with the principles of effective altruism: an initial stake provided by a sponsor is distributed to participants in accordance with the quality and quantity of information they bring into the market through their trading activity.

They add that participation in the markets would be subject to selection criteria ensuring a diversity of views and a range of expertise, so that the markets are able to aggregate diverse sources of information.

The paper’s authors are Kim Kaivanto of Lancaster University, and Mark Roulston, Todd Kaplan and Brett Day of the University of Exeter.