Wednesday, January 17, 2024

 

Cannabis activates specific hunger neurons in brain



Peer-Reviewed Publication

WASHINGTON STATE UNIVERSITY


PULLMAN, Wash. – While it is well known that cannabis can cause the munchies, researchers at Washington State University have now revealed, in a set of animal studies, a brain mechanism that promotes appetite.

The discovery, detailed in the journal Scientific Reports, could pave the way for refined therapeutics to treat appetite disorders faced by cancer patients as well as anorexia and potentially obesity.

After exposing mice to vaporized cannabis sativa, researchers used calcium imaging technology, which is similar to a brain MRI, to determine how their brain cells responded. They observed that cannabis activated a set of cells in the hypothalamus that were not activated in unexposed mice, firing when the rodents anticipated and consumed palatable food.

“When the mice are given cannabis, neurons come on that typically are not active,” said Jon Davis, an assistant professor of neuroscience at WSU and corresponding author on the paper. “There is something important happening in the hypothalamus after vapor cannabis.”

Calcium imaging has been used to study the brain’s reactions to food by other researchers, but this is the first known study to use it to understand those features following cannabis exposure.

As part of this research, the researchers also determined that the cannabinoid-1 receptor, a known cannabis target, controlled the activity of a well-known set of “feeding” cells in the hypothalamus, called Agouti Related Protein neurons. With this information, they used a “chemogenetic” technique, which acts like a molecular light switch, to home in on these neurons when animals were exposed to cannabis. When these neurons were turned off, cannabis no longer promoted appetite.

“We now know one of the ways that the brain responds to recreational-type cannabis to promote appetite,” said Davis.

This work builds on previous research on cannabis and appetite from Davis’ lab, which was among the first to use whole vaporized cannabis plant matter in animal studies instead of injected THC—in an effort to better mimic how cannabis is used by humans.  In the previous work, the researchers identified genetic changes in the hypothalamus in response to cannabis, so in this study, Davis and his colleagues focused on that area.

The current research received support from the Alcohol and Drug Abuse Research Program, the National Institute on Alcohol Abuse and Alcoholism, and the U.S. Department of Agriculture as well as by funds provided by the state of Washington Initiative Measure No. 171.

 

Calibrating from the cosmos


A newly funded calibration service will take LuSEE-Night to ultimate precision, paving the way for next-generation radio cosmology experiments


Business Announcement

DOE/BROOKHAVEN NATIONAL LABORATORY





A unique “passenger” is joining an upcoming mission to the moon.

In 2026, physicists are planning to operate a radio telescope on the far side of the moon—an unforgiving environment that poses tremendous challenges for research equipment but also promises an enormous scientific payoff. Called LuSEE-Night, the project aims to access lingering radio waves from the universe’s ancient past, peering into an era of the cosmos that has never been observed before.

Now, thanks to new funding from NASA, the project has added a state-of-the-art calibrator to the mission. This calibrator will not only ensure measurements from LuSEE-Night are accurate but also set the stage for more sophisticated telescopes to reside beyond Earth.

A cosmologist’s dream calibrator

All telescopes require calibration—a system of assessing the quality and wavelengths of light they collect—but calibrating LuSEE-Night is a substantial challenge.

First, the range of calibration techniques that can be applied to LuSEE-Night is far more limited than what is available for optical telescopes. Optical telescopes can move; they can focus on a star, look away, and then move back. By collecting measurements of known and unknown celestial objects, scientists can compare the two as a method of calibration. LuSEE-Night, on the other hand, will be completely stationary, operating with fixed antennas that “view” the entire sky at once.

So, how do scientists typically calibrate radio telescopes? They move the signal, not the telescope.

For traditional, ground-based radio telescopes, scientists have often tried to send a point source, usually an artificial radio source mounted on a drone, above the telescope. As the drone crisscrosses the sky over the telescope, scientists can observe how the telescope responds and calibrate the instrument accordingly. But the way drones move, and the chance that they will be blown off course by the wind, makes it challenging to capture precise measurements. A far-off lunar telescope like LuSEE-Night demands exactly that level of precision, and flying drones on the moon is simply not feasible.

LuSEE-Night will also take on the challenge of exclusively measuring very faint, low-frequency radio waves.

“The lower the radio frequency you’re trying to measure, the harder the instrument is to calibrate,” said LuSEE-Night science collaboration spokesperson Anže Slosar, a physicist at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. LuSEE-Night is a collaborative effort between NASA and DOE, with Brookhaven Lab leading DOE’s role in the project and DOE’s Lawrence Berkeley National Lab providing key technical support.

All these challenges encouraged NASA to support a cosmologist’s dream calibrator: a calibrator in orbit around the moon. NASA achieved this by purchasing a calibration service on the commercial market.

“The calibration service will be coming from a satellite in orbit. It is like the ultimate drone, an ideal point source,” Slosar said. “You know exactly where it is; it is very stable and it is, for all practical purposes, infinitely far away—the same as real celestial sources.”

Launching a satellite into orbit is far too expensive for calibrating ground-based telescopes.

“It’s still not an easy task, but with DOE and NASA collaborating, we made it possible,” Slosar said.

Teamwork makes the dream work

The development and launch of the calibrator, like the rest of the LuSEE-Night project, relies on NASA’s Commercial Lunar Payload Services (CLPS) initiative. Through CLPS, NASA contracts private companies to carry out low-cost missions to the moon. And now, NASA has contracted Firefly Aerospace, Inc., the company already tasked with launching LuSEE-Night, to build the new calibrator; it’s the first time NASA has asked for a calibration service from the CLPS pool of providers. 

“The calibrator will be a sophisticated radiofrequency transmitter with a downward-looking antenna,” said Paul O’Connor, a senior scientist in Brookhaven’s Instrumentation Division and LuSEE-Night Project Instrument Scientist. “It will be in lunar orbit and emit a calibration signal every time it rises above the horizon, and LuSEE-Night will pick the signal up. Because we will always know exactly where the calibrator is and its signal intensity, we will also know exactly how much space radiation is coming from each direction we are studying. This will enable us to understand the nuances of our instrument’s response, such as its sensitivity to polarization and how the incoming radiation interacts with the lunar regolith.”

This design will enable the LuSEE-Night collaboration to achieve “absolute calibration,” which Slosar says can rarely be achieved from the ground, let alone from the moon. Scientists expect the calibrator to reduce uncertainty from 20% to about 1%.
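To see why a stable point source with a known position and intensity allows absolute calibration, consider a minimal sketch of the on/off comparison: the instrument’s unknown gain multiplies everything it receives, so differencing measurements taken with the calibrator above and below the horizon isolates that gain. The sketch below is illustrative only; the function, numbers and units are assumptions, not part of the LuSEE-Night design.

```python
# Minimal, illustrative sketch of absolute gain calibration with a known
# point source. All values and names are hypothetical assumptions.

def estimate_gain(power_on, power_off, calibrator_flux):
    """Instrument gain from 'calibrator above horizon' minus
    'calibrator below horizon' power measurements."""
    return (power_on - power_off) / calibrator_flux

# Hypothetical measurements (arbitrary units)
true_gain = 2.5          # unknown in practice; what calibration recovers
sky_power = 400.0        # bright galactic foreground, also unknown
calibrator_flux = 10.0   # known, because the orbiting source is engineered

power_off = true_gain * sky_power                     # calibrator below horizon
power_on = true_gain * (sky_power + calibrator_flux)  # calibrator above horizon

gain = estimate_gain(power_on, power_off, calibrator_flux)
calibrated_sky = power_off / gain   # sky signal recovered in absolute units
print(gain, calibrated_sky)         # -> 2.5 400.0
```

In practice the comparison must also separate the calibrator’s contribution from a galactic foreground that is far brighter, which is why a stable, well-characterised signal matters.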

“While the basic technique is similar to that of drone calibration, this technique is ultimately much more sophisticated,” Slosar added. “Instead of blinking or beeping an intermittent noise that we would have to distinguish from other noises in space, this calibrator will give us a known signal we can easily recognize, even when it is drowned in the much brighter emission from our own galaxy.”

Ready for launch

The calibrator will travel into space on the same rocket as LuSEE-Night, becoming the latest passenger among a suite of scientific instruments headed to the moon—each with its own destination and timed arrival.

“When the transfer vehicle gets close to the moon, first, the landing equipment and the European Space Agency’s Lunar Pathfinder communications satellite will detach and go into orbit. Then, the lander will shuttle the telescope to the moon’s surface. Finally, the communications module for the lander and the calibrator go into orbit, where the calibrator will remain,” Slosar said.

Five Earth days after LuSEE-Night lands on the lunar far side, Firefly Aerospace will remotely turn on the calibrator to ensure it is working. Since the lander will still be emitting interfering signals, these early data will require careful analysis. But once the first lunar sunset arrives and the lander turns off, then the true scientific mission of LuSEE-Night begins.

After 50 Earth days, the team will have gathered sufficient data from the telescope to achieve single-percent-level calibration.

“Our instruments are set up to do calibrations and normal science operations simultaneously so that we can collect data throughout the first lunar night,” Slosar said. 

By the second lunar night, the calibrator will be switched off: beyond demonstrating the calibration technique, operating this satellite in lunar orbit is also an exercise in international relations.

“People long ago realized that the far side of the moon is unique space. It is one of the most radio-quiet places in existence,” Slosar said. “Therefore, international treaties were signed, stating that nobody should pollute the lunar spectrum at radio frequencies below 300 megahertz, which are the most precious for radio astronomy. But now we have this calibrator that will emit radio frequencies, so the Federal Communications Commission must request a time-limited waiver from the International Telecommunications Union. In this case, we will have one and a half lunar days, or 50 Earth days, before it must switch off.”

As the 50-Earth-day clock ticks down, scientists at Brookhaven Lab, empowered by interagency collaboration and public-private partnerships, will carry out one of the most ambitious radio cosmology experiments in history. Their work could help uncover answers to some of the universe’s biggest mysteries, such as the formation of the universe itself.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.

 

Largest-ever study of ocean DNA has created essential catalog of marine life

From biotechnology innovation to tracking climate change impacts, the KMAP Global Ocean Gene Catalog 1.0 offers diverse applications for science and society

Peer-Reviewed Publication

FRONTIERS

IMAGE: The KMAP Ocean Gene Catalog 1.0 is the largest database of marine microbes to date.

The ocean is the world’s largest habitat, yet much of its biodiversity is still unknown. A study published in Frontiers in Science marks a significant breakthrough, reporting the largest and most comprehensive database of marine microbes to date – matched with biological function, location, and habitat type.  

“The KMAP Global Ocean Gene Catalog 1.0 is a leap toward understanding the ocean’s full diversity, containing more than 317 million gene groups from marine organisms around the world,” said lead author Elisa Laiolo of the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia. “The catalog focuses on marine microbes, which greatly impact human lives through their influence on the ocean’s health and the Earth’s climate.”  

“The catalog is freely available through the KAUST Metagenomic Analysis Platform (KMAP),” added the study’s senior author, Prof Carlos Duarte, a faculty member at KAUST. “Scientists can access the catalog remotely to investigate how different ocean ecosystems work, track the impact of pollution and global warming, and search for biotechnology applications such as new antibiotics or new ways to break down plastics – the possibilities are endless!” 

A feat of technological innovation and scientific collaboration 

Researchers have been mapping marine biodiversity for hundreds of years but have faced various challenges in creating a full atlas of ocean life. One is that most marine organisms cannot be studied in a laboratory. The advent of DNA sequencing technologies overcame this by allowing organisms to be identified directly from ocean water and sediments.  

“Since each species has its own set of genes, we can identify which organisms are in an ocean sample by analyzing its genetic material,” Laiolo explained. “Two technological advances have made this possible at scale.  

“The first is the enormous increase in speed, and decrease in cost, of DNA sequencing technologies. This has allowed researchers to sequence all the genetic material in thousands of ocean samples.  

“The second is the development of massive computational power and AI technologies, which make it possible to analyze these millions of sequences.” 

The team used KMAP to scan DNA sequences from 2,102 ocean samples taken at different depths and locations around the world. This advanced computing infrastructure identified 317.5 million gene groups, of which more than half could be classified according to organism type and gene function. By matching this information with the sample location and habitat type, the resulting catalog provides unprecedented information on which microbes live where and what they do. 
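The paper’s description above does not spell out the catalog’s schema or query interface, so the following pandas sketch is only a hypothetical illustration of the kind of “which microbes live where and what they do” aggregation the text describes; the column names (taxon, function, habitat) and the toy records are assumptions, not the real KMAP structure.

```python
# Hypothetical illustration of habitat/function aggregation over a gene
# catalog. Column names and records are invented, not the real KMAP schema.
import pandas as pd

records = pd.DataFrame({
    "gene_group": ["g1", "g2", "g3", "g4"],
    "taxon":      ["Bacteria", "Fungi", "Bacteria", "Archaea"],
    "function":   ["photosynthesis", "chitin degradation",
                   "nitrogen fixation", "methanogenesis"],
    "habitat":    ["surface water", "mesopelagic", "sediment", "sediment"],
})

# "Which organisms live where, and what do they do?"
summary = (records
           .groupby(["habitat", "taxon"])["function"]
           .agg(list)
           .reset_index())
print(summary)
```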

“This achievement reflects the critical importance of open science,” said Duarte. “Building the catalog was only possible thanks to ambitious global sailing expeditions where the samples were collected and the sharing of the samples’ DNA in the open-access European Nucleotide Archive. We are continuing these collaborative efforts by making the catalog freely available.” 

A wealth of scientific and industrial applications 

The catalog has already revealed differences in microbial activity between the water column and the ocean floor, as well as a surprising number of fungi living in the ‘twilight’ mesopelagic zone. These and other insights will help scientists understand how microbes living in different habitats shape ecosystems, contribute to ocean health, and influence the climate.  

The catalog also serves as a baseline for tracking the effect of human impacts like pollution and global warming on marine life. And it offers a wealth of genetic material that researchers can scan for novel genes that could be used for drug development, energy, and agriculture. 



Toward a global ocean genome 

The KMAP Ocean Gene Catalog 1.0 is a first step towards developing an atlas of the global ocean genome, which will document every gene from every marine species worldwide – from bacteria and fungi to plants and animals.  

“Our analysis highlights the need to continue sampling the oceans, focusing on areas that are under-studied, such as the deep sea and the ocean floor. Also, since the ocean is forever changing – both due to human activity and to natural processes – the catalog will need continual updating,” said Laiolo.  

Duarte cautions that despite its clear benefit, the future of the catalog is uncertain. A major obstacle is the status of international legislation on benefit-sharing from discoveries made in international waters.  

“While the 2023 Treaty of the High Seas offers some solutions, it may inadvertently impede research by reducing incentives for companies and governments to invest. Such uncertainty must be resolved now we have reached the point where genetic and artificial intelligence technologies could unlock unprecedented innovation and progress in blue biotechnology,” he concluded.

The article is part of a Frontiers in Science multimedia article hub featuring an explainer as well as an editorial, viewpoint, and policy outlook from other eminent experts: Prof Enric Sala (National Geographic Society, USA), Prof Andreas Teske (University of North Carolina at Chapel Hill, USA), and Peggy Rodgers Kalas (International Ocean Policy Advisor to the Oceano Azul Foundation, and former Director of the High Seas Alliance).  

 

Rice engineers propose hybrid urban water sourcing model


Reclaimed wastewater could make supply systems more resilient


Peer-Reviewed Publication

RICE UNIVERSITY

IMAGE: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio. (Credit: Rice University)





HOUSTON – (Jan. 16, 2024) – Houston’s water and wastewater system could be more resilient with the development of hybrid urban water supply systems that combine conventional, centralized water sources with reclaimed wastewater, according to a study by Rice University engineers published in Nature Water.

“Such a system will save energy and help reduce the use of freshwater, a commodity that is becoming critically important around the world,” said Qilin Li, professor of civil and environmental engineering and co-director of the Nanosystems Engineering Research Center for Nanotechnology Enabled Water Treatment (NEWT) at Rice. “Our research shows that such a system is also more resilient against disruptive events such as hurricanes and flooding. It exhibits lower severity, range of impact and duration of substandard performance when disruption happens.”

Leonardo Dueñas-Osorio, a professor of civil and environmental engineering at Rice and a co-author on the study, said that hybrid water supply systems are more resilient than conventional centralized systems.

“Using Houston’s municipal water system as a case study, we explore the impact of various disruptions including pump station failures, pipe leakage and source water contamination,” Dueñas-Osorio said.

Li and her collaborators have identified vulnerable components in Houston’s existing water and wastewater system and proposed mitigation strategies. The urgency is compounded by the city’s aging infrastructure.

Nationally, the typical age of failing water mains is about 50 years, while the average age of the 1.6 million miles of water and sewer pipes in the U.S. is 45 years, according to the study. Some six billion gallons of treated water, roughly 15% of the U.S. daily public water supply, are lost each day to leaky pipes.

In addition, large cities around the world face unprecedented challenges as global climate change, population growth and continuing urbanization contribute to rapid increases in water demand, triggering water access issues and a growing financial burden due to water systems’ maintenance and upgrade needs.

On a small scale, cities around the world, including El Paso, have already enabled reclamation and reuse of municipal wastewater for both drinking and such non-potable uses as irrigation.

“Decreasing dependence on already stressed surface and groundwater resources has become increasingly important,” Li said. “An important challenge is to figure out how to best implement wastewater reclamation activities. The goal is to enhance sustainability and resiliency of big city water infrastructure. Our recent research shows the benefits of decentralized wastewater treatment and reuse.”

The study’s co-authors include Lauren Stadler, assistant professor of civil and environmental engineering at Rice; Rice doctoral alum Xiangnan Zhou; and Lu Liu, assistant professor of civil, construction and environmental engineering at Iowa State University.

The research was supported by the National Science Foundation (1707117, 1449500).

-30-

This release was authored by Patrick Kurp and can be found online at news.rice.edu.

Follow Rice News and Media Relations via Twitter @RiceUNews.

Publication details:

Hybrid wastewater treatment and reuse enhances urban water system resilience to disruptive incidents | Nature Water | DOI: 10.1038/s44221-023-00166-6

Authors: Lu Liu, Xiangnan Zhou, Leonardo Dueñas-Osorio, Lauren Stadler and Qilin Li

https://www.nature.com/articles/s44221-023-00166-6#citeas

Image downloads:

https://news-network.rice.edu/news/files/2024/01/IMG_0500-ab60ea3de300407f.jpg
CAPTION: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio (Credit: Rice University)

https://news-network.rice.edu/news/files/2024/01/IMG_0499-4a97dabed6e6558e.jpg
CAPTION: Lauren Stadler (from left), Qilin Li and Leonardo Dueñas-Osorio (Credit: Rice University)

Links:

NEWT: https://newtcenter.org/
Department of Civil and Environmental Engineering: https://cee.rice.edu/
George R. Brown School of Engineering: https://engineering.rice.edu/
Dueñas-Osorio Research Group: https://duenas-osorio.rice.edu/
Li Research Group: https://qilinli.rice.edu/

About Rice:

Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation’s top 20 universities by U.S. News & World Report. Rice has highly respected schools of architecture, business, continuing studies, engineering, humanities, music, natural sciences and social sciences and is home to the Baker Institute for Public Policy. With 4,574 undergraduates and 3,982 graduate students, Rice’s undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction, No. 2 for best-run colleges and No. 12 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger’s Personal Finance.

 BRAZIL

Brumadinho dam collapse: The danger emerged after the decommissioning



Peer-Reviewed Publication

ETH ZURICH

IMAGE: The mudslide came without warning and destroyed the mine site, neighbouring settlements and a railway bridge, killing 270 people. (Photograph: Eric Marmor / IDF Spokesperson, CC BY-SA 3.0, Wikimedia Commons)




The disaster near the small town of Brumadinho in southeastern Brazil occurred shortly after midday: on 25 January 2019, at a nearby iron ore mine, the tailings dam – a storage area for the sludgy, fine-grained residues from ore processing, or “tailings” – collapsed. A huge mudslide of around 10 million cubic metres of liquefied tailings flooded the site of the mine, destroying neighbouring settlements and taking a railway bridge with it. At least 270 people were killed. The ecosystem of the Paraopeba River downstream of the mine was ruined. Although the dam had a monitoring system, no one had foreseen the disaster.

The Brumadinho dam collapse resulted in several lawsuits against the Vale mining company and the TÜV Süd inspection organisation. Shortly before the accident, the latter had certified that the dam was sufficiently stable. Vale was ordered to pay the equivalent of around 6 billion euros in damages. An investigation committee concluded that the accident was caused by slowly accumulating microscopic displacements (known as creep) in the deposited tailings layers, but did not provide the exact physical mechanism supporting this hypothesis. In particular, uncertainty remained as to why the dam broke in 2019 specifically – three years after the pond was last loaded with new tailings – and why no significant displacements had been detected prior to the collapse.

Physical mechanism explained

A study by Professor Alexander Puzrin, Head of the Institute of Geotechnical Engineering at ETH Zurich and an expert in landslide modelling, has now shed light on the Brumadinho disaster. The paper appears in the journal Communications Earth & Environment. The scientists used numerical and analytical models to investigate the causes of the dam failure, and they have identified a physical mechanism that explains the mining accident.

The tailings pond was built in 1976. Over the following years, the pond’s earth dam was raised several times by a few metres, as is customary in ore mining, to create additional space for the storage of processing residues. The dam’s steps were placed with an offset on top of each other, much like the steps of a staircase (the upstream principle). In the end, the dam consisted of ten steps and was 86 metres tall. When the structure failed in January 2019, the initial rupture occurred at the second dam level. As a result, all ten steps of the earth dam collapsed and, together with the liquefied accumulated tailings, flowed down into the valley as a mudslide.

Creep deformation after decommissioning

The work done by Professor Puzrin’s team now shows how this could happen. According to the new findings, some initial slip surfaces had already appeared in the dammed tailings at the height of the second dam step during the dam’s construction. As construction progressed, these slip surfaces increased in length but remained too short to cause a collapse. However, after the tailings dam was decommissioned in 2016, the ETH team’s modelling suggests, these surfaces continued to expand horizontally and eventually reached a critical length. As a result, the layers of tailings began to move, causing the dam to burst under their weight and triggering the deadly mudslide.

According to the model, what caused the slip surface to grow is known as creep deformation. These deformations in the fine-grained, brittle tailings are tiny, slowly accumulating displacements caused by uneven pressure distribution between the grains in the overlying deposits. “Other triggers such as rainfall and drilling can accelerate slip surface growth,” Puzrin says, “but our model shows that creep deformation alone is sufficient for the slip surface to reach the critical extent that can trigger a dam failure.”
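The failure logic described above can be caricatured with a toy calculation: a slip surface left over from construction keeps lengthening by creep after loading stops and fails once it passes a critical length. The linear growth law and every number below are invented purely for illustration; the ETH team’s actual numerical and analytical models are far more detailed.

```python
# Toy model of creep-driven slip-surface growth after decommissioning.
# The growth law and all numbers are invented for illustration only.

def years_to_failure(initial_length_m, growth_m_per_year, critical_length_m):
    """Years until a slip surface growing at a constant creep rate
    reaches the critical length (None if it is not growing)."""
    if growth_m_per_year <= 0:
        return None
    return max(critical_length_m - initial_length_m, 0.0) / growth_m_per_year

# Hypothetical scenario: a slip surface formed during construction keeps
# lengthening by creep alone after the pond stops receiving tailings.
print(years_to_failure(initial_length_m=60.0,
                       growth_m_per_year=5.0,
                       critical_length_m=75.0))   # -> 3.0 years
```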

The findings are worrying in two respects: the slip surface that caused the disaster apparently grew at a time when the pond was no longer being loaded with new tailings – in other words, without any additional external load; and the growth of the slip surface did not lead to any significant external deformation of the dam that the monitoring system could have recognised as alarming.

ETH model enables risk analysis

Tailings dams for processing residues from the mining of iron ore and other mineral rocks are used in large numbers worldwide. In each year since 2000, five to six cases have been recorded in which dams have failed or been damaged for various reasons. After the Brumadinho disaster and similar accidents, Brazil decommissioned tailings ponds with dams based on the upstream principle. However, the ETH study now shows that simply no longer loading a pond with new tailings may not avert the danger.

Dam breaches like the one in Brumadinho cannot yet be predicted using conventional monitoring systems. The ETH study opens up new options: “Our model can carry out a risk analysis for existing dams and predict the likelihood of a dam failure,” Puzrin says. If a high risk is identified, there are various plausible courses of action: The risk can be reduced by pumping the water out from the boreholes in the tailings ponds. Or the tailings dam can be dismantled. In urgent cases, endangered villages can be temporarily evacuated to protect the inhabitants until the danger has been averted.

Helping to make earth dams safer

The ETH study’s findings are relevant to all tailings dams for processing residues from ore mining. This is because whenever the residues consist of fine-grained, brittle material, slip surfaces can form in the worst case; the deposited material can then slide along them and damage the dam.

The situation is not directly comparable with reservoirs where water is retained by an earth dam. However, the new findings can also contribute to safety here, as Puzrin points out: “Our findings provide an indication of how to further improve the safety of earth dams in the event of an earthquake, which can generate an initial slip surface. In this respect, our work helps to make dams safer in general.”

IMAGE: During construction, short slip surfaces formed in the deposits of the tailings dam (above). These grew after decommissioning due to creep deformation (centre) and reached the critical length (bottom) that caused the collapse of the dam. (Graphic: IGT/ETH Zurich)

 

Extreme weather influencing mortgage payments, study finds


When cyclones and floods hit populated areas, people’s ability to pay their mortgage differs depending on the type and intensity of the extreme weather, research suggests


Peer-Reviewed Publication

UNIVERSITY OF EDINBURGH






The more powerful the cyclone, the greater the chance that borrowers will miss or delay mortgage payments, the study found.  

The likelihood of borrowers defaulting on payments or being discouraged from paying off a mortgage early can also increase with heavy rain – particularly in flood prone areas such as along coastlines, researchers found. 

As the effects of climate change become more extreme and frequent, the ability to factor such risks into mortgage calculations is of increasing importance to major financial institutions and borrowers, researchers say.

The team at the University of Edinburgh used data from almost 70,000 mortgages and more than 3.5 million single payments to model the impact of heavy rains and tropical cyclones on the probability of mortgage risk in Florida in the United States.

The study combined details about mortgage characteristics and performance over time with datasets on cyclones and intense rainfalls to see if it could help lenders predict whether loan payments will be paid and whether a loan will be paid off.

The intensity of the tropical cyclone had a statistically significant impact on the risk of a borrower defaulting on their payments.  

The likelihood of defaulting was more than double for a hurricane of category three or more compared with category two.

They also found heavy rains in areas where flooding was common decreased the likelihood that borrowers would pay back their mortgage early. They found no significant effect of tropical cyclones on borrowers’ willingness to prepay their mortgages.

Both scenarios have negative consequences for lenders and borrowers. Defaulting damages the credit scores of borrowers while early repayment affects the cash flows expected to be received by lenders, the researchers say.

The findings demonstrate that the inclusion of weather-related variables leads to more accurate default and prepayment predictions.

Professor Raffaella Calabrese, of the University of Edinburgh Business School, said: “The new credit scoring models we developed for the study showed it is possible to significantly improve the predictive accuracy of default and prepayments on mortgages when we include weather-related risks. Our results suggest that extreme weather leads to substantial changes in risk and against this background it seems necessary to systematically account for this in credit risk assessment.”
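As a rough illustration of how weather-related variables can be folded into a credit scoring model, the sketch below fits a logistic regression with and without hypothetical cyclone and rainfall features on simulated loans. The feature names, simulated data and model choice are assumptions made for illustration; they are not the authors’ specification, data or results.

```python
# Hedged sketch: augmenting a basic mortgage-default scoring model with
# weather covariates. All features, data and coefficients are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Conventional loan features (hypothetical)
loan_to_value = rng.uniform(0.5, 1.0, n)
interest_rate = rng.uniform(0.02, 0.08, n)   # included only as a control here

# Weather-related features (hypothetical): cyclone category experienced
# during the payment window, and a cumulative rainfall anomaly.
cyclone_category = rng.integers(0, 6, n)
rainfall_anomaly = rng.normal(0.0, 1.0, n)

# Simulated default outcome in which weather genuinely matters.
logit = -4 + 3 * loan_to_value + 0.6 * cyclone_category + 0.4 * rainfall_anomaly
default = rng.random(n) < 1 / (1 + np.exp(-logit))

X_base = np.column_stack([loan_to_value, interest_rate])
X_full = np.column_stack([loan_to_value, interest_rate,
                          cyclone_category, rainfall_anomaly])

for name, X in [("loan features only", X_base), ("with weather", X_full)]:
    model = LogisticRegression(max_iter=1000).fit(X, default)
    print(name, "in-sample accuracy:", round(model.score(X, default), 3))
```

On this simulated data the weather-augmented model scores higher in-sample, mirroring the study’s finding that including weather-related variables improves predictive accuracy.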

The study is published in the European Journal of Operational Research. An Open Access version of the paper is available here: https://www.research.ed.ac.uk/en/publications/impacts-of-extreme-weather-events-on-mortgage-risks-and-their-evo

Data for the study came from a mortgage dataset from Moody’s Analytics, which provides economic research regarding risk, performance and financial modeling.

The study was made possible through a collaboration between the University of Missouri-St Louis and Louisiana State University in the United States, Climate Finance Alpha, the Centre d’Economie de la Sorbonne in Paris, and Prometeia in Italy. The research was supported with funding from the Economic and Social Research Council (ESRC).

UK

Major research investment into national land use transformation


University of Leicester and The James Hutton Institute are co-leading a transdisciplinary hub looking to bridge the gap between science and policy to achieve Net Zero


Grant and Award Announcement

UNIVERSITY OF LEICESTER



  • University of Leicester and The James Hutton Institute are part of a winning consortium of 34 organisations awarded a £6.25 m government grant to establish a ‘Land Use for Net Zero’ (LUNZ) Hub.
  • The LUNZ Hub aims to provide all four UK administrations with the rapid evidence they need to develop policies that will drive the UK land transformation required to achieve Net Zero by 2050.
  • Consortium includes experts from research, farming and industry across issues including green finance, renewable energy, planning, soil health, afforestation and water management.
  • The announcement comes as the LUNZ Hub is launched at an event in Leicester today (16 January).

A first of its kind consortium of 34 leading research and stakeholder organisations has been established to help all four UK administrations address land use and agriculture as a major greenhouse gas emitting sector.

The ‘Land Use for Net Zero’ (LUNZ) Hub, co-led by The James Hutton Institute and the University of Leicester with £6.25 million in funding from UK Research and Innovation, will provide the UK and devolved administrations with timely evidence on land use, from renewable energy to soil carbon and green finance, to help drive the land transformations needed to achieve Net Zero by 2050.

It will also play a pivotal role in helping to communicate more widely the critical importance of land and how it’s used as a major carbon sink or source.  

Agriculture and land use have a major impact on greenhouse gas (GHG) emissions, as well as a wide range of other environmental, societal and economic outcomes, but progress towards decarbonisation is lagging behind other sectors. 

The declaration on sustainable agriculture, resilient food systems and climate action recently announced at COP28 states the UK government’s intent to act on land use and climate change by increasing public financial support and scaling science-based solutions, and LUNZ will be a key conduit for these actions.

Achieving the transformational change in land management needed will depend on government access to world-class research and innovation and a novel approach to collaboration across a variety of critical stakeholders.

Hub co-lead of the winning Consortium, Professor Lee-Ann Sutherland (The James Hutton Institute), explained:

“The science behind land use is highly complex.  It is influenced by a range of economic, social and environmental factors, and complicated further by a changing evidence base, novel market forces, the emergence of new data and models, and disruptive technologies such as artificial intelligence. Our aim is to bridge the gap between researchers and policy makers and our work will be focused on meeting specific policy-maker needs, giving them the evidence they need in the format and timeframe they need it. 

“Our Consortium has developed a series of innovative mechanisms to do just that – an Agile Policy Centre, Net Zero Futures Platform, and Creative Methods Lab – each tailored to generate clear, robust answers to urgent questions.”

Equally novel is the approach to stakeholder participation in the Hub, as Hub Co-lead, Professor Heiko Balzter (University of Leicester), explained:

“Creating a fair, realistic path to Net Zero in the land use sector can only be achieved with the involvement of a wide range of stakeholders throughout the process – to provide their expertise, share the Hub’s outputs and ensure its proposals work in practice as well as theory. 

“Our consortium reflects this – ranging from those at the cutting edge of climate change modelling to farmers groups, advisory organisations, non-governmental organisations and an arts collective.  Their range and profile will ensure the Hub’s impact extends throughout society – so everyone can engage in land use transformation – from the food they buy to their holiday, housing and investment decisions.”

At the heart of the challenge is understanding how transformative change can be achieved and predicting the impact of proposed approaches against multiple environmental, societal and economic outcomes.  A central strand of the Hub’s approach will be the development of plausible and innovative net zero scenarios and associated pathways – novel tools based on advanced modelling methodologies that can predict the impacts of different policy interventions across a variety of metrics.