Tuesday, April 01, 2025

 

ECMWF unveils groundbreaking ML tool for enhanced fire prediction



Europe’s leading weather forecasting centre, ECMWF, takes a major leap forward in fire prediction using cutting-edge data-driven technologies




ECMWF

Los Angeles Wildfires 

image: 

ECMWF's new data-driven fire danger model, the Probability of Fire (PoF), is designed to go beyond using only weather parameters. On 7 January 2025, the 5-day-ahead forecast for Southern California from the PoF model, which incorporates multiple data sources (left), shows a better localisation of ongoing fires compared to the traditional, weather-based Fire Weather Index (FWI). The FWI, used as the headline metric in the Global Wildfire Information System (GWIS) and many other regional early warning systems, is shown on the right for the same date. Active fire observations from satellite instruments (MODIS sensor – triangles, VIIRS sensors – circles) for 7 January 2025 are overlaid on both maps. Source: ECMWF


Credit: ECMWF. First published in Nature Communications: https://www.nature.com/articles/s41467-025-58097-7




The ability to predict wildfires, such as those that recently devastated Los Angeles and Canada, is advancing rapidly with the help of machine learning and high-quality data. A new paper, published today (Tuesday 1 April, 16:00 BST | https://www.nature.com/articles/s41467-025-58097-7) in Nature Communications, highlights how the collection and integration of higher-quality data can significantly improve the accuracy and reliability of wildfire predictions.

The paper evaluates how ECMWF's new data-driven fire danger forecasting model, the Probability of Fire (PoF), performed in 2023 and in recent extreme events. ECMWF has been producing fire danger forecasts since 2018 as part of the Copernicus Emergency Management Service (CEMS), led by the Joint Research Centre of the European Commission. In recent years, it has developed innovative approaches using machine learning methods. This effort has moved ECMWF from predicting fire danger – a measure of landscape flammability – to forecasting fire activity. These new products are distributed to the Copernicus Emergency Management Service and are accessible to ECMWF Member States.

“Wildfire prediction as a field of research has been active for decades, which led to the establishment of early warning systems in the 1970s,” explains lead author Dr Francesca Di Giuseppe. “Our new Probability of Fire model incorporates multiple data sources beyond weather to refine predictions. Thanks to a machine learning algorithm, it takes a more holistic approach. Traditional weather-based fire danger indices often fail to pinpoint areas at risk of ignition with enough specificity. This is where ML can help.”

She explains that in the case of the Los Angeles fires, for example, traditional weather forecasts identified broad areas as very flammable but did not accurately target the most probable ignition hotspots. By incorporating additional parameters beyond weather, such as human presence, development indices like road density, and, most importantly, vegetation abundance and dryness, the model can pinpoint the areas most likely to experience ignition, avoiding over-prediction and providing a more accurate, targeted assessment of fire risk. “Being able to add all these elements with ML helps refine predictions. For example, we can exclude areas that are hot and dry but unlikely to experience ignition - either because people are not present or there is no fuel to burn,” states Di Giuseppe.

This comprehensive approach has proven to be more accurate in identifying fire-prone areas, as evidenced by the Southern California fire danger forecast for 7 January 2025. In this event, the Probability of Fire model provided a far more localised and accurate assessment of high fire danger in the areas where fires occurred than the traditionally used Fire Weather Index. By accounting for additional parameters, the model captured the intricate dynamics that drive fire risk.

Florence Rabier, Director-General of ECMWF, comments: "As witnessed in recent years with some devastating fires, like those in Portugal, Greece and Canada, improving fire forecasts through better data and AI integration will be a game-changer in the years to come. The new Probability of Fire tool has benefited from ECMWF's expertise in AI and ML for medium-range weather predictions (3 to 15 days), and the experts involved have made significant advances in fire prediction using similar data-driven methods. Although fire prediction is a challenging subject as ignition remains an unpredictable process, agencies in charge of providing information, like the European Commission’s Joint Research Centre, now have access to improved tools to help better protect lives, livelihoods, and ecosystems."

The Los Angeles fires provide a stark example of the increased wildfire threat. The period leading up to the fires in 2024 saw unusually wet conditions facilitating rapid vegetation growth, followed by an exceptionally dry autumn and early winter. This pattern, known as ‘hydroclimate whiplash’, is being amplified by climate change. In LA, it produced an abundance of dry, flammable vegetation, creating the perfect conditions for the catastrophic fires.

"Understanding these patterns is crucial to accurate fire prediction," says Joe McNorton, another ECMWF expert contributor to the study. "In this research, we found that high-quality data, such as information on vegetation moisture and fuel availability, is the most important factor for improving forecast accuracy. The Probability of Fire model's ability to capture changes such as the ‘whiplash’ effect demonstrates how ML-driven models are increasingly crucial for accurate wildfire predictions and emergency preparedness.”

The study found that one of the most critical elements in fire prediction is knowledge of fuel availability. Including all data sources improves the model’s predictive skill by up to 30% compared with using weather alone. Fuel information is unlike weather data in that it is not easily obtained through direct observations or prediction systems. However, ECMWF has used its weather forecasting expertise and data from the EU’s Copernicus Atmosphere Monitoring Service (CAMS) to construct a modelling framework to extract this critical information.
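The effect of combining data sources can be sketched with a toy calculation. The feature names, scales, and multiplicative combination below are invented for illustration; the real PoF model is a trained machine-learning system, not this formula:

```python
# Toy sketch: a weather-only danger score vs. a fuel- and ignition-aware score.
# All features, scales, and the combination rule are illustrative only.

def weather_only_danger(temperature_c, wind_kmh, humidity_pct):
    """Crude weather-only danger score in [0, 1], FWI-like in spirit."""
    heat = min(temperature_c / 40.0, 1.0)
    wind = min(wind_kmh / 80.0, 1.0)
    dryness = 1.0 - humidity_pct / 100.0
    return round((heat + wind + dryness) / 3.0, 3)

def probability_of_fire(weather, fuel_load, fuel_dryness, ignition_proxy):
    """Weather danger modulated by fuel availability/dryness and an
    ignition proxy (e.g. population or road density), each in [0, 1].
    No fuel or no ignition source drives the score toward zero."""
    return round(weather * fuel_load * fuel_dryness * ignition_proxy, 3)

# Two grid cells with identical hot, dry, windy weather:
w = weather_only_danger(38, 60, 15)  # same weather-only danger for both cells

remote_bare = probability_of_fire(w, fuel_load=0.05, fuel_dryness=0.9,
                                  ignition_proxy=0.1)    # little fuel, few people
wildland_urban = probability_of_fire(w, fuel_load=0.8, fuel_dryness=0.9,
                                     ignition_proxy=0.7)  # fuel plus ignition sources

print(w, remote_bare, wildland_urban)  # 0.85 0.004 0.428
```

A weather-only index flags both cells equally; folding in fuel and ignition information separates them, which mirrors the behaviour Di Giuseppe describes.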

The research findings also reveal that the integration of ignition sources and fuel status is more critical for fire prediction than complex ML algorithms. This insight provides an opportunity for centres with less computational capacity to implement their own predictive systems, provided they have access to reliable data for training.

Lead Author Francesca Di Giuseppe concludes: "By focusing on high-quality data, even smaller agencies can implement effective fire prediction systems. This is a key takeaway for the global community in our ongoing efforts to combat wildfires."

With wildfires becoming increasingly frequent and severe, driven by climate change, ECMWF's work is positioning Europe at the forefront of ML-driven natural hazard prediction. As wildfires continue to intensify globally, this research emphasizes the importance of better data integration and AI tools in forecasting, helping to reduce the catastrophic impacts of these extreme events.

ECMWF's work in fire prediction is funded by the EU Commission's Joint Research Centre (JRC), which coordinates emergency preparedness for fires across Europe. 

Scientists from the European Commission’s Joint Research Centre (JRC), which is in charge of fire danger prediction services for wildfires in Europe, concluded: "The integration of AI and data-driven models is key for the advancement of fire prediction systems across Europe. ECMWF and the JRC share a commitment to using the best available science and technology to protect lives and manage the growing wildfire threat in an increasingly volatile climate.”

ENDS

Further information

On 7 January 2025, a series of devastating wildfires started in Los Angeles. Aided by extensive drought conditions and powerful Santa Ana winds, the fires quickly spiralled out of control. The fires had significant impacts in the wildland–urban interface, where human development meets natural vegetation. The flames devastated thousands of buildings. Lives were tragically lost, and nearly 200,000 residents were forced to evacuate. The Palisades and Eaton fires burned for weeks, overwhelming firefighting efforts. With economic losses projected in excess of $200 billion, this disaster ranks among the costliest in modern United States history.

Video material available at https://vimeo.com/user/102934096/folder/24739018 (private listing under embargo until 1 April 16:00 BST)

Paper: DOI 10.21957/823c252b4f (not active until embargo lifted)

 

 

The food and fuel that farms itself



Cold Spring Harbor Laboratory
Duckweed at Prospect Park Brooklyn 

image: 

You may have encountered duckweed thousands of times. The tiny aquatic plant can grow practically anywhere there’s standing water and sunlight, including here at Brooklyn’s iconic Prospect Park


Credit: Evan Ernst/CSHL




Under the right conditions, duckweed essentially farms itself. Wastewater, ponds, puddles, swamps—you name it. If there’s enough sunlight and carbon dioxide, the aquatic plant can grow freely. But that’s not all that makes it intriguing. Packed inside duckweed’s tiny fronds is enormous potential as a soil enricher, a fuel source, a protein-rich food, and more. New findings at Cold Spring Harbor Laboratory (CSHL) could help bring all that potential to life.

CSHL Professor and HHMI Investigator Rob Martienssen and Computational Analyst Evan Ernst started working with duckweed over 15 years ago. They see their latest research as one of the most important and eye-opening studies on the plant to date. The team has developed new genome sequences for five duckweed species. The sequences reveal several genes that—when present or absent—may be behind the plant’s unique traits and versatility. Martienssen explains:

“The use of cutting-edge technology allowed us to make a catalog of genes that was extremely accurate. We could tell exactly which genes were there and which were not. A lot of genes that are missing are responsible for features of the plant—open stomata or the lack of roots. We could identify genes that were responsible for each trait.”

Stomata are pores on the surface of plants. They’re crucial for taking in carbon dioxide and releasing oxygen. Open stomata allow for greater intake, making them valuable for carbon capture technology. A lack of roots in some species further increases duckweed’s potential, making it easier for the plant to thrive in any watery environment.

Other species possess traits that showcase duckweed’s potential as a food and fuel source. Some traits promote high protein production, allowing for use as animal feed. Others promote starch accumulation, making the plant ripe for biofuel production. Several industries have taken notice. For now, they’re mostly concerned with the duckweed growing in their backyards. Ernst explains:

“Duckweed agriculture is in a nascent stage. Commercial growers are working with different species in the field, evaluating them in their own local situation. There’s so much variation within one species of duckweed—as much as you can find across all the species. So, having multiple genomes for multiple species is critical.”

Martienssen and Ernst hope their genomes will open the door to a new world of commercial applications. That said, their research may tell us as much about the plant’s past. Their study hints at how duckweed split off into different species 59 million years ago. Earth’s climate was quite extreme back then, so duckweed’s genes just might say something about the planet’s future, too.

Genes required for chromosomal small RNA are missing in certain duckweed species. This may explain the emergence of vigorous inter-species hybrids with three, rather than two, copies of each chromosome.

Credit

Martienssen lab/CSHL

 

Patient- and Community-Level Characteristics Associated With RSV Vaccination



JAMA Network Open





About The Study: Knowledge of respiratory syncytial virus (RSV) disease and RSV vaccine eligibility was low in this cross-sectional study of hospitalized adults. Older adults and those with certain medical conditions were more likely to have received the vaccine, suggesting appropriate prioritization, but sociodemographic differences in vaccine uptake occurred.



Corresponding Author: To contact the corresponding author, Diya Surie, MD, email dsurie@cdc.gov.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2025.2841)

Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

#  #  #

Embed this link to provide your readers free access to the full-text article. The link will be live at the embargo time: https://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2025.2841?guestAccessKey=c0957767-f5eb-4d6d-88a4-15c747418b57&utm_source=for_the_media&utm_medium=referral&utm_campaign=ftm_links&utm_content=tfl&utm_term=040125

About JAMA Network Open: JAMA Network Open is an online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication. 

 

HKUST researchers set to transform food production with AI-enhanced 3D food printing solution





Hong Kong University of Science and Technology
HKUST Researchers Develop Advanced 3D Food Printing with Simultaneous Cooking and Generative AI Design 

video: 

HKUST Researchers Develop Advanced 3D Food Printing with Simultaneous Cooking and Generative AI Design


Credit: HKUST




Researchers at the Hong Kong University of Science and Technology (HKUST) have developed a cutting-edge AI-assisted 3D food printing solution that combines printing with infrared cooking, paving the way for safer, more efficient, and visually appealing food production.

Traditional 3D food printing methods often require additional post-processing steps, which can result in unappealing textures, imperfect shapes, and even potential microbial contamination.

To address these challenges, the team from the Division of Integrative Systems and Design (ISD) at HKUST has developed an AI-enhanced system that combines extrusion-based printing with simultaneous infrared heating for on-the-fly cooking of intricate starch-based foods. Using graphene heaters for cooking, they precisely controlled the cooking process, ensuring that starch-based food items retain their intended shape and quality.

The system is supported by AI-assisted design, which employs generative algorithms and Python scripts to craft intricate food patterns. This use of AI makes the design process accessible even to computer novices.
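As an illustration of what generative, script-based food pattern design can look like, here is a minimal sketch that generates an extrusion toolpath for a rose-curve pattern. The pattern, function name, and parameters are invented for illustration and are not taken from the paper:

```python
import math

def rose_toolpath(petals=5, radius_mm=30.0, points=720):
    """Generate (x, y) points of a rose curve r = R*cos(k*theta),
    a simple parametric pattern a printer nozzle could trace."""
    path = []
    for i in range(points):
        theta = 2 * math.pi * i / points
        r = radius_mm * math.cos(petals * theta)
        path.append((round(r * math.cos(theta), 2),
                     round(r * math.sin(theta), 2)))
    return path

path = rose_toolpath()
print(len(path), path[0])  # 720 (30.0, 0.0)
```

Changing `petals` or `radius_mm` yields an entirely different pattern without any manual drawing, which is the appeal of parametric, generator-driven design for non-experts.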

The research tackles issues such as shape retention and contamination risks and opens up exciting prospects for tailored nutrition—benefiting individuals with special dietary needs, including patients with dysphagia. From enhancing meal customization in elderly care centers and central kitchens to meeting special dietary needs and enabling creative culinary experiences in restaurants, the technology offers a versatile solution for various sectors.

“With its potential to streamline food production processes, improve food quality, and cater to individual preferences, this innovation can transform how food is prepared and served in diverse settings, paving the way for a future where personalized and visually appealing food creations are more accessible than ever before,” said Prof. Mitch LI Guijun, Assistant Professor of ISD, who led the research team.

“We’re excited about the potential of this technology to deliver customized, safe, and delicious food with a process that is both efficient and accessible. It represents a significant step forward in how we think about food creation,” Prof. Li added.

Connie LEE Kong-Wai, the paper’s first author and a PhD student at HKUST, said: “We’ve reimagined what 3D food printing can achieve by merging technology and culinary creativity. Our cutting-edge integrative 3D food printing technology can potentially revolutionize personalized food creation.”

The research represents a collaborative effort spanning user-centric design, mechanical engineering, food science, chemistry, and AI. This cross-disciplinary approach brought together diverse expertise to tackle the complex challenges associated with 3D food printing.

Looking ahead, the team plans to refine the technology by examining the preservation of heat-sensitive vitamins and optimizing starch digestibility. Future studies will also focus on consumer acceptance through sensory evaluations involving target users such as children or hospital caretakers, ensuring that the system is ready for real-world applications.

The study, “Advanced 3D Food Printing with Simultaneous Cooking and Generative AI Design”, was recently published in the top-tier international journal Advanced Materials.


The integrative 3D food printer and AI digital food design platform


Produced by the 3D food printer, the starch-based food items retain their intended shape and quality.

a) Schematic diagram and scanning electron microscopy image of the LIG heating film. b) Optical microscope images of the LIG heating film. c) Raman spectra of the LIG heating film. d) Fourier transform infrared spectroscopy of the LIG heating film. e) Absorption spectrum of the printed dough sample.

a–d) SEM images at ×100 magnification of starch-based dough samples processed with air-frying, oven-baking, IR cooking, and laser cooking (from left to right). e–h) SEM images at ×2000 magnification of starch-based dough samples processed with air-frying, oven-baking, IR cooking, and laser cooking (from left to right). i–k) Reconstructed micro-computed tomography 2D cross-section images of the printed dough with different cooking methods. l) Porosity rate (%) of the printed dough with different cooking methods. m) Printability test using a cylindric shape of 30 × 30 × 30 mm as the testing model. n) Shape fidelity value (%) of untreated and IR-cooked samples in terms of length, width, and height. o) 3D printing process of the starch-based dough without heat treatment for 30 min. p) 3D printing process of the starch-based dough with LIG treatment for 30 min.

Credit

HKUST

 

Do authoritarian narratives shape Japanese public opinion?



Researchers find that Japanese voters are broadly susceptible to illiberal narratives from China and Russia, raising concerns about foreign influence




Waseda University

The battle of narratives 

image: 

Illiberal narratives from China and Russia are more persuasive than democratic ones, with broad susceptibility across Japanese society.


Credit: Tetsuro Kobayashi from Waseda University, Japan




An authoritarian regime is a system of government in which power is concentrated in the hands of a single leader or group, limiting citizens’ participation in decision-making. As authoritarian states like China and Russia intensify their global information campaigns, a new study highlights Japan’s vulnerability to illiberal narratives. This research, led by Professor Tetsuro Kobayashi of Waseda University, with Research Associate Yuan Zhou of Kobe University, Ph.D. student Lungta Seki of Koç University, and Professor Asako Miura of Osaka University, was published online on March 12, 2025, in the journal Democratization. The study found that these narratives are more persuasive than mainstream democratic messaging, pointing to potential risks to Japan’s political discourse.

Kobayashi and team examined how Japanese citizens respond to narratives propagated by authoritarian regimes compared to those dominant in democratic societies. Using an online survey experiment, they exposed participants to both types of narratives and measured shifts in opinion. The results were striking: Illiberal narratives exerted a stronger influence than mainstream democratic narratives, even when their sources were explicitly identified. “The results showed that both illiberal and mainstream narratives produced persuasive effects in the intended direction; however, the illiberal narratives tended to have a greater impact than mainstream narratives,” writes Kobayashi.

This finding is particularly concerning given the study’s second key result: The expected moderating effects of political knowledge, authoritarian tendencies, and belief in conspiracy theories were absent among the participants. Unlike a similar study conducted in Germany, where individuals with low political knowledge and strong authoritarian leanings were more vulnerable, Japanese respondents were broadly susceptible to illiberal narratives regardless of their political profile. “The persuasive effects of illiberal narratives were not moderated by the political knowledge of the participants. This suggests that it is not only specific segments of the Japanese population that are influenced by illiberal narratives, but rather that Japanese individuals are broadly susceptible,” writes Kobayashi.

When both democratic and illiberal narratives were introduced together, their effects largely canceled each other. However, when an illiberal narrative followed a democratic one, it retained its persuasive effect. This means that once an illiberal narrative enters public discourse, it may be more difficult to counteract, even in a country where democratic narratives are dominant. This suggests that mainstream narratives do not effectively inoculate the public against later exposure to illiberal messaging.

The implications extend beyond Japan’s domestic politics. As Japan plays a crucial role in the US-led liberal international order, the infiltration of illiberal narratives could have consequences for its foreign policy alignment. If public opinion in Japan shifts in favor of authoritarian states, it could weaken Japan’s support for democratic alliances and international norms.

This raises concerns about Japan’s resilience against cognitive warfare. As authoritarian states refine their influence operations, Japan may need stronger measures to counteract manipulative narratives. “The study highlights the urgency of strengthening democratic resilience by honing effective counternarratives and increasing public awareness of foreign influence tactics,” concludes Kobayashi. If left unaddressed, the spread of illiberal narratives could undermine trust in democratic institutions and shift public opinion in ways that favor authoritarian interests, underscoring the need to foster media literacy and safeguard democratic discourse in an increasingly contested information space.

 

***

 

Reference
DOI: 10.1080/13510347.2025.2475472

 

Authors: Tetsuro Kobayashi1, Yuan Zhou2, Lungta Seki3, and Asako Miura4

 

Affiliations:

1School of Political Science and Economics, Waseda University

2Graduate School of Law, Kobe University

3Graduate School of Social Sciences and Humanities, Koç University

4Graduate School of Human Sciences, Osaka University

 

About Waseda University
Located in the heart of Tokyo, Waseda University is a leading private research university that has long been dedicated to academic excellence, innovative research, and civic engagement at both the local and global levels since 1882. The University has produced many changemakers in its history, including nine prime ministers and many leaders in business, science and technology, literature, sports, and film. Waseda has strong collaborations with overseas research institutions and is committed to advancing cutting-edge research and developing leaders who can contribute to the resolution of complex, global social issues. The University has set a target of achieving a zero-carbon campus by 2032, in line with the Sustainable Development Goals (SDGs) adopted by the United Nations in 2015. 

To learn more about Waseda University, visit https://www.waseda.jp/top/en  

 

About Professor Tetsuro Kobayashi
Dr. Tetsuro Kobayashi is a Professor at Waseda University's School of Political Science and Economics. He earned his Ph.D. in social psychology from the University of Tokyo in 2009. Before joining Waseda University, he held positions at the National Institute of Informatics in Japan and the City University of Hong Kong. His research interests encompass political communication, political psychology, and public opinion. Kobayashi's research has been published in prestigious peer-reviewed journals spanning the fields of political science, communication, and psychology.

 

‘Unprecedented’ recent floods swamped by previous highs




University of Exeter





Recent “unprecedented” floods are not exceptional if we look further into the past, new research shows.

A team of scientists – led by the University of Exeter – used geological palaeo-flood records to examine extreme floods in western Europe over several thousand years.

The study finds many previous floods exceeded recent extremes, highlighting the need to use these palaeo records – not just river gauge data that typically exists for the last century or less.

The researchers challenge the idea that recent floods can be attributed solely to greenhouse gas emissions, but they warn that the combination of natural extremes and global warming could lead to truly extraordinary floods.

“In recent years, floods around the world – including in Pakistan, Spain and Germany – have killed thousands of people and caused enormous damage,” said Professor Stephan Harrison, from the Department of Earth and Environmental Sciences at Exeter’s Penryn Campus in Cornwall.

“Such floods are seen as ‘unprecedented’ – but if you look back over the last few thousand years, that’s not the case.

“In fact, floods we call unprecedented may be nowhere near the most extreme that have happened in the past.”

Palaeo-flood records use a range of evidence including floodplain sediments, dating sand grains and past movement of boulders to identify past extremes.

Professor Harrison added: “You need that knowledge of the past if you’re going to understand the present and make predictions about the future.

“Coupling evidence of past extremes with the extra pressure now being added by human-caused global warming – which causes more extreme weather – you see a risk of genuinely unprecedented floods emerging.”

Projects such as housing and infrastructure are built to be resilient to extreme floods – based on assumptions such as a “one-in-200 year” or “one-in-400 year” flood event.

“If we rely on relatively short-term records, we can’t say what a ‘one-in-200 year’ flood is – and therefore our resilient infrastructure may not be so resilient after all,” said Professor Mark Macklin.

“This has profound implications for flood planning and climate adaptation policy.”
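The arithmetic behind that concern is simple. For a flood with a true return period of T years, the annual exceedance probability is 1/T, and the chance of observing at least one such flood in an n-year record is 1 − (1 − 1/T)^n. A short sketch (the 72-year figure is the Severn monitoring period from the study; the rest is standard return-period arithmetic):

```python
# Chance of seeing at least one T-year flood in an n-year record:
# P = 1 - (1 - 1/T)**n

def chance_of_at_least_one(return_period_years, record_years):
    p_annual = 1.0 / return_period_years
    return 1.0 - (1.0 - p_annual) ** record_years

# A 1-in-200-year flood is more likely absent than present in a
# 72-year gauge record like the Severn's:
print(round(chance_of_at_least_one(200, 72), 3))   # 0.303

# Yet over a structure's 100-year design life the risk is substantial:
print(round(chance_of_at_least_one(200, 100), 3))  # 0.394
```

A gauge record spanning only decades can therefore easily miss the very events that matter most for design, which is the gap palaeo-flood records fill.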

The study examined palaeo-flood records for the Lower Rhine (Germany and Netherlands), the Upper Severn (UK) and rivers around Valencia (Spain).

In the Rhine, records for about 8,000 years show at least 12 floods that are likely to have exceeded modern peaks.

The Severn analysis shows that floods in the last 72 years of monitoring are not exceptional in the context of palaeo-flood records of the last 4,000 years.

The largest flood in the Upper Severn occurred in about 250 BCE and is estimated to have had a peak discharge 50% larger than the damaging floods in the year 2000.

The paper, published in the journal Climatic Change, is entitled: “Robust climate attribution and recurrence probabilities of modern floods needs palaeoflood science.”