Wednesday, March 19, 2025

 

AI technique boosts climate change defenses




Princeton University, Engineering School
Floating house 


A house, being relocated to higher ground, is transported by barge through Little Egg Inlet along New Jersey’s southeastern coast.


Credit: Matthew Drews, Rutgers University




Researchers from Princeton and Rutgers University have used reinforcement learning, a method frequently deployed to train artificial intelligence, to show how flexible responses can substantially increase the cost-effectiveness of steps to defend cities like New York against climate change.

The research is part of an attempt to grapple with the effort to make expensive, long-term investments to mitigate the impacts of climate change. The substantial uncertainty related to long-term climate change makes it difficult for political leaders to make investments now that are designed to protect citizens for decades or longer. The difficulty is enhanced by the vast number of variables that go into any such decision and by the fact that the variables are likely to shift in unforeseen ways.

In a March 18 article in the Proceedings of the National Academy of Sciences, the researchers looked at flooding, which has caused increasing damage along the coastal United States and around the world. Governments are building coastal defenses against flooding, but they cannot rely on past conditions to guide defenses that will be needed in the future.

“Defenses are being built to protect coastal regions for the next few decades or longer,” said co-author Ning Lin, a professor of civil and environmental engineering at Princeton. “Climate projections are largely uncertain over long time horizons.”

Lin said that to deal with this uncertainty, planners must be flexible and ready to adapt their plans to future observation of climate conditions. Although this is extremely challenging because of the complexity of climate science, Lin said that harnessing advances in data science can provide an effective strategy.

Robert Kopp, a co-author of the study and a distinguished professor of Earth and Planetary Sciences at Rutgers, said uncertainty about the impact of melting ice sheets on sea levels has led to “controversy about how planners should consider the possibility of rapid ice-sheet loss.”

Kopp said that flexible approaches can help communities prepare for worst-case scenarios without paying too much for protection. “Planning for high-end sea-level rise costs a lot, and there’s a good chance it won’t be necessary, but failing to plan for it can be devastating,” he said.

In the PNAS article, the team describes how they simulated efforts to defend Manhattan against sea level rise through the end of this century. The goal was to determine whether any decision-making process that systematically incorporates observations and updating would prove superior to others over such a long period of time. To do this, the researchers simulated decisions by city planners in 10-year intervals up to the year 2100. The researchers compared their decision-making process with existing methods. For example, using the static method of building a seawall for a historic 100-year flood plus a sea-level-rise projection — as currently applied by New York and other coastal cities — is one method; designing a dynamic seawall that will be increased in height over time according to projected future climate change is another.

The researchers graded each method by its efficiency — the cost of defenses plus the estimated damage caused by flooding. For example, spending $10 million on a seawall that allowed $50 million in property damage ($60 million cost) would be less efficient than spending $30 million on a seawall with $15 million in property damage ($45 million cost). 
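The efficiency comparison above is simple arithmetic; a minimal sketch using the illustrative dollar figures from the example (in millions, not data from the paper):

```python
def total_cost(defense_cost, expected_damage):
    """Efficiency metric described above: cost of defenses plus
    estimated flood damage; the lower total is more efficient."""
    return defense_cost + expected_damage

# Illustrative figures from the example, in millions of dollars
option_a = total_cost(10, 50)  # cheap seawall, high residual damage
option_b = total_cost(30, 15)  # costlier seawall, less damage
best = min(option_a, option_b)
```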

The researchers found that calculations of a dynamic seawall based on reinforcement learning that systematically incorporates observations of sea level rise over time increased efficiency when compared with other methods (detailed in the paper). Compared with other systems, reinforcement learning lowered costs by 6-36% in a scenario modeling climate change under moderate carbon emissions. For a high emissions scenario, the decrease was 9-77%.

Reinforcement learning is a type of machine learning in which a program makes decisions and receives positive reinforcement based on results. Designers train the program by running it through vast amounts of simulated decisions, and it learns by trial and error rather than through explicit instructions from programmers. This is essentially the way many AI systems operate. It is particularly effective for situations that are extremely complex and subject to rapid changes over time. Computer scientists have used reinforcement learning to train AI to perform tasks such as playing chess, driving cars, and controlling robots and drones. The method has also been used for large systems used to store power or control water supplies.
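As a rough illustration of this kind of setup (not the authors' model: the states, actions, and cost numbers below are invented for clarity, and the paper's simulations draw on roughly 80,000 sea-level scenarios), a tabular Q-learning agent choosing decadal seawall raises might be sketched as:

```python
import random

DECADES = 8                 # 10-year decision intervals out to 2100
SEA_STATES = 5              # discretized observed sea-level-rise bins
ACTIONS = [0.0, 0.5, 1.0]   # metres added to the seawall this decade
BUILD_COST = 20.0           # cost per metre of wall raised (arbitrary units)
DAMAGE = 100.0              # expected flood damage if the wall is too low

def step(sea, wall, action):
    """Advance one decade: build, then sea level rises stochastically."""
    wall += ACTIONS[action]
    cost = BUILD_COST * ACTIONS[action]
    sea = min(SEA_STATES - 1, sea + (1 if random.random() < 0.6 else 0))
    if wall < sea * 0.5:          # wall too low for the observed rise
        cost += DAMAGE
    return sea, wall, -cost       # reward = negative total cost

def train(episodes=20_000, alpha=0.1, gamma=0.95, eps=0.1):
    # Q-table keyed by (decade, observed sea state, wall-height bin)
    Q = {}
    def q(s, a):
        return Q.get((s, a), 0.0)
    for _ in range(episodes):
        sea, wall = 0, 0.0
        for decade in range(DECADES):
            s = (decade, sea, round(wall * 2))
            a = (random.randrange(len(ACTIONS)) if random.random() < eps
                 else max(range(len(ACTIONS)), key=lambda x: q(s, x)))
            sea, wall, r = step(sea, wall, a)
            s2 = (decade + 1, sea, round(wall * 2))
            best_next = max(q(s2, x) for x in range(len(ACTIONS)))
            Q[(s, a)] = q(s, a) + alpha * (r + gamma * best_next - q(s, a))
    return Q
```

Over many simulated futures, the agent learns, for each decade and observed sea level, whether raising the wall now is worth the construction cost — the trial-and-error updating described above, without explicit instructions from the programmer.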

For their case study, the researchers looked at defenses proposed for low-lying areas in Manhattan. After Hurricane Sandy caused tremendous damage in New York and New Jersey in 2012, the U.S. Army Corps of Engineers proposed constructing a series of seawalls to defend Manhattan called the Big U. Some sections of the Big U are being built and others are in the planning stage. But a final completion plan for the entire system has not yet been set.

The study looked at proposals for the Big U and how defenses against coastal flooding should be changed to respond to future threats to New York from the sea. Flooding is driven both by sea level rise and storms. Both are affected by climate change, which in turn is impacted by global carbon emissions. The researchers wanted to evaluate decision methods every 10 years and allow for adjusting the Big U to reflect available threat data at each interval.

Because the impacts of climate change over a long period remain uncertain, the researchers wanted to study methods of making the best decisions for future changes with the data available at the time. Traditionally, engineers have built protective systems like seawalls and levees to resist historic floods, building protection from floods that would occur only once in 50 or 100 years. But because the climate is changing, such systems no longer work. Building the Big U seawalls to match the highest storm surge over the past 100 years would leave Manhattan vulnerable as climate change drives higher storm surge levels.

The researchers looked at a number of decision-making methods that take into account changing conditions. Most methods allowed for changes based on key variables and some allowed planners to make future projections that would also influence decisions. For example, as a temporary measure, residents could flood-proof their homes (“accommodate”), but eventually, higher sea levels would necessitate a high seawall (“protect”). In some cases, a seawall could prove too costly to protect everyone and residents would be encouraged or compelled to leave threatened communities (“retreat”). In a situation with only a few variables to account for, the benefits of a plan can be estimated relatively easily. But uncertain climate change presents an extremely complex scenario. The researchers showed that reinforcement learning can be used to design integrated strategies, including retreating from low-lying areas, protecting property further inland at higher elevation, and accommodating in between, with a 5-15% reduction in cost compared to the one-dimensional seawall strategy.

Lin, one of the lead researchers, said defending Manhattan is not only complex, but it also requires making difficult decisions under uncertain conditions. For each time interval, planners must make decisions based on observed sea level rise and roughly 80,000 scenarios of future sea level rise and corresponding decision reactions. The difficulty of the decisions compounds as the number of intervals increases.

While climate adaptation decisions are not simple, Lin said reinforcement learning is a highly efficient system for incorporating observations and updating plans to derive optimal solutions for limiting impacts from extreme events. Reinforcement learning also outperforms previous methods by avoiding losses induced by uncertain changes in future global carbon emissions.

“The analysis of New York City’s situation is by no means unique,” said Michael Oppenheimer, a study co-author and professor of geosciences and international affairs at Princeton. “The method can be applied widely, although its benefit compared to other systems of analysis would vary from place to place.”

The Rutgers Cape Shore Laboratory is protected by a berm at its location on the Delaware Bay in Cape May Court House, New Jersey.

Credit: Matthew Drews, Rutgers University


The paper, “Reinforcement learning-based adaptive strategies for climate change adaptation: An application for coastal flood risk management,” was published March 18 in the Proceedings of the National Academy of Sciences. Besides Kopp, Lin and Oppenheimer, co-authors include Kairui Feng, formerly of Princeton and now at Tongji University, Shanghai, China; and Siyuan Xian of Princeton. The authors are part of the Megalopolitan Coastal Transformation Hub, a Rutgers-led, National Science Foundation-funded consortium of research institutions working to advance the science of how coastal climate hazards, landforms and human decisions interact to shape climate risk and advance climate adaptation in the New York City-New Jersey-Philadelphia region.

 

Sea level rise after the last ice age: More knowledge




Deltares
Global sea level rise after the last ice age 


Global sea level will rise rapidly in the coming century, perhaps by more than 1 metre. This is primarily caused by increasing concentrations of greenhouse gases (source: IPCC). Submerged peat layers from the North Sea show that these rates were seen in two earlier phases: 10,300 and 8,300 years ago. Ice sheets melted then due to rapid warming after the last ice age.


Credit: Deltares





New geological data has given more insight into the rate and magnitude of global sea level rise following the last ice age, about 11,700 years ago. This information is of great importance to understand the impact global warming has had on the ice caps and on sea level rise. The findings have been published in the scientific journal Nature by researchers from Deltares, Utrecht University, TNO Netherlands Geological Service, Delft University of Technology, the Netherlands Institute for Sea Research (NIOZ), University of Leeds, University of Sheffield, University of Amsterdam, LIAG and BGR.

Better understanding of sea level rise

The new knowledge into the rate of sea level rise during the early Holocene offers an important point of reference for scientists and policymakers, especially as we are now faced with a similar situation with rapidly melting ice sheets due to global warming. The research provides valuable new insights for the future.

As a result of the current rise in greenhouse gas concentrations, climate models by the Intergovernmental Panel on Climate Change (IPCC) expect sea levels to rise by several metres by 2300. Some scenarios indicate a rise of more than one metre per century. An important difference with the early Holocene is that the consequences of sea level rise are far greater today and in the future. This is due to a growth in population and the current presence of infrastructure, cities and economic activity.

Unique dataset in the North Sea region

Global sea level rose quickly following the last ice age, a result of global warming and the melting of the enormous ice caps that covered North America and Europe. Until now, the rate and extent of sea level rise during the early Holocene were not known due to a lack of sound geological data from this period. Using a unique dataset for the North Sea region, the researchers have now been able to make highly accurate calculations for the first time. They analysed a range of boreholes from the area in the North Sea that was once Doggerland, a land bridge between Great Britain and mainland Europe. This area flooded as sea level rose.

By analysing the submerged peat layers from this area, dating them and applying modelling techniques, researchers showed that, during two phases in the early Holocene, rates of global sea level rise briefly peaked at more than a metre per century. By comparison, the current rate of sea level rise in the Netherlands is about 3 mm annually, the equivalent of 30 centimetres per century, and is expected to increase.

Furthermore, until now there has been considerable uncertainty about the total rise between 11,000 and 3,000 years ago, with estimates varying between 32 and 55 metres. The new study greatly reduces that uncertainty, showing that the total rise was around 38 metres.

Groundbreaking research

Marc Hijma, a geologist at Deltares and the lead author of the study: “With this groundbreaking research, we have taken an important step towards a better understanding of sea level rise after the last ice age. By drawing on detailed data for the North Sea region, we can now better unravel the complex interaction between ice sheets, climate, and sea level. This provides insights for both scientists and policymakers, so that we can prepare better for the impacts of current climate change, for example by focusing on climate adaptation.”

The paper, ‘Global sea-level rise in the early Holocene revealed from North Sea peats’, was published on Wednesday 19 March at 17.00 CET in the leading scientific journal Nature.


Doggerland

Credit: Olav Odé, Universiteit Utrecht, Deltares, RMO

Drilling from the North Sea. In the center you can clearly see how a layer of brown peat is covered by a layer of gray clay, marking the moment (8,700 years ago) when this part of the North Sea was flooded.

Marc Hijma on the ship that took the boreholes in the North Sea.

Credit: Deltares

 

South Africa and China establish record-breaking 12,900 km ultra-secure quantum satellite link



This milestone marks the first-ever quantum satellite communication link established in the Southern Hemisphere.




Stellenbosch University

Implementation of first quantum satellite link in the Southern Hemisphere 


Implementation of the first quantum satellite link in the Southern Hemisphere at Stellenbosch University, South Africa. The microsatellite Jinan-1, launched into low Earth orbit, exchanges secure keys with a portable ground station mounted on the roof of the engineering building at Stellenbosch University. The experiment was performed in October 2024.


Credit: Ignus Dreyer




Scientists from South Africa and China have successfully established the world’s longest intercontinental ultra-secure quantum satellite link, spanning 12,900 km. Established via the Chinese quantum microsatellite Jinan-1, launched into low Earth orbit, it is the first-ever quantum satellite communication link in the Southern Hemisphere.

In this demonstration, quantum keys were generated in real-time through Quantum Key Distribution (QKD), enabling the secure encryption of images transmitted between ground stations in China and South Africa via one-time pad encryption—considered unbreakable. The results of this pioneering experiment, from a collaborative research initiative between scientists at Stellenbosch University (South Africa) and the University of Science and Technology of China, were published in Nature today.

Stellenbosch’s ideal environmental conditions—clear skies and low humidity—allowed the local ground station to achieve an exceptional key generation rate of 1.07 million secure bits during a single satellite pass.

Quantum communication leverages fundamental principles of quantum mechanics, guaranteeing highly secure information transfer. Quantum Key Distribution, a critical component, employs single photons to encode and transmit secure keys. Because single photons cannot be intercepted, copied, or measured without altering their quantum states, this technology provides unparalleled security, even against powerful adversaries.
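Once QKD has delivered a shared secret key, the encryption step itself is the classical one-time pad mentioned above: each message byte is XORed with a key byte, and XORing the ciphertext with the same key recovers the message. A minimal sketch (here `os.urandom` stands in for the quantum-distributed key):

```python
import os

def otp_encrypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a key byte. The key
    must be at least as long as the message and never reused."""
    assert len(key) >= len(message)
    return bytes(m ^ k for m, k in zip(message, key))

# In the experiment the key would come from QKD; os.urandom stands in here.
key = os.urandom(32)
ciphertext = otp_encrypt(b"satellite image data", key)
# XOR is its own inverse: applying the same key again decrypts.
assert otp_encrypt(ciphertext, key) == b"satellite image data"
```

The security rests entirely on the key being truly random, secret, and used once — which is exactly what QKD is designed to guarantee.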

China is currently at the forefront of quantum communication technology, guided by renowned quantum physicist Prof Jian-Wei Pan. The country’s extensive quantum infrastructure includes a 2,000 km terrestrial fibre-based quantum network connecting 32 trusted nodes across major cities, from Beijing to Shanghai. Prof Juan Yin was instrumental in developing China’s first quantum satellite, Micius, which previously demonstrated groundbreaking satellite-based quantum links, including a notable 7,600 km intercontinental link between China and Austria in 2017. For this South Africa-China collaboration, Prof Juan Yin again led the Chinese research team.

The South African research team at Stellenbosch University’s Department of Physics was led by Dr Yaseera Ismail, the lead experimentalist responsible for successfully establishing the quantum satellite link. Prof Francesco Petruccione, Professor of Quantum Computing in the School of Data Science and Computational Thinking and Director of the National Institute for Theoretical and Computational Sciences (NITheCS) at Stellenbosch University, pioneered quantum communication in South Africa, notably developing one of the world’s first fibre-optic quantum communication networks in Durban. This landmark achievement also supports the forthcoming launch of the Stellenbosch Centre for Quantum Science and Technology, which aims to strengthen South Africa’s leading role in quantum research and innovation.

Reflecting on this achievement, Dr Yaseera Ismail emphasised the importance of collaboration: “International and national collaborations are essential to drive cutting-edge research and push scientific boundaries. Implementing the first quantum satellite link in the Southern Hemisphere is an outstanding achievement for South Africa, demonstrating the significant potential to develop a thriving quantum ecosystem.”

Prof Francesco Petruccione added: "This successful demonstration of quantum satellite technology firmly positions South Africa as a significant player in the rapidly evolving global quantum technology ecosystem. Collaborations such as this accelerate scientific breakthroughs, build local expertise, and enable translating advanced quantum research into tangible technological solutions.”

Prof Sibusiso Moyo, Deputy Vice-Chancellor for Research, Innovation and Postgraduate Studies at SU, said: "This breakthrough underscores the importance of supporting and investing in the basic sciences such as quantum computing. We are proud that our researchers are pushing the frontiers of science. This work is in line with SU’s Vision 2040, to be Africa’s leading research-intensive university recognised for its excellence and advancing knowledge in service of society. Congratulations to both teams.”


Dr Yaseera Ismail, a quantum physicist from the Department of Physics at Stellenbosch University, South Africa, leads the South African team in this project.

Prof Francesco Petruccione, Professor of Quantum Computing in the School of Data Science and Computational Thinking at Stellenbosch University and Director of the National Institute for Theoretical and Computational Sciences (NITheCS) in South Africa, pioneered quantum communication in South Africa.

Credit: Stefan Els



Long COVID could be costing U.S. $2.0-$6.5 billion annually



The cost of an individual long COVID-19 case can range from $5,084 to $11,646 per year



CUNY Graduate School of Public Health and Health Policy


New York, NY | March 19, 2025: The current number of long COVID-19 cases may already be costing the U.S. $2.01 billion to $6.56 billion per year, according to a study recently published in the Journal of Infectious Diseases. The study also showed that each long COVID case tended to cost society between $5,084 and $11,646 per year. These estimates are based on a computer simulation model developed by the Public Health Informatics, Computational and Operations Research (PHICOR) team based at the CUNY Graduate School of Public Health and Health Policy (CUNY SPH) working with researchers from the CUNY Institute for Implementation Science in Population Health (ISPH) and Baylor College of Medicine.

“Our results quantify the already significant burden of long COVID on society,” says the study’s senior author, CUNY SPH Professor Bruce Y. Lee, MD, MBA, executive director of PHICOR and the CUNY SPH Center for Advanced Technology and Communication in Health (CATCH). “This includes productivity losses that are hitting businesses around the country and healthcare costs that are further straining our healthcare system. These costs could end up trickling down to everyone who pays insurance premiums and taxes.”

The computational model simulates a person with a specified age getting infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), having probabilities of developing long COVID, and if long COVID does result, having probabilities of suffering different long COVID symptoms over time. Each symptom can lead to the person being less productive at school or work and needing different tests, treatments, and visits to clinics, emergency rooms, and hospitals. The model can then track and tabulate the accompanying costs such as productivity losses and healthcare costs.
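A heavily simplified Monte Carlo sketch of this kind of simulation (the probability and the flat per-case cost range below are illustrative stand-ins from the article's headline figures; the actual model tracks individual symptoms, care visits, and productivity over time):

```python
import random

def simulate_case(p_long_covid=0.06, cost_low=5084, cost_high=11646):
    """One simulated infection: does long COVID develop, and if so,
    what annual societal cost does it carry? Parameters are illustrative
    stand-ins, not the paper's internal model values."""
    if random.random() < p_long_covid:
        return random.uniform(cost_low, cost_high)
    return 0.0

def expected_annual_cost(n_infections, trials=100_000, **kw):
    """Monte Carlo estimate of total annual societal cost across a
    population of infections."""
    mean_cost = sum(simulate_case(**kw) for _ in range(trials)) / trials
    return mean_cost * n_infections
```

Raising `p_long_covid` in this sketch raises the expected total roughly proportionally, which mirrors the sensitivity analysis the study reports when the long COVID probability is increased from 6% to 10%.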

Running the model showed that a given long COVID case tended to cost society somewhere in the range of $5,084 to $11,646 per year. The vast majority of these costs (around 95%) were productivity losses, with around a quarter of those losses coming from absenteeism and the rest from presenteeism, wherein employees come to work but are less productive. Running all the COVID-19 cases to date through the model generated estimates that there are currently 44.69 to 48.04 million long COVID cases in the U.S., which would cost the country $2.01 to $6.56 billion each year. The vast majority of these costs (98.6%) were productivity losses from missed or less productive days of work or school, while direct medical costs (doctor’s appointments, medical care, etc.) comprised 1.04% of total costs. This was based on the assumption that 6% of those with COVID-19 will go on to have long COVID. Studies have suggested that anywhere from 6% to 20% of those infected with SARS-CoV-2 eventually develop long COVID. Increasing this probability from 6% to 10% increases the average total societal cost to $3.34 billion per year.

“We are only now fully understanding the burden of human disease and illness resulting from long COVID,” says Peter J. Hotez, MD, PhD, professor and dean of the National School of Tropical Medicine at Baylor College of Medicine and co-author of the study. “As bad as COVID has been in terms of deaths and hospitalizations, it could be eventually matched or even exceeded by the chronic disability from this constellation of sequelae and conditions.”

Declaration of Conflict of Interest: No financial disclosures have been reported by the authors of this paper.

Sarah M Bartsch, Kevin L Chin, Ulrich Strych, Danielle C John, Tej D Shah, Maria Elena Bottazzi, Kelly J O’Shea, McKaylee Robertson, Colleen Weatherwax, Jessie Heneghan, Marie F Martinez, Allan Ciciriello, Sarah Kulkarni, Kavya Velmurugan, Alexis Dibbs, Sheryl A Scannell, Yanhan Shen, Denis Nash, Peter J Hotez, Bruce Y Lee, The Current and Future Burden of Long COVID in the United States, The Journal of Infectious Diseases, 2025, jiaf030.

Media contact:
Alexis Dibbs
dibbs.alexis@gmail.com

About CUNY SPH
The CUNY Graduate School of Public Health and Health Policy (CUNY SPH) is committed to promoting and sustaining healthier populations in New York City and around the world through excellence in education, research, and service in public health and by advocating for sound policy and practice to advance social justice and improve health outcomes for all.

About PHICOR
Since 2007, PHICOR, Public Health Informatics, Computational, and Operations Research (www.PHICOR.org) has been developing computational methods, models, and tools to help decision makers better understand and address complex systems in health and public health. Follow @PHICORTeam on Facebook, LinkedIn, YouTube, and Twitter for updates.

About the CUNY ISPH
The CUNY Institute for Implementation Science in Population Health (ISPH) was founded on the notion that substantial improvements in population health can be efficiently achieved through better implementation of existing strategies, policies, and interventions across multiple sectors. With that in mind, we study how to translate and scale up evidence-based interventions and policies within clinical and community settings in order to improve population health and reduce health disparities. cunyisph.org

About Baylor College of Medicine
Baylor College of Medicine (www.bcm.edu) in Houston is recognized as a health sciences university and is known for excellence in education, research and patient care. Baylor is a top-ranked medical school and is listed 20th among all U.S. medical schools for National Institutes of Health funding and No. 1 in Texas. Located in the Texas Medical Center, Baylor has affiliations with seven teaching hospitals and jointly owns and operates Baylor St. Luke’s Medical Center, part of St. Luke’s Health. Currently, Baylor has more than 3,000 trainees in medical, graduate, nurse anesthesia, physician assistant, orthotics and genetic counseling as well as residents and postdoctoral fellows.