Saturday, December 21, 2024

 

ESA and NASA satellites deliver first joint picture of Greenland Ice Sheet melting



Academics from Northumbria University are part of an international research team which has used data from satellites to track changes in the thickness of the Greenland Ice Sheet.



Peer-Reviewed Publication

Northumbria University

Video: Greenland Ice Sheet Elevation Change from CryoSat and ICESat-2. Animation showing where the Greenland Ice Sheet is thinning using data from two satellites. Credit: Centre for Polar Observation and Modelling, Northumbria University






Global warming is causing the Ice Sheet to melt and flow more rapidly, raising sea levels and disturbing weather patterns across our planet.

Because of this, precise measurements of its changing shape are of critical importance for tracking and adapting to the effects of climate warming.

Scientists have now delivered the first measurements of Greenland Ice Sheet thickness change using CryoSat-2 and ICESat-2 – the ESA and NASA ice satellite missions.

Both satellites carry altimeters as their primary sensor, but they make use of different technologies to collect their measurements.

CryoSat-2 carries a radar system to determine the Earth’s surface height, while ICESat-2 has a laser system for the same task.

Although radar signals can pass through clouds, they also penetrate into the ice sheet surface and have to be adjusted for this effect.

Laser signals, on the other hand, reflect from the actual surface, but they cannot operate when clouds are present.

The missions are therefore highly complementary, and combining their measurements has been a holy grail for polar science.

A new study from scientists at the UK Centre for Polar Observation and Modelling (CPOM), based at Northumbria University, and published in Geophysical Research Letters shows that CryoSat-2 and ICESat-2 measurements of Greenland Ice Sheet elevation change agree to within 3%.

This confirms that the satellites can be combined to produce a more reliable estimate of ice loss than either could achieve alone. It also means that if one mission were to fail, the other could be relied upon to maintain our record of polar ice change.
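To make the reported figure concrete, the short sketch below (plain Python with made-up numbers, not values from the study) shows how a percentage agreement between two independent elevation-change estimates can be computed: the difference between a radar-based and a laser-based estimate is expressed relative to their mean.

```python
# Illustrative only: hypothetical ice-sheet-average elevation-change rates
# (metres per year) from a radar altimeter and a laser altimeter.
radar_rate = -0.092   # hypothetical CryoSat-2-style estimate
laser_rate = -0.095   # hypothetical ICESat-2-style estimate

# Express the difference between the two estimates relative to their mean.
mean_rate = (radar_rate + laser_rate) / 2
percent_difference = abs(radar_rate - laser_rate) / abs(mean_rate) * 100

print(f"Difference between estimates: {percent_difference:.1f}%")
# A value of about 3% or less would match the level of agreement reported
# for CryoSat-2 and ICESat-2 over the Greenland Ice Sheet.
```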

Between 2010 and 2023, the Greenland Ice Sheet thinned by 1.2 metres on average. However, thinning across the ice sheet’s margin (the ablation zone) was over five times larger, amounting to 6.4 metres on average.

The most extreme thinning occurred at the ice sheet’s outlet glaciers, many of which are speeding up.

At Sermeq Kujalleq in west central Greenland (also known as Jakobshavn Isbræ), peak thinning was 67 metres, and at Zachariae Isstrøm in the northeast peak thinning was 75 metres.

Altogether, the ice sheet shrank by 2,347 cubic kilometres across the 13-year survey period – enough to fill Africa’s Lake Victoria.

The biggest changes occurred in 2012 and 2019 when summer temperatures were extremely hot and the ice sheet lost more than 400 cubic kilometres of its volume each year.

Greenland’s ice melting also affects global ocean circulation and disturbs weather patterns. These changes have far-reaching impacts on ecosystems and communities worldwide.

The availability of accurate, up-to-date data on ice sheet changes will be critical in helping us to prepare for and adapt to the impacts of climate change.

Lead author and CPOM researcher Nitin Ravinder said: “We are very excited to have discovered that CryoSat-2 and ICESat-2 are in such close agreement.

“Their complementary nature provides a strong motivation to combine the data sets to produce improved estimates of ice sheet volume and mass changes.

“As ice sheet mass loss is a key contributor to global sea level rise, this is incredibly useful for the scientific community and policymakers.”

The study made use of four years of measurements from both missions, including those collected during the Cryo2ice campaign, a pioneering ESA-NASA partnership initiated in 2020.

By adjusting CryoSat-2’s orbit to synchronise with ICESat-2, ESA enabled the near-simultaneous collection of radar and laser data over the same regions.

This alignment allows scientists to measure snow depth from space, offering unprecedented accuracy in tracking sea and land ice thickness.

Tommaso Parrinello, CryoSat Mission Manager at ESA, expressed optimism about the campaign’s impact:

“CryoSat has provided an invaluable platform for understanding our planet’s ice coverage over the past 14 years, but by aligning our data with ICESat-2, we’ve opened new avenues for precision and insight.

“This collaboration represents an exciting step forward, not just in terms of technology but in how we can better serve scientists and policymakers who rely on our data to understand and mitigate climate impacts.”

Thorsten Markus, project scientist for the ICESat-2 mission at NASA, said: “It is great to see that the data from ‘sister missions’ are providing a consistent picture of the changes going on in Greenland.

“Understanding the similarities and differences between radar and lidar ice sheet height measurements allows us to fully exploit the complementary nature of those satellite missions.

“Studies like this are critical to put a comprehensive time series of the ICESat, CryoSat-2, ICESat-2, and, in the future, CRISTAL missions together.”

ESA’s CryoSat-2 continues to be instrumental in our understanding of climate-related changes in polar ice, working alongside NASA’s ICESat-2 to provide robust, accurate data on ice sheet changes.

Together, these missions represent a significant step forward in monitoring polar ice loss and preparing for its global consequences.

CPOM is a partnership of six universities and the British Antarctic Survey (BAS), based at Northumbria University, primarily funded by the Natural Environment Research Council (NERC) to provide national capability in observation and modelling of the processes that occur in the Polar regions of the Earth.

CPOM uses satellite observations to monitor change in the Polar regions and numerical models to better predict how their ice and oceans might evolve in the future.

By providing long-term capabilities to the scientific community and leading international assessments, CPOM helps global policymakers plan for the effects of climate change and sea level rise.

  

Image: The Greenland Ice Sheet. Credit: Prof Andrew Shepherd

 

AI-driven approach reveals hidden hazards of chemical mixtures in rivers




University of Birmingham





Artificial intelligence can provide critical insights into how complex mixtures of chemicals in rivers affect aquatic life – paving the way for better environmental protection. 

A new approach, developed by researchers at the University of Birmingham, demonstrates how advanced artificial intelligence (AI) methods can help identify potentially harmful chemical substances in rivers by monitoring their effects on tiny water fleas (Daphnia).  

The team worked with scientists at the Research Centre for Eco-Environmental Sciences (RCEES), in China, and the Helmholtz Centre for Environmental Research (UFZ), in Germany, to analyse water samples from the Chaobai River system near Beijing. This river system receives chemical pollutants from a number of different sources, including agricultural, domestic and industrial. 

Professor John Colbourne is the director of the University of Birmingham’s Centre for Environmental Research and Justice and one of the senior authors on the paper. He expressed optimism that, by building upon these early findings, such technology can one day be deployed to routinely monitor water for toxic substances that would otherwise be undetected.  

He said: “There is a vast array of chemicals in the environment. Water safety cannot be assessed one substance at a time. Now we have the means to monitor the totality of chemicals in sampled water from the environment to uncover what unknown substances act together to produce toxicity to animals, including humans.”   

The results, published in Environmental Science and Technology, reveal that certain mixtures of chemicals can work together to affect important biological processes in aquatic organisms, as measured through the activity of their genes. The combinations of these chemicals create environmental hazards that are potentially greater than those posed by the chemicals individually. 

The research team used water fleas (Daphnia) as test organisms in the study because these tiny crustaceans are highly sensitive to water quality changes and share many genes with other species, making them excellent indicators of potential environmental hazards. 

"Our innovative approach leverages Daphnia as the sentinel species to uncover potential toxic substances in the environment," explains Dr Xiaojing Li, of the University of Birmingham (UoB) and the lead author of this study. "By using AI methods, we can identify which subsets of chemicals might be particularly harmful to aquatic life, even at low concentrations that wouldn't normally raise concerns." 

Dr Jiarui Zhou, also at the University of Birmingham and co-first author of the paper, who led the development of the AI algorithms, said: “Our approach demonstrates how advanced computational methods can help solve pressing environmental challenges. By analysing vast amounts of biological and chemical data simultaneously, we can better understand and predict environmental risks." 
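As a purely illustrative sketch of this kind of analysis (not the algorithms or data used in the study), the following Python example trains a random-forest model, used here only as a stand-in technique, on synthetic chemical-concentration and gene-response data and ranks the chemicals by how strongly they are associated with the biological response; all variable names and values are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical data: 200 water samples, 30 measured chemicals (columns),
# and one aggregate Daphnia gene-expression response per sample.
n_samples, n_chemicals = 200, 30
concentrations = rng.lognormal(mean=0.0, sigma=1.0, size=(n_samples, n_chemicals))

# Synthetic "ground truth": only a few chemicals drive the response,
# including an interaction between two of them (a simple mixture effect).
response = (0.8 * concentrations[:, 3]
            + 0.5 * concentrations[:, 7]
            + 0.6 * concentrations[:, 3] * concentrations[:, 12]
            + rng.normal(scale=0.5, size=n_samples))

# Fit the model and rank chemicals by how much they explain the response.
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(concentrations, response)

ranking = np.argsort(model.feature_importances_)[::-1]
print("Chemicals most associated with the response:", ranking[:5])
```

In practice, approaches of this kind are combined with careful experimental design and validation; the sketch only shows the basic idea of linking measured chemical mixtures to a biological readout.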

Professor Luisa Orsini, another senior author of the study, added: “The study's key innovation lies in our data-driven, unbiased approach to uncovering how environmentally relevant concentrations of chemical mixtures can cause harm. This challenges conventional ecotoxicology and paves the way to regulatory adoption of the sentinel species Daphnia, alongside new approach methodologies.” 

Dr Timothy Williams of the University of Birmingham and co-author of the paper also noted that: “Typically, aquatic toxicology studies either use a high concentration of an individual chemical to determine detailed biological responses or only determine apical effects like mortality and altered reproduction after exposure to an environmental sample. However, this study breaks new ground by allowing us to identify key classes of chemicals that affect living organisms within a genuine environmental mixture at relatively low concentration while simultaneously characterising the biomolecular changes elicited.” 

The findings could help improve environmental protection by: 

    •    Identifying previously unknown chemical combinations that pose risks to aquatic life 

    •    Enabling more comprehensive environmental monitoring 

    •    Supporting better-informed regulations for chemical discharge into waterways 

This research was funded by the Royal Society International Collaboration Award, the European Union's Horizon 2020 research and innovation programme, and the Natural Environment Research Council Innovation People programme. 

ENDS

 

Modern AI systems have achieved Turing's vision, but not exactly how he hoped



Intelligent Computing
Image: Comparison of Turing’s original test with modern Turing-like AI evaluation. On the left, Turing’s original test involves a human interrogator (C) trying to identify a machine (A) that imitates a human assistant (B). On the right, the modern Turing-like test replaces the human interrogator with a machine (C) that rigorously evaluates the abilities of another AI system (A), supported by a knowledge graph (B). In both scenarios, the gray-colored players challenge the white-colored machine. Credit: Bernardo Gonçalves.




A recent perspective published Nov. 13 in Intelligent Computing, a Science Partner Journal, asserts that today's artificial intelligence systems have finally realized Alan Turing's vision from over 70 years ago: machines that can genuinely learn from experience and engage in human-like conversation. Authored by Bernardo Gonçalves from the University of São Paulo and University of Cambridge, the paper also sheds light on how current energy-hungry transformer-based systems contrast with Turing's prophecy of machines that would develop intelligence naturally, like human children.

Gonçalves' paper points out that transformers, the foundation of modern generative AI systems, have provided what Turing considered "adequate proof" of machine intelligence. These systems, based on "attention mechanisms" and vast-scale learning, can now perform tasks once exclusive to human intellect, such as generating coherent text, solving complex problems, and even discussing abstract ideas.

"Without resorting to preprogramming or special tricks, their intelligence grows as they learn from experience, and to ordinary people, they can appear human-like in conversation," writes Gonçalves. "This means that they can pass the Turing test and that we are now living in one of many possible Turing futures where machines can pass for what they are not."

This achievement traces back to Turing's 1950 concept of the "imitation game," in which a machine would attempt to mimic a human in a remote conversation, deceiving a non-expert judge. The test became a cornerstone of artificial intelligence research, with early AI pioneers John McCarthy and Claude Shannon considering it the "Turing definition of thinking" and Turing’s "strong criterion." Popular culture, too, undeniably reflects Turing’s influence: the HAL-9000 computer in the Stanley Kubrick film 2001: A Space Odyssey famously passed the Turing test with ease.

However, the paper underscores that Turing’s ultimate goal was not simply to create machines that could trick humans into thinking they were intelligent. Instead, he envisioned "child machines" modeled on the natural development of the human brain—systems that would grow and learn over time, ultimately becoming powerful enough to have a meaningful impact on society and the natural world.

The paper highlights concerns about current AI development. While Turing advocated for energy-efficient systems inspired by the natural development of the human brain, today's AI systems consume massive amounts of computing power, raising sustainability concerns. Additionally, the paper draws attention to Turing's ahead-of-his-time societal warnings. He cautioned that automation should affect all levels of society equally, not just displace lower-wage workers while benefiting only a small group of technology owners—an issue that resonates strongly with current debates about AI's impact on employment and social inequality.

Looking ahead, the paper calls for Turing-like AI testing that would introduce machine adversaries and statistical protocols to address emerging challenges such as data contamination and poisoning. These more rigorous evaluation methods will ensure AI systems are tested in ways that reflect real-world complexities, aligning with Turing’s vision of sustainable and ethically guided machine intelligence.

 

Construction materials and household items are a part of a long-term carbon sink called the “technosphere”




Cell Press




We know a lot about how much fossil-derived carbon is emitted to the atmosphere but less about how much is stored in human-made products. In a study publishing December 20 in the Cell Press journal Cell Reports Sustainability, ecological economists estimate that each year, humans add around 400 million tons of fossil carbon to long-lasting products such as plastics, buildings, and human infrastructure. Although these products could be considered a “carbon sink,” proper waste management is essential to prevent them from becoming environmental hazards.

“We have accumulated more carbon in human-made stuff on the planet than there is carbon in the natural world, but we completely overlook it, and those stocks get bigger and bigger,” says ecological economist and senior author Klaus Hubacek (@KlausHubacek) of the University of Groningen.  “The message is to look at stocks rather than just flows.”

Little is known about the stocks of fossil carbon in the “technosphere”—the sum of all human-made artifacts, both in use and discarded. To estimate these stocks, and how they change from year to year, the researchers used publicly available data on the material inputs and outputs from different economic sectors globally for 2011 (the only year for which such material data exist at the global level).

Then, they calculated the amount of carbon flowing in and out of different sectors by using the average carbon content of different products—for example, plastics are estimated to contain 74% fossil carbon on average. The analysis considered not only final products, such as durable plastics and bitumen, but also fossil carbon-based feedstocks that are used as intermediate products in different industries.
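The accounting step described above can be illustrated with a short, hypothetical calculation. Only the 74% average fossil-carbon content of plastics is taken from the article; the flow values and the other carbon fractions below are invented for illustration.

```python
# Illustrative fossil-carbon accounting: annual flow of material into
# long-lasting products (million tonnes, hypothetical values) multiplied
# by an assumed average fossil-carbon content per product group.
flows_mt = {
    "plastics_and_rubber": 350.0,     # hypothetical flow
    "bitumen": 110.0,                 # hypothetical flow
    "machinery_and_equipment": 90.0,  # hypothetical flow
}

carbon_fraction = {
    "plastics_and_rubber": 0.74,      # average carbon content cited in the article
    "bitumen": 0.85,                  # assumed for illustration
    "machinery_and_equipment": 0.20,  # assumed for illustration
}

carbon_added_mt = {k: flows_mt[k] * carbon_fraction[k] for k in flows_mt}

for product, carbon in carbon_added_mt.items():
    print(f"{product}: {carbon:.0f} Mt fossil carbon")
print(f"Total accumulated: {sum(carbon_added_mt.values()):.0f} Mt fossil carbon per year")
```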

They found that in 2011, 9% of extracted fossil carbon was accumulated in long-lasting products within the technosphere—if this same amount of carbon was emitted as CO2, it would almost equal EU emissions that year (3.7 Gt vs. 3.8 Gt of emitted CO2). Construction of buildings and infrastructure accounted for the highest accumulation of fossil carbon (34%). In terms of the type of products, rubber and plastic products accounted for 30% of the accumulated fossil carbon, followed by bitumen (24%; a product used in roads and roofing), and machinery and equipment (16%).

Next, the team extrapolated their 2011 findings to estimate how much fossil carbon flowed into the technosphere between 1995 and 2019, using monetary data from that time span. Overall, they estimated that 8.4 billion tons of fossil carbon were added to the technosphere between 1995 and 2019, equivalent to around 93% of global CO2 emissions in 2019. The amount of carbon entering the technosphere increased every year from 1995 to 2019.

Many of these fossil carbon-based products end up in landfills or as litter and take decades to centuries to degrade. Based on the average lifetime of buildings, infrastructure, and other products, the researchers estimate that 3.7 billion tons of fossil carbon were disposed of during that period: 1.2 billion tons were brought to landfills, 1.2 billion tons were incinerated, 1.1 billion tons were recycled, and the remainder ended up as litter.

“On the one hand, you can consider it as a form of carbon sequestration if this fossil carbon ends up sequestered in landfill, but on the other hand, it poses an environmental hazard, and if you burn it, you increase carbon emissions,” says coauthor and ecological economist Franco Ruzzenenti of the University of Groningen.

Increasing product lifetime and recycling rates are two ways to reduce the amount of fossil carbon entering waste streams, the researchers say. They also emphasize the importance of enacting policies to minimize the discharge of waste from landfills.

Looking ahead, the team plans to conduct a similar analysis of biogenic carbon (i.e., carbon derived from plant materials).

“For the next step, we plan to investigate the long-term potential of biogenic carbon sequestration in durables,” says first author Kaan Hidiroglu of the University of Groningen. “This will allow us to assess whether diversifying carbon sequestration strategies, such as relying on biogenic carbon in durables such as wood materials for construction, could be a viable option.”

###

Cell Reports Sustainability, Hidiroglu et al., “The extent and fate of fossil carbon accumulation in our technosphere” https://cell.com/cell-reports-sustainability/fulltext/S2949-7906(24)00426-9

Cell Reports Sustainability (@CellRepSustain), published by Cell Press, is a monthly gold open access journal that publishes high-quality research and discussion that contribute to understanding and responding to environmental, social-ecological, technological, and energy- and health-related challenges. Visit https://www.cell.com/cell-reports-sustainability/home. To receive Cell Press media alerts, contact press@cell.com.

 

First demonstration of quantum teleportation over busy Internet cables



Advance opens door for lightning-fast quantum applications without specialized infrastructure



Northwestern University




Northwestern University engineers are the first to successfully demonstrate quantum teleportation over a fiberoptic cable already carrying Internet traffic.

The discovery introduces the new possibility of combining quantum communication with existing Internet cables — greatly simplifying the infrastructure required for distributed quantum sensing or computing applications.

The study will be published on Friday (Dec. 20) in the journal Optica.

“This is incredibly exciting because nobody thought it was possible,” said Northwestern’s Prem Kumar, who led the study. “Our work shows a path towards next-generation quantum and classical networks sharing a unified fiberoptic infrastructure. Basically, it opens the door to pushing quantum communications to the next level.”

An expert in quantum communication, Kumar is a professor of electrical and computer engineering at Northwestern’s McCormick School of Engineering, where he directs the Center for Photonic Communication and Computing.

Only limited by the speed of light, quantum teleportation could make communications nearly instantaneous. The process works by harnessing quantum entanglement, a technique in which two particles are linked, regardless of the distance between them. Instead of particles physically traveling to deliver information, entangled particles exchange information over great distances — without physically carrying it.
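For readers curious about the logic of the protocol itself, the following is a minimal sketch of textbook single-qubit teleportation using ideal state vectors in Python; it is not the photonic, fibre-based implementation used in the study, and all of the code is illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random single-qubit state |psi> = a|0> + b|1> to be teleported (qubit 0).
a, b = rng.normal(size=2) + 1j * rng.normal(size=2)
psi = np.array([a, b])
psi /= np.linalg.norm(psi)

# Shared Bell pair (|00> + |11>)/sqrt(2) on qubits 1 (sender) and 2 (receiver).
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = np.kron(psi, bell)  # 3-qubit state; qubit 0 is the most significant bit

# CNOT with control qubit 0 and target qubit 1, built as a permutation matrix.
cnot = np.zeros((8, 8))
for i in range(8):
    b0, b1, b2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    cnot[(b0 << 2) | ((b1 ^ b0) << 1) | b2, i] = 1

# Hadamard on qubit 0.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = np.kron(H, np.eye(4)) @ (cnot @ state)

# Measure qubits 0 and 1; each outcome (m0, m1) keeps two amplitudes for qubit 2.
probs = {}
for m0 in (0, 1):
    for m1 in (0, 1):
        base = (m0 << 2) | (m1 << 1)
        probs[(m0, m1)] = float(np.sum(np.abs(state[base:base + 2]) ** 2))

outcomes = list(probs)
m0, m1 = outcomes[rng.choice(4, p=[probs[o] for o in outcomes])]
base = (m0 << 2) | (m1 << 1)
bob = state[base:base + 2].copy()
bob /= np.linalg.norm(bob)

# Classical corrections on the receiver's qubit: X if m1 == 1, then Z if m0 == 1.
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

# The receiver's qubit now matches the original state (fidelity 1 in this ideal case).
print(f"Outcome ({m0}, {m1}), fidelity = {abs(np.vdot(psi, bob)) ** 2:.6f}")
```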

“In optical communications, all signals are converted to light,” Kumar explained. “While conventional signals for classical communications typically comprise millions of particles of light, quantum information uses single photons.”

Before Kumar’s new study, conventional wisdom suggested that individual photons would drown in cables filled with the millions of light particles carrying classical communications. It would be like a flimsy bicycle trying to navigate through a crowded tunnel of speeding heavy-duty trucks.

Kumar and his team, however, found a way to help the delicate photons steer clear of the busy traffic. After conducting in-depth studies of how light scatters within fiberoptic cables, the researchers found a less crowded wavelength of light to place their photons. Then, they added special filters to reduce noise from regular Internet traffic.

“We carefully studied how light is scattered and placed our photons at a judicious point where that scattering mechanism is minimized,” Kumar said. “We found we could perform quantum communication without interference from the classical channels that are simultaneously present.”

To test the new method, Kumar and his team set up a 30-kilometer-long fiberoptic cable with a photon at either end. Then, they simultaneously sent quantum information and regular Internet traffic through it. Finally, they measured the quality of the quantum information at the receiving end while executing the teleportation protocol by making quantum measurements at the mid-point. The researchers found the quantum information was successfully transmitted, even with busy Internet traffic whizzing by.

Next, Kumar plans to extend the experiments over longer distances. He also plans to use two pairs of entangled photons — rather than one pair — to demonstrate entanglement swapping, another important milestone leading to distributed quantum applications. Finally, his team is exploring the possibility of carrying out experiments over real-world inground optical cables rather than on spools in the lab. But, even with more work to do, Kumar is optimistic.

“Quantum teleportation has the ability to provide quantum connectivity securely between geographically distant nodes,” Kumar said. “But many people have long assumed that nobody would build specialized infrastructure to send particles of light. If we choose the wavelengths properly, we won’t have to build new infrastructure. Classical communications and quantum communications can coexist.”

The study, “Quantum teleportation coexisting with classical communications in optical fiber,” was supported by the U.S. Department of Energy (grant number DE-AC02-07CH11359).

 

Pollinators most vulnerable to rising global temperatures are flies, study shows



Penn State
Image: A blue fly pollinating common milkweed (Asclepias syriaca). Flies play a crucial role as pollinators, second only to bees in terms of the volume of crops and habitat they pollinate. Credit: Martha B. Moss/Penn State Extension Master Gardener / Penn State



UNIVERSITY PARK, Pa. — Despite their reputation as buzzing nuisances, flies serve a critical role as some of the Earth’s most prolific pollinators — and new research led by Penn State scientists suggests they are increasingly at risk due to rising global temperatures.

In a study recently published in the Journal of Melittology, an international team of researchers looked at the heat tolerance for a variety of species of bees and flies in tropical and subtropical regions of the Americas. Their findings suggest that rising temperatures pose a greater threat to flies than bees, as bees can tolerate much higher temperatures than flies and have a wider habitat range.

“Bees and flies are essential for pollinating plants, both in the wild and in agriculture,” said Margarita López-Uribe, the Lorenzo Langstroth Early Career Associate Professor of Entomology at Penn State, extension specialist of pollinator health and lead author on the study. “However, these vital insects are declining due to habitat loss, pesticides, disease and the growing threat of climate change.”

Flies play a crucial role as pollinators, second only to bees in terms of the volume of crops and habitat they pollinate, López-Uribe explained. Flies are especially important for overall health and diversity of wild ecosystems, as they facilitate reproduction for countless plant species, which in turn provide food and habitat for other organisms. Flies are also increasingly contributing to agriculture. For example, flies are the primary pollinator for cocoa trees that produce the fruits used to make chocolate.

A 2020 analysis of global crops found that the 105 most widely planted crops that benefit from pollinators have a gross economic value of more than $800 billion and include many of the most popular and nutritious fruit, vegetable and nut commodities consumed worldwide. The study also found that flies, specifically hoverflies and blowflies, consistently came right behind bees as a top pollinator.

“It’s time we gave flies some more recognition for their role as pollinators,” López-Uribe said. “Flies have a significant role, but they don't get as much attention — and they are vulnerable in all the same ways that bees are.” 

Insects are particularly susceptible to rising temperatures, as they have limited ability to regulate their own body temperatures, López-Uribe explained. To understand how different pollinator species might cope with rising global temperatures, the researchers studied the bees and flies’ "critical thermal maximum," or CTMax — the maximum temperature they can withstand before losing the ability to move.

The team found that bees can tolerate much higher temperatures than flies. On average, the CTMax for bees was 2.3 degrees Celsius higher than for flies. They also found that time of day affected the heat tolerance of bees. Bees foraging in the cooler morning hours had a higher CTMax than those active in the warmer afternoons. The study also revealed that geography plays a role in heat tolerance.
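The basic comparison behind those numbers can be sketched in a few lines of Python; the CTMax values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Hypothetical CTMax measurements (degrees Celsius) for two pollinator groups.
bee_ctmax = np.array([45.1, 44.3, 46.0, 43.8, 45.6])  # invented values
fly_ctmax = np.array([42.5, 41.9, 43.2, 43.0, 42.8])  # invented values

difference = bee_ctmax.mean() - fly_ctmax.mean()
print(f"Mean bee CTMax: {bee_ctmax.mean():.1f} °C")
print(f"Mean fly CTMax: {fly_ctmax.mean():.1f} °C")
print(f"Difference:     {difference:.1f} °C")
# The study reports an average difference of about 2.3 °C in favour of bees.
```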

The team collected data throughout lockdowns during the COVID-19 pandemic, meaning international students on the project, from Penn State and other universities, conducted research in their home countries. López-Uribe explained that the challenge wound up being an asset, because students were able to collect data on bee and fly species throughout the Americas.

“We sent out all of the equipment to do the study to students throughout the U.S. and South America,” López-Uribe said. “These students were collecting the data in their houses, using their kitchens to understand the thermal ecology these insects could withstand. We effectively were able to provide an international research experience without being able to travel internationally.”

The research team found that flies and bees from high-elevation tropical areas like Cajicá, Colombia, had lower CTMax values than their counterparts in subtropical regions like California and Texas. This suggests that insects in cooler, high-altitude environments may be more vulnerable to even small temperature increases.

“In alpine and subarctic environments, flies are the primary pollinator,” López-Uribe said. “This study shows us that we have entire regions that could lose their primary pollinator as the climate warms, which could be catastrophic for those ecosystems.”

Other Penn State authors on the paper are Ruben Martín-Rojas, graduate student in the department of entomology; José Fuentes, professor of meteorology; Luis Duque, assistant research professor in storage root physiology. Other authors on the paper are Maren Appert of San Diego State University, Alonso Delgado of the University of Texas at El Paso, Abigail Jimenez of California State University, Victor Ramos of Pontificia Universidad Católica del Perú, Andrés F. Herrera-Motta, Diego Riaño-Jimenez and José R. Cure of Universidad Militar Nueva Granada, Bogotá, Colombia, and Victor Gonzalez of the University of Kansas.

The research was supported by a grant from the U.S. National Science Foundation, which supported an International Research Experience for Students program.

 

Inequality weakens local governance and public satisfaction, study finds



By April Toler


University of Notre Dame

Image: Krister Andersson, professor of sustainable development at the Keough School of Global Affairs, University of Notre Dame. Credit: University of Notre Dame




Local governments in developing countries are crucial for providing public services that promote human development and address challenges like extreme weather, unemployment and crumbling infrastructure. Yet, they often face difficulties in implementing cost-effective programs that meet citizens’ diverse needs, particularly in areas with significant socioeconomic inequalities.

A recent study, published in World Development and led by University of Notre Dame researcher Krister Andersson, explored the impact of economic and social inequalities on local government performance in Chile (a country with very high socioeconomic inequalities). Specifically, the paper assessed the effectiveness of external policies designed to alleviate the negative effects of inequality on the quality of local public services.

The study found that socioeconomic inequalities pose significant challenges for local governance, often trapping local governments in a cycle of limited resources, rising inequality and declining capacity to meet citizens’ needs.

“Interventions to help local governments to deal with inequality seem to be most effective when they recognize a leadership role and some autonomy of local leaders,” said Andersson, a professor of sustainable development at Notre Dame’s Keough School of Global Affairs.

Using a dataset spanning 56 local government territories in Chile from 2000 to 2014, the study analyzed citizen satisfaction with local government performance. Multilevel modeling was used to assess how different policy approaches — top-down, sector-based support and bottom-up, demand-driven funding — influence satisfaction levels.
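As a rough sketch of what a multilevel analysis of this kind can look like in code (synthetic data and an illustrative model specification, not the study's actual variables, programmes or dataset), the example below fits a random-intercept model in which an interaction term tests whether a demand-driven funding programme changes the relationship between inequality and citizen satisfaction.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic panel: 56 territories observed over 15 years (all values are invented).
n_territories, n_years = 56, 15
territory = np.repeat(np.arange(n_territories), n_years)
inequality = rng.normal(0.5, 0.1, size=n_territories)[territory]  # Gini-like index
bottom_up = rng.binomial(1, 0.5, size=n_territories)[territory]   # 1 = demand-driven funding

# Satisfaction falls as inequality rises; a bottom-up programme softens the effect.
satisfaction = (3.5
                - 2.0 * inequality
                + 1.0 * bottom_up * inequality
                + rng.normal(0, 0.3, size=territory.size))

df = pd.DataFrame({"territory": territory, "inequality": inequality,
                   "bottom_up": bottom_up, "satisfaction": satisfaction})

# Random intercept per territory; the interaction term asks whether the programme
# changes the inequality-satisfaction relationship.
model = smf.mixedlm("satisfaction ~ inequality * bottom_up", data=df,
                    groups=df["territory"])
print(model.fit().summary())
```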

The study evaluated four prominent national programs designed to address inequalities and citizen dissatisfaction. It found only one program to be effective, while the other three either had no impact or worsened the negative link between inequality and quality of local government services.

As socioeconomic disparities widened, the study found that citizen satisfaction with local government programs declined significantly. Poorer territories experienced greater dissatisfaction while wealthier citizens were less affected, as they relied less on government services for daily needs.

Extreme socioeconomic inequalities also constrained local governments’ ability to deliver effective services. Limited resources, inadequate personnel and insufficient infrastructure hindered their capacity to address diverse community needs. Despite significant investments by the Chilean national government to improve infrastructure and public services, many initiatives failed to bridge the gap.

The study, Andersson said, highlights the necessity of strategic, targeted interventions to break the cycle of inequality and enhance public satisfaction with local governance.

“These findings underscore the challenge faced by national governments trying to address inequalities. Simply increasing earmarked funding to local governments may not be sufficient. We see the importance of carefully designed policies and strengthened local governance structures to improve service delivery and address persistent socioeconomic inequalities,” he said.

The research was supported by grants from the U.S. National Science Foundation and the National Fund for Scientific and Technological Development in Chile.