Saturday, May 04, 2024

 

Quantifying U.S. health impacts from gas stoves


People with gas and propane stoves breathe more unhealthy nitrogen dioxide


Peer-Reviewed Publication

STANFORD UNIVERSITY

IMAGE: Stanford PhD student Metta Nicholson observes a gas burner in a home where scientists measured air pollution as part of their data collection in California, Texas, Colorado, New York, and Washington, D.C.

CREDIT: Rob Jackson, Stanford Doerr School of Sustainability




Households with gas or propane stoves regularly breathe unhealthy levels of nitrogen dioxide, a study of air pollution in U.S. homes found.

“I didn’t expect to see pollutant concentrations breach health benchmarks in bedrooms within an hour of gas stove use, and stay there for hours after the stove is turned off,” said Stanford Doerr School of Sustainability Professor Rob Jackson, senior author of the May 3 study in Science Advances. Pollution from gas and propane stoves isn’t just an issue for cooks or people in the kitchen, he said. “It’s the whole family’s problem.”

Among other negative health effects, breathing high levels of nitrogen dioxide, or NO2, over time can intensify asthma attacks and has been linked to decreased lung development in children and early deaths.

Although most exposure to NO2 is caused by cars and trucks burning fossil fuels, the researchers estimate that the mix of pollutants coming from gas and propane stoves overall may be responsible for as many as 200,000 current childhood asthma cases. One quarter of these can be attributed to nitrogen dioxide alone, according to the paper’s authors, who include scientists from Central California Asthma Collaborative, PSE Healthy Energy, and the Harvard T.H. Chan School of Public Health.

“We found that just how much gas you burn in your stove is by far the biggest factor affecting how much you’re exposed. And then, after that, do you have an effective range hood – and do you use it?” said lead study author Yannai Kashtan, a PhD student in Earth system science.

Little room for additional exposure

Beyond asthma cases, the long-term exposure to NO2 in American households with gas stoves is high enough to cause thousands of deaths each year – possibly as many as 19,000, or about 40% of the number of deaths linked annually to secondhand smoke. This estimate is based on the researchers’ new measurements and calculations of how much nitrogen dioxide people breathe at home because of gas stoves, and on the best available data on deaths from long-term exposure to outdoor NO2, which is regulated by the U.S. Environmental Protection Agency.

The death toll estimate is approximate in part because it does not factor in the harmful effects of repeated exposure to extremely high levels of nitrogen dioxide in short bursts, as occurs in homes with gas stoves. It also relies on past studies of health impacts from nitrogen dioxide encountered outdoors, where additional pollutants from vehicles and power plants are present.

The researchers used sensors to measure concentrations of NO2 throughout more than 100 homes of various sizes, layouts, and ventilation methods, before, during, and after stove use. They incorporated these measurements and other data into a model powered by National Institute of Standards and Technology (NIST) software known as CONTAM, which simulates airflow, contaminant transport, and room-by-room occupant exposure in buildings. This allowed them to estimate nationwide averages and short-term exposures under a range of realistic conditions and behaviors, and to cross-check model outputs against their home measurements.
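
The study itself relied on NIST’s multi-zone CONTAM simulations; as a rough illustration of the underlying mass-balance idea, the sketch below implements a much simpler single-zone, well-mixed box model of indoor NO2. The emission rate, home volume, air-exchange rate, and cooking time are hypothetical placeholders, not values from the paper.

```python
# Minimal single-zone ("well-mixed box") model of indoor NO2. This is an
# illustrative simplification, not the multi-zone CONTAM model used in the
# study, and every parameter value below is a hypothetical placeholder.

import numpy as np

def simulate_no2(emission_ug_per_min=1000.0,  # assumed NO2 release while the burner is on
                 volume_m3=250.0,             # assumed whole-home volume
                 ach_per_hr=0.5,              # assumed air changes per hour (ventilation)
                 cook_minutes=45,
                 total_minutes=360,
                 dt_min=1.0):
    """Return indoor NO2 concentration (ppb) minute by minute for one cooking event."""
    ug_m3_to_ppb = 24.45 / 46.01              # at ~25 C and 1 atm; NO2 molar mass 46.01 g/mol
    loss_per_min = ach_per_hr / 60.0          # first-order removal by air exchange
    conc_ug_m3 = 0.0
    series_ppb = []
    for minute in np.arange(0.0, total_minutes, dt_min):
        emission = emission_ug_per_min if minute < cook_minutes else 0.0
        # Well-mixed box: dC/dt = E/V - k*C, stepped with simple Euler integration.
        conc_ug_m3 += dt_min * (emission / volume_m3 - loss_per_min * conc_ug_m3)
        series_ppb.append(conc_ug_m3 * ug_m3_to_ppb)
    return np.array(series_ppb)

if __name__ == "__main__":
    ppb = simulate_no2()
    print(f"peak NO2: {ppb.max():.0f} ppb; 6 hours after the start: {ppb[-1]:.0f} ppb")
```

The same toy model makes the study’s qualitative points visible: concentrations keep decaying for hours after the burner is off, and halving the home volume roughly doubles the simulated exposure for the same stove use.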

The results show that nationwide, typical use of a gas or propane stove increases exposure to nitrogen dioxide by an estimated 4 parts per billion, averaged over a year. That’s three quarters of the way to the nitrogen dioxide exposure level that the World Health Organization recognizes as unsafe in outdoor air. “That’s excluding all outdoor sources combined, so it makes it much more likely you’re going to exceed the limit,” said Kashtan.
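
For readers who want to check that comparison: the WHO annual-mean guideline for NO2 is 10 micrograms per cubic meter, which converts to roughly 5 parts per billion at room conditions. The back-of-the-envelope calculation below (ours, not the study’s) shows why a 4 ppb increment lands about three quarters of the way to that benchmark.

```python
# Back-of-the-envelope check of the "three quarters of the way" comparison.
# Assumes the WHO annual-mean NO2 guideline of 10 micrograms per cubic metre
# and room conditions (~25 C, 1 atm) for the ppb conversion.

WHO_ANNUAL_UG_M3 = 10.0      # WHO 2021 annual air quality guideline for NO2
MOLAR_VOLUME_L = 24.45       # litres of air per mole at ~25 C and 1 atm
NO2_MOLAR_MASS_G = 46.01     # grams per mole

who_guideline_ppb = WHO_ANNUAL_UG_M3 * MOLAR_VOLUME_L / NO2_MOLAR_MASS_G  # ~5.3 ppb
stove_increment_ppb = 4.0    # the study's estimated national-average increase

print(f"WHO annual guideline is roughly {who_guideline_ppb:.1f} ppb")
print(f"a 4 ppb increment is {stove_increment_ppb / who_guideline_ppb:.0%} of that guideline")
```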

Understanding how gas stoves affect health

The study is the latest in a series from Jackson’s group at Stanford looking at indoor air pollution from gas stoves. Earlier studies documented the rate at which gas stoves emit other pollutants, including the greenhouse gas methane and the carcinogen benzene. But to understand the implications of stove emissions for human health, the researchers needed to find out how much pollutants spread through a home, build up, and eventually dissipate. “We’re moving from measuring how much pollution comes from stoves to how much pollution people actually breathe,” said Jackson, who is the Michelle and Kevin Douglas Provostial Professor in Earth System Science.

With any fuel source, particle pollution can rise from food cooking in a hot pan. The new research confirms that food emits little or no nitrogen dioxide as it cooks, however, and electric stoves produce no NO2. “It’s the fuel, not the food,” said Jackson. “Electric stoves emit no nitrogen dioxide or benzene. If you own a gas or propane stove, you need to reduce pollutant exposures using ventilation.”

Home size matters

Even in larger homes, concentrations of nitrogen dioxide routinely spiked to unhealthy levels during and after cooking, even when a range hood was on and venting air outdoors. But people who live in homes smaller than 800 square feet – about the size of a small two-bedroom apartment – are exposed to twice as much nitrogen dioxide over the course of a year as the national average, and four times more than those living in the largest homes, upwards of 3,000 square feet.

Because home size makes such a difference, there are also differences in exposure across racial, ethnic, and income groups. Compared to the national average, the researchers found long-term NO2 exposure is 60% higher among American Indian and Alaska Native households, and 20% higher among Black and Hispanic or Latino households. This exposure to indoor air pollution from gas stoves compounds the fact that exposure to outdoor sources of nitrogen dioxide pollution, such as vehicle exhaust, is also typically higher among people in poorer, often minority, communities.

“People in poorer communities can’t always afford to change their appliances, or perhaps they rent and can’t replace appliances because they don’t own them,” Jackson said. “People in smaller homes are also breathing more pollution for the same stove use.”

 

Jackson is also a senior fellow at the Stanford Woods Institute for the Environment and Precourt Institute for Energy. Additional Stanford co-authors include Metta Nicholson, a PhD student in the Emmett Interdisciplinary Program in Environment and Resources (E-IPER); Colin Finnegan, a laboratory manager in the Department of Earth System Science; and Earth system science postdoctoral scholars Zutao Ouyang and Anchal Garg.

This research was supported by HT, LLC.


Colin Finnegan of the Stanford Doerr School of Sustainability checks a simmering pot on a gas stove that does not have a range hood. Using a range hood that vents air to the outdoors can dramatically influence how much nitrogen dioxide fills the air in a home.

 

Physics confirms that the enemy of your enemy is, indeed, your friend


New study is first to use statistical physics to corroborate 1940s social balance theory


Peer-Reviewed Publication

NORTHWESTERN UNIVERSITY




Most people have heard the famous phrase “the enemy of my enemy is my friend.”

Now, Northwestern University researchers have used statistical physics to confirm the theory that underlies this famous axiom.

The study will be published on May 3 in the journal Science Advances.

In the 1940s, Austrian psychologist Fritz Heider introduced social balance theory, which explains how humans innately strive to find harmony in their social circles. According to the theory, four rules — an enemy of an enemy is a friend, a friend of a friend is a friend, a friend of an enemy is an enemy and, finally, an enemy of a friend is an enemy — lead to balanced relationships.
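
Heider’s four rules reduce to a simple sign rule: label each relationship +1 (friend) or -1 (enemy), and a triangle of three people is balanced exactly when the product of its three signs is positive. The snippet below is a minimal sketch of that rule for illustration; it is not code from the study.

```python
# Heider's balance rule for a triad: a triangle of relationships is balanced
# when the product of its edge signs (+1 for friend, -1 for enemy) is positive.

from itertools import product

def is_balanced(sign_ab: int, sign_bc: int, sign_ca: int) -> bool:
    return sign_ab * sign_bc * sign_ca > 0

# Enumerating all eight sign patterns shows that exactly four are balanced,
# matching Heider's four rules.
for signs in product((+1, -1), repeat=3):
    print(signs, "balanced" if is_balanced(*signs) else "imbalanced")
```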

Although countless studies have tried to confirm this theory using network science and mathematics, their efforts have fallen short, because real networks deviate from perfectly balanced relationships. Hence, the real question is whether social networks are more balanced than expected under an adequate network model. Most network models, however, were too simplified to capture the complexities of human relationships that affect social balance, and they yielded inconsistent answers as to whether the deviations observed from model expectations are in line with the theory of social balance.

The Northwestern team, however, successfully integrated the two key pieces that make Heider’s social framework work. In real life, not everyone knows each other, and some people are more positive than others. Researchers have long known that each factor influences social ties, but existing models could only account for one factor at a time. By simultaneously incorporating both constraints, the researchers’ resulting network model finally confirmed the famous theory some 80 years after Heider first proposed it.

The useful new framework could help researchers better understand social dynamics, including political polarization and international relations, as well as any system that comprises a mixture of positive and negative interactions, such as neural networks or drug combinations.

“We have always thought this social intuition works, but we didn’t know why it worked,” said Northwestern’s István Kovács, the study’s senior author. “All we needed was to figure out the math. If you look through the literature, there are many studies on the theory, but there’s no agreement among them. For decades, we kept getting it wrong. The reason is because real life is complicated. We realized that we needed to take into account both constraints simultaneously: who knows whom and that some people are just friendlier than others.”

“We can finally conclude that social networks align with expectations that were formed 80 years ago,” added Bingjie Hao, the study’s first author. “Our findings also have broad applications for future use. Our mathematics allows us to incorporate constraints on the connections and the preference of different entities in the system. That will be useful for modeling other systems beyond social networks.”

Kovács is an assistant professor of Physics and Astronomy at Northwestern’s Weinberg College of Arts and Sciences. Hao is a postdoctoral researcher in his laboratory.

What is social balance theory?

Using groups of three people, Heider’s social balance theory maintains the assumption that humans strive for comfortable, harmonious relationships. In balanced relationships, all people like each other. Or, if one person dislikes two people, those two are friends. Imbalanced relationships exist when all three people dislike each other, or when one person likes two people who dislike each other, leading to anxiety and tension. Studying such frustrated systems contributed to the 2021 Nobel Prize in Physics awarded to Italian theoretical physicist Giorgio Parisi, who shared the prize with climate modelers Syukuro Manabe and Klaus Hasselmann.

“It seems very aligned with social intuition,” Kovács said. “You can see how this would lead to extreme polarization, which we do see today in terms of political polarization. If everyone you like also dislikes all the people you don’t like, then that results in two parties that hate each other.”

However, it has been challenging to collect large-scale data where not only friends but also enemies are listed. With the onset of Big Data in the early 2000s, researchers tried to see if such signed data from social networks could confirm Heider’s theory. When generating networks to test Heider’s rules, individual people serve as nodes. The edges connecting nodes represent the relationships among individuals. 

If the nodes are not friends, then the edge between them is assigned a negative (or hostile) value. If the nodes are friends, then the edge is marked with a positive (or friendly) value. In previous models, edges were assigned positive or negative values at random, without respecting both constraints. None of those studies accurately captured the realities of social networks.

Finding success in constraints

To explore the problem, Kovács and Hao turned to four large-scale, publicly available signed network datasets previously curated by social scientists, including data from (1) user-rated comments on social news site Slashdot; (2) exchanges among Congressional members on the House floor; (3) interactions among Bitcoin traders; and (4) product reviews from consumer review site Epinions.

In their network model, Kovács and Hao did not assign truly random negative or positive values to the edges. For every interaction to be random, every node would need an equal chance of encountering every other node. In real life, however, not everyone actually knows everyone else within a social network. For example, a person might never encounter their friend’s friend who lives on the other side of the world.

To make their model more realistic, Kovács and Hao distributed positive or negative values based on a statistical model that describes the probability of assigning positive or negative signs to the interactions that exist. That kept the values random — but random within limits given by constraints of the network topology. In addition to who knows whom, the team took into account that some people in life are just friendlier than others. Friendly people are more likely to have more positive — and fewer hostile — interactions.
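
The authors’ full statistical model is in the released code linked below; purely as an illustration of the idea, the sketch here keeps the who-knows-whom topology fixed, redraws each edge sign with a probability that depends on how “friendly” its two endpoints are, and compares the fraction of balanced triangles in the observed network against that constrained null. The toy graph, friendliness scores, and sign probabilities are invented for demonstration.

```python
# Illustrative sign-randomization null model: keep who-knows-whom fixed and
# draw edge signs with node-dependent "friendliness" propensities.
# This is a toy sketch of the paper's idea, not its actual statistical model.

import itertools
import random

import networkx as nx

def balanced_fraction(graph: nx.Graph) -> float:
    """Fraction of triangles whose product of edge signs is positive."""
    balanced = total = 0
    for a, b, c in (t for t in itertools.combinations(graph.nodes, 3)
                    if graph.has_edge(t[0], t[1]) and graph.has_edge(t[1], t[2])
                    and graph.has_edge(t[0], t[2])):
        balanced += graph[a][b]["sign"] * graph[b][c]["sign"] * graph[a][c]["sign"] > 0
        total += 1
    return balanced / total if total else float("nan")

def randomize_signs(graph: nx.Graph, friendliness: dict) -> nx.Graph:
    """Resample edge signs, keeping topology; a positive sign is more likely
    when both endpoints have high (hypothetical) friendliness scores."""
    null = graph.copy()
    for u, v in null.edges:
        p_positive = (friendliness[u] + friendliness[v]) / 2
        null[u][v]["sign"] = 1 if random.random() < p_positive else -1
    return null

if __name__ == "__main__":
    random.seed(0)
    # Toy "observed" network: a small random topology with random signs.
    g = nx.erdos_renyi_graph(30, 0.2, seed=1)
    for u, v in g.edges:
        g[u][v]["sign"] = random.choice([1, -1])
    friendliness = {n: random.uniform(0.2, 0.8) for n in g.nodes}

    observed = balanced_fraction(g)
    null_mean = sum(balanced_fraction(randomize_signs(g, friendliness))
                    for _ in range(200)) / 200
    print(f"observed balanced fraction: {observed:.3f}, null expectation: {null_mean:.3f}")
```

The question the paper asks is whether real signed networks show a higher balanced fraction than such a constrained null ensemble predicts, rather than comparing against unconstrained random signs.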

By introducing these two constraints, the resulting model showed that large-scale social networks consistently align with Heider’s social balance theory. The model also highlighted patterns beyond three nodes. It shows that social balance theory applies to larger graphlets, which involve four and possibly even more nodes.

“We know now that you need to take into account these two constraints,” Kovács said. “Without those, you cannot come up with the right mechanisms. It looks complicated, but it’s actually fairly simple mathematics.”

Insights into polarization and beyond

Kovács and Hao currently are exploring several future directions for this work. In one potential direction, the new model could be used to explore interventions aimed at reducing political polarization. But the researchers say the model could help better understand systems beyond social groups and connections among friends.

“We could look at excitatory and inhibitory connections between neurons in the brain or interactions representing different combinations of drugs to treat disease,” Kovács said. “The social network study was an ideal playground to explore, but our main interest is to go beyond investigating interactions among friends and look at other complex networks.”

The code and data behind the paper, “Proper network randomization is key to assessing social balance,” are available on GitHub: https://github.com/hbj153/signed_null

 

Stony coral tissue loss disease is shifting the ecological balance of Caribbean reefs



UNIVERSITY OF FLORIDA
IMAGE: A coral reef in Belize showing dead coral, living healthy elkhorn coral (Acropora palmata), both diseased and healthy symmetrical brain corals (Pseudodiploria strigosa), and a spotfin butterflyfish. Although some fish may benefit from stony coral tissue loss disease in the short term, the erosion of dead coral skeletons over time poses a grave threat to reef fish communities.

CREDIT: Sara Swaminathan




The outbreak of a deadly disease called stony coral tissue loss disease is destroying susceptible species of coral in the Caribbean while helping other, “weedier” organisms thrive — at least for now — according to a new study published today in Science Advances.

Researchers say the drastic change in the region’s population of corals is sure to disrupt the delicate balance of the ecosystem and threaten marine biodiversity and coastal economies.

“Some fast-growing organisms, like algae, might thrive in the short term,” said the study’s lead author, Sara Swaminathan, an environmental engineering sciences Ph.D. candidate at the University of Florida. “But the loss of the susceptible corals could have long-lasting repercussions.”

Stony coral tissue loss disease, or SCTLD, was first reported off the coast of Miami in 2014 and has since moved throughout the Caribbean, having been identified in 18 countries and territories. Scientists don’t know what causes the disease, but it is thought to be a waterborne pathogen that spreads rapidly across the surface of the coral colony until, in most cases, no living tissue remains.

The research team analyzed existing data from the U.S. Virgin Islands as well as data from other U.S. regions in the Caribbean and western Atlantic, including Florida, Puerto Rico, and the Dry Tortugas. They examined the effects of stony coral tissue loss disease on fish and benthic reef communities, which comprise everything living on the sea floor, such as coral, algae, and sponges.

They found that the disease not only reduces susceptible coral populations but also diminishes crustose coralline algae, the resilient pink crust that is crucial for building reef structure. Consequently, certain fast-growing, weedy species, including macroalgae, cyanobacteria, and fire coral, thrive in the absence of competitors, spreading into the vacant spaces left by the decimated corals.

Swaminathan explained that the fast-growing species benefiting from the disease create a more seaweed-dominated environment in place of the rock-hard reef structures.

“Macroalgae doesn’t support as much biodiversity because it doesn’t create a hard habitat,” she said. “It might be a positive for herbivores but not for other organisms that need places to settle and grow, hide, or mate.”

In fact, the researchers found that the disease’s impact varies among types of fish, and some associations are positive. The study revealed that the rugosity of the coral — the roughness of the habitat — matters more to the fish than whether the coral is alive or dead.

“Some infectious diseases can affect entire ecosystems, but SCTLD is particularly impactful,” said Kevin Lafferty, a senior scientist with the U.S. Geological Survey and leading expert on marine diseases. “And its impacts are complex, with some winners among the losers.”

In addition to Swaminathan and Lafferty, the research team included Andrew Altieri, an assistant professor in the Herbert Wertheim College of Engineering and associate director of UF’s Center for Coastal Solutions, and Nicole Knight, a post-doctoral researcher at the University of British Columbia. The work was funded by the U.S. Geological Survey and the National Science Foundation.

Across the Caribbean, international agencies, marine biologists and conservationists are responding to the environmental crisis, working to understand the mechanisms driving stony coral tissue loss disease and developing potential treatments. Probiotics have shown promise in slowing down or stopping active lesions when applied to affected corals, and so far, the disease has not spread beyond the western Atlantic and Caribbean.

“Coral reefs are biodiversity hotspots that serve as vital habitats for countless marine species, providing food, shelter, and breeding grounds,” Altieri said. “They are natural protectors of our coastlines, and the loss of these reefs threatens the livelihoods of coastal communities dependent on fishing and tourism. We need to protect and preserve them.”


 

Research quantifies “gap” in carbon removal for first time



UNIVERSITY OF EAST ANGLIA




New research involving the University of East Anglia (UEA) suggests that countries’ current plans to remove CO2 from the atmosphere will not be enough to comply with the 1.5 ºC warming limit set out under the Paris Agreement.

Since 2010, the United Nations Environment Programme (UNEP) has taken an annual measurement of the emissions gap - the difference between countries’ climate protection pledges and what is necessary to limit global heating to 1.5 ºC, or at least below 2 ºC.

The UNEP Emissions Gap Reports are clear: climate policy needs more ambition. This new study now explicitly applies this analytical concept to carbon dioxide removal (CDR) - the removal of the most important greenhouse gas, CO2, from the atmosphere.

The study, published today in the journal Nature Climate Change, was led by the Berlin-based Mercator Research Institute on Global Commons and Climate Change (MCC) and involved an international team of scientists.

“In the Emissions Gap Reports, carbon removals are only accounted for indirectly,” said lead author Dr William Lamb, of the MCC Applied Sustainability Science working group.

“After all, the usual benchmark for climate protection pledges is net emissions, ie emissions minus removals. We are now making transparent the specific ambition gap in scaling up removals.

“This planetary waste management will soon place completely new requirements on policymakers and may even become a central pillar of climate protection in the second half of the century.”

Co-author Dr Naomi Vaughan, of the Tyndall Centre for Climate Change Research at UEA, added: “Carbon dioxide removal methods have a small but vital role to play in achieving net zero and limiting the impacts of climate change.

“Our analysis shows that countries need more awareness, ambition and action on scaling up CDR methods together with deep emissions reductions to achieve the aspirations of the Paris Agreement.”

According to the study, if national targets are fully implemented, annual human-induced carbon removals could increase by a maximum of 0.5 gigatonnes of CO2 (500 million tonnes) by 2030, and by a maximum of 1.9 gigatonnes by 2050.

This contrasts with the 5.1 gigatonne increase required in a ‘focus scenario’ that the research team identifies as typical of the latest Intergovernmental Panel on Climate Change (IPCC) assessment report.

There, global heating, calculated over the entire course of this century, is limited to 1.5 ºC, and a particularly rapid expansion of renewable energies and reduction of fossil emissions is depicted as the core climate protection strategy.

But, the focus scenario still relies on scaling up carbon removals. The gap for the year 2050 is therefore at least 3.2 gigatonnes of CO2 (5.1 minus a maximum of 1.9).
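
The “gap” itself is simple arithmetic: the increase in removals a scenario requires, minus the maximum increase that fully implemented national targets would deliver. A quick check using only the 2050 figures quoted above:

```python
# Carbon dioxide removal (CDR) gap for 2050, using only the figures quoted above.
required_increase_gt = 5.1   # focus scenario: extra removals needed by 2050 (Gt CO2 per year)
pledged_increase_gt = 1.9    # maximum increase if national targets are fully implemented
gap_gt = required_increase_gt - pledged_increase_gt
print(f"CDR gap in 2050: {gap_gt:.1f} Gt CO2 per year")   # -> 3.2, as stated above
```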

An alternative focus scenario, also derived from the IPCC, assumes a significant reduction in global energy demand, due to politically initiated behaviour changes as the core element of climate protection strategy.

Here, carbon removals would increase by a more modest amount: 2.5 gigatonnes in 2050. Fully implemented national targets would be close to sufficient when compared to this scenario, with a gap in 2050 of 0.4 gigatonnes.

The research team points out the problem of sustainability limits in scaling up carbon removals; for example, the associated land area demand will come to jeopardise biodiversity and food security. Nevertheless, there is still plenty of room for designing fair and sustainable land management policies.

In addition, novel carbon removal options, such as air filter systems, or ‘enhanced rock weathering’, have hardly been promoted by politicians to date.

They currently remove only 0.002 gigatonnes of CO2 per year from the atmosphere, compared to 3 gigatonnes through conventional options such as afforestation, and they are unlikely to increase significantly by 2030. According to the scenarios, they must eventually become more prevalent than conventional options.

Since only 40 countries have so far quantified their removal plans in their long-term low emissions development strategies, the study also draws on other national documents and best-guess assumptions.

“The calculation should certainly be refined,” said Dr Lamb. “But our proposal using the focus scenarios further opens the discourse on how much carbon removal is necessary to meet the Paris Agreement.

“This much is clear: without a rapid reduction in emissions towards zero, across all sectors, the 1.5 ºC limit will not be met under any circumstances.”

‘The carbon dioxide removal gap’, by Lamb, W., Gasser, T., Roman-Cuesta, R., Grassi, G., Gidden, M., Powis, C., Geden, O., Nemet, G., Pramata, Y., Riahi, K., Smith, S., Steinhauser, J., Vaughan, N., Smith, H., and Minx, J., is published in Nature Climate Change on May 3.

New MSU research: Are carbon-capture models effective?



MICHIGAN STATE UNIVERSITY






EAST LANSING, Mich. – Reforestation efforts to restock depleted forests are important for addressing climate change and for capturing and storing carbon from the Earth’s atmosphere. These types of solutions to mitigate carbon emissions are critical after 2023 proved to be the warmest year on record. However, some of the models behind them have been found to be inaccurate.

New research from Michigan State University has found that the carbon removal potential estimated by some reforestation models has been exaggerated, not by a small margin but by as much as a factor of three. Meanwhile, the goal set by the Paris Agreement in 2015 to limit global warming to 1.5 degrees Celsius is now close to being surpassed.

Runsheng Yin is a professor in the Department of Forestry in the College of Agriculture and Natural Resources, where he specializes in the evaluation of ecosystem services, ecological restoration and general natural resource economics and policy. Yin has published new research identifying that carbon-capture models have not factored in what happens to the timber after trees are logged.

“The climate crisis is heightening, with 2023 representing the warmest year on record,” Yin said. “Nature-based solutions have an important role to play in helping us stem the worst impacts of climate change — but rigorously assessed methods to reliably offset and reduce greenhouse gas emissions could not be more urgent. As the practice becomes increasingly monetized, it is crucial that the accounting is done accurately.”

Yin examined an intensively managed pine plantation in the southern United States. People or businesses can receive carbon credits when they reduce, avoid or remove greenhouse gas emissions.

He found that the amount of carbon offset credits a landowner can receive is exaggerated by a factor of at least 2.76, indicating that the offsets are less effective than previously thought. The research suggests this overestimation extends to forestation projects well beyond that one plantation.

Specifically, the amount of time it takes for carbon to return to the atmosphere post-logging can differ depending on different wood products, whether that is paper, plywood or furniture. These types of wood degrade at different rates, and carbon must be stored for a specific time period for credits to be earned.
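
A common way to account for this is to treat each harvested wood product pool as decaying at its own rate, typically modeled as first-order decay with product-specific half-lives (IPCC defaults are roughly 35 years for sawnwood, 25 for wood panels, and 2 for paper). The sketch below illustrates that generic accounting idea; it is not Yin’s model, and the harvest split among products is invented.

```python
# Generic first-order decay accounting for carbon stored in harvested wood
# products (HWP). Half-lives follow commonly cited IPCC defaults; the harvest
# split among products is a made-up example, not data from Yin's study.

import math

HALF_LIVES_YR = {"sawnwood": 35.0, "panels": 25.0, "paper": 2.0}

def carbon_remaining(initial_tC: dict, years: float) -> dict:
    """Tonnes of carbon still stored in each product pool after `years`."""
    return {
        product: amount * math.exp(-math.log(2) / HALF_LIVES_YR[product] * years)
        for product, amount in initial_tC.items()
    }

if __name__ == "__main__":
    harvest = {"sawnwood": 60.0, "panels": 25.0, "paper": 15.0}  # hypothetical tC split
    for horizon in (10, 50, 100):
        pools = carbon_remaining(harvest, horizon)
        print(f"after {horizon:>3} yr: {sum(pools.values()):.1f} tC still stored", pools)
```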

Alternatively, there also can be an issue when forests are kept intact, since people do not always consider the holding costs and the ability to use wood in lieu of fossil fuels, according to Yin.

“My research shows that its potential may not be as great as some analysts have claimed,” said Yin, when referring to carbon capturing. “That’s because existing studies have largely failed to conform to the accounting principles of the Paris Agreement, treat timber and carbon as joint products appropriately and consider how long each of the resulting wood products will store its carbon.”

The finding signals the need for more accurate and specific assessments of carbon storage, specifically calculating how long the carbon stored in wood products will last before returning to the atmosphere.

Forests will continue to play an important role in carbon sequestration, and companies should be encouraged, including through financial incentives, to invest in reforestation to offset carbon emissions.

Therefore, Yin recommends strengthening national and international efforts, and even creating dedicated programs, to assess carbon capture. His research warns that removing carbon by storing it in timber will not be enough on its own, and that these important actions must be subject to proper oversight and evaluation of their effectiveness.


 

Seismic waves used to track LA’s groundwater recharge after record wet winter



SEISMOLOGICAL SOCIETY OF AMERICA




Record-setting storms in 2023 filled California’s major reservoirs to the brim, providing some relief in a decades-long drought, but how much of that record rain trickled underground?

Shujuan Mao of Stanford University and her colleagues used a surprising technique to answer this question for the greater Los Angeles metropolitan area. They analyzed changes in the velocity of seismic waves traveling through the LA basin, tracking these changes in space and time between January and October 2023.

As Mao reported at the Seismological Society of America (SSA)’s 2024 Annual Meeting, their study found that groundwater levels almost completely recovered at very shallow depths—about 50 meters below the surface. However, only about 25% of the groundwater lost over the past two decades was replenished at about 300 meters and deeper, likely because it is more challenging for stormwater to percolate into deeper layers of the earth.

“That means that a single epic year of storms is not enough to restore the groundwater depletion accumulated over the recent droughts. It takes many more wet years for the deep aquifer to fully recover,” Mao said.

Groundwater contributes more than 60% of the water supply used in California during drought conditions, she noted.

Mao and her colleagues are pioneering the use of seismic data to understand groundwater levels in the Los Angeles basin, as a complement to other methods used to measure groundwater levels. The most traditional method involves digging wells, which is expensive and only offers a “point scale measurement,” Mao explained. “You don’t know what the level is between two wells, or in other aquifer layers shallower or deeper relative to your well.”

More recently, researchers have been using satellite measurements to detect small changes in Earth’s surface deformation or gravity field, which works well to infer changes in groundwater storage over time and area. “These surface measurements couldn’t tell us what’s happening at different depths,” said Mao, “but that’s where we as seismologists can help.”

With data from 65 broadband seismographs in the Southern California Seismic Network, the researchers looked at changes in the propagation speed of seismic waves. The data they used are the “background” seismic vibrations generated by the oceans, winds, and human activity—not the seismic waves associated with earthquakes.

“These background seismic vibrations are continuous, which allow us to measure and monitor the seismic velocity changes continuously,” Mao said.

Seismic wave speed varies with the mechanical state of materials that the waves are passing through. When the groundwater level increases, the pressure in the porous space among rocks increases, and the seismic waves propagate slower through this porous rock. 
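
In practice, seismologists usually turn the continuous noise into repeated cross-correlation functions between station pairs and then measure the tiny time “stretch” between a reference correlation and later ones, which translates into a relative velocity change (dv/v) of a fraction of a percent. The snippet below is a minimal sketch of that stretching measurement on synthetic data, illustrating the general technique rather than the processing used in this study.

```python
# Minimal "stretching method" sketch: recover a small relative velocity change
# (dv/v) by finding the time-stretch that best aligns a current noise
# correlation with a reference one. Synthetic data; illustrative only.

import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0, 20.0, 4000)                     # lag time (seconds)

# Synthetic "reference" correlation: smoothed, decaying noise coda.
reference = np.exp(-t / 8.0) * rng.standard_normal(t.size)
reference = np.convolve(reference, np.hanning(25), mode="same")

def stretched(trace, times, eps):
    """Resample `trace` as if its travel times were scaled by (1 + eps)."""
    return np.interp(times * (1.0 + eps), times, trace, left=0.0, right=0.0)

# Fake "current" correlation: the reference carrying a known 0.5% change.
true_dvv = 0.005
current = stretched(reference, t, true_dvv)

# Grid-search the stretch factor that maximizes correlation with the current trace.
candidates = np.linspace(-0.02, 0.02, 801)
cc = [np.corrcoef(stretched(reference, t, eps), current)[0, 1] for eps in candidates]
estimate = candidates[int(np.argmax(cc))]

print(f"true dv/v: {true_dvv:.4f}, recovered: {estimate:.4f}")
```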

The researchers found that their estimates of groundwater storage calculated from seismic velocity change compared well with groundwater storage measurements from well and satellite data.

The researchers also found prominent aquifer replenishment in San Gabriel Valley and Raymond Basin, likely due to surface or subsurface flows from the San Gabriel mountains.

The combination of a dense seismic network and a pressing water shortage made Los Angeles “an ideal place to showcase how existing seismic data can contribute to the monitoring, understanding, and management of groundwater aquifers,” said Mao.

Seismic data would likely be integrated with many types of measurements to produce the comprehensive picture of groundwater dynamics needed to manage the valuable resource in a data-informed and sustainable manner, she added.

Mao, who will be an assistant professor at The University of Texas at Austin in August, said she will apply seismic techniques in Austin to monitor how aquifers in that region respond to artificial recharge operations.

UTA scientists test for quantum nature of gravity


Research at the south pole studied the mysterious quantum structure of space and time


UNIVERSITY OF TEXAS AT ARLINGTON

IMAGE: IceCube Lab under the stars in Antarctica.

CREDIT: Martin Wolf, IceCube/NSF




Einstein’s theory of general relativity explains that gravity arises from the curvature of space and time. The most familiar manifestation of this is the Earth’s gravity, which keeps us on the ground and explains why balls fall to the floor and individuals have weight when stepping on a scale.

In the field of high-energy physics, on the other hand, scientists study tiny invisible objects that obey the laws of quantum mechanics—characterized by random fluctuations that create uncertainty in the positions and energies of particles like electrons, protons and neutrons. Understanding the randomness of quantum mechanics is required to explain the behavior of matter and light on a subatomic scale.

For decades, scientists have been trying to unite those two fields of study to achieve a quantum description of gravity. This would combine the physics of curvature associated with general relativity with the mysterious random fluctuations associated with quantum mechanics.

A new study in Nature Physics from physicists at The University of Texas at Arlington reports on a deep new probe into the interface between these two theories, using ultra-high energy neutrino particles detected by a particle detector set deep into the Antarctic glacier at the south pole.

“The challenge of unifying quantum mechanics with the theory of gravitation remains one of the most pressing unsolved problems in physics,” said co-author Benjamin Jones, associate professor of physics. “If the gravitational field behaves in a similar way to the other fields in nature, its curvature should exhibit random quantum fluctuations.”

Jones and UTA graduate students Akshima Negi and Grant Parker were part of an international IceCube Collaboration team that included more than 300 scientists from around the U.S., as well as Australia, Belgium, Canada, Denmark, Germany, Italy, Japan, New Zealand, Korea, Sweden, Switzerland, Taiwan and the United Kingdom.

To search for signatures of quantum gravity, the team placed thousands of sensors throughout a cubic kilometer of ice near the South Pole in Antarctica to monitor neutrinos, unusual but abundant subatomic particles that are neutral in charge and have almost no mass. The team was able to study more than 300,000 neutrinos. They were looking to see whether these ultra-high-energy particles, as they travel long distances across the Earth, are perturbed by the random quantum fluctuations in spacetime that would be expected if gravity were quantum mechanical.

“We searched for those fluctuations by studying the flavors of neutrinos detected by the IceCube Observatory,” Negi said. “Our work resulted in a measurement that was far more sensitive than previous ones (over a million times more, for some of the models), but it did not find evidence of the expected quantum gravitational effects.”
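
Searches of this kind are often framed with a simple phenomenological picture: standard two-flavor neutrino oscillations whose interference term is damped by a factor exp(-ΓL) if spacetime fluctuations scramble the quantum phase over the travel distance L. The sketch below illustrates that generic damping picture with assumed parameter values; it is not the IceCube collaboration’s analysis.

```python
# Toy two-flavor oscillation model with a phenomenological decoherence term.
# A quantum-gravity-like effect is often parameterized as a damping factor
# exp(-Gamma * L) on the oscillation interference term. Parameter values
# below are illustrative, not those constrained by the IceCube analysis.

import numpy as np

def survival_probability(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=0.99, gamma_per_km=0.0):
    """P(nu_mu -> nu_mu) for two flavors with exponential decoherence damping."""
    phase = 1.267 * dm2_eV2 * L_km / E_GeV          # standard oscillation phase (radians)
    damping = np.exp(-gamma_per_km * L_km)
    return 1.0 - 0.5 * sin2_2theta * (1.0 - damping * np.cos(2.0 * phase))

if __name__ == "__main__":
    L = 12_700.0                                    # km, roughly an Earth-crossing baseline
    for E in (1.0, 10.0, 100.0):                    # neutrino energy in GeV
        p_standard = survival_probability(L, E)
        p_decohered = survival_probability(L, E, gamma_per_km=1e-3)
        print(f"E = {E:6.1f} GeV: P_standard = {p_standard:.3f}, "
              f"P_decohered = {p_decohered:.3f}")
```

With strong damping the survival probability settles at the fully averaged value, so measuring flavor ratios across many energies and baselines constrains how large Γ could be.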

This non-observation of a quantum geometry of spacetime is a powerful statement about the still-unknown physics that operate at the interface of quantum physics and general relativity.

“This analysis represents the final chapter in UTA’s nearly decade-long contribution to the IceCube Observatory,” said Jones. “My group is now pursuing new experiments that aim to understand the origin and value of the neutrino’s mass using atomic, molecular and optical physics techniques.”