It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Thursday, November 04, 2021
Condensation clue explains how plants sense changing autumnal temperature
As the weather cools in autumn, you may notice the process of condensation, with water droplets forming on windows.
Researchers have found that a similar process—biophysical condensation—is happening inside plants and allows them to sense fluctuating temperatures.
The ability to sense seasonal changes is crucial for plants to grow and reproduce at the right time of year.
First author of the research Dr. Pan Zhu says, "Our findings are helpful for understanding how plants sense fluctuating temperature signals—and it is temperature fluctuation that is predicted to become more extreme with climate change."
One gene in particular is important for how plants remember changing seasons: Flowering Locus C (FLC). It acts as a brake on flowering, which is lifted in spring so the plant is ready to flower.
While a lot is known about how FLC is epigenetically switched off and stays turned off through winter ready for spring, less is known about the initial process known as 'transcriptional shutdown' when FLC DNA stops being used by the cell.
Now researchers have discovered how, under cold temperatures, a protein that is a known activator of FLC, called FRIGIDA, forms liquid droplets, called condensates, inside the nuclei of plant cells. By forming condensates in the cold, FRIGIDA is kept away from FLC and cannot activate it.
When temperatures are warm, the FRIGIDA protein is free to return to the FLC DNA and ensure the brake on flowering remains in place. This stops the plant from flowering too early during a warm spell in autumn.
The researchers from the group of Professor Dame Caroline Dean at the John Innes Centre uncovered the biophysical mechanism using the model plant Arabidopsis thaliana.
This research, which appears in Nature, gives new insight into how plants sense fluctuating temperatures at a time when this knowledge is important for crop improvements in the face of climate change.
This newly discovered role of biomolecular condensation in regulating plant gene expression suggests that the process may also allow plants to respond to other changes in their environment.
Professor Dean explains, "The dynamic molecular partitioning involving transcription regulators and noncoding RNA interactions is likely to be generally relevant for plant abiotic interactions, and therefore for crop productivity."
Experiments and findings in focus
The research team found high levels of FRIGIDA accumulate in the cold. Under closer inspection using microscopy they saw that it builds up in the nuclei of plant cells, where the DNA is housed, and forms biomolecular condensates. Biomolecular condensates are micron-scale compartments of concentrated proteins.
Further biochemical and visualization techniques, including using a timelapse temperature-controlled microscope, found that the protein condensates disappear within five hours of warm temperatures, but return after a six-hour period of cold. Experiments showed how condensation of FRIGIDA is responsible for the transcriptional shutdown at FLC.
The study also uncovered how the FRIGIDA condensates fit into the broader picture of regulation of FLC as Dr. Zhu explains, "Another interesting finding was that a specific isoform of long noncoding RNA COOLAIR, the antisense transcripts from FLC locus, contributes to the cold induced FRIGIDA nuclear condensate formation. This reveals one kind of mechanism for how COOLAIR-mediated FLC shutdown occurs during early vernalization."
For the first time, researchers from UCL Geography, the University of Nottingham and the UK Centre for Ecology & Hydrology combined large datasets with an environmental flow approach to predict how increases of 1–3°C in the Earth's temperature would impact 321 of the world's biggest river basins. Collectively, these cover around 50% of the Earth's land surface.
The research, published in Earth's Future, showed increasing risks of ecological change with warming, particularly for seasonal low flow periods.
The team's findings could help target hotspots for ecosystem conservation, through better understanding of the ecological risks from climate change-induced modifications to river flow.
Ecological conditions within the world's rivers are strongly controlled by the amount, variability and timing of water flowing within them. Changes in river flows impact on river depth and velocity, water chemistry and habitats, with implications for aquatic life and the ecosystem services provided by rivers to humans.
Using results from nine global hydrological models forced by five climate models, the researchers compared simulated river flows under a range of global warming scenarios with historical flows. They were able to project which basins were more likely to experience significant ecosystem change due to altered river flow.
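The ensemble-comparison idea can be sketched in a few lines of Python. To be clear, everything below is illustrative: the flow values, the 15% departure threshold and the majority-vote rule are invented stand-ins, not the study's actual environmental-flow metrics.

```python
# Sketch: flag a basin as "at risk" when most members of a
# multi-model ensemble project low flows that depart substantially
# from the historical baseline. All numbers are hypothetical.

historical_low_flow = 120.0  # historical low flow for one basin (m^3/s)

# The study used 9 hydrological x 5 climate models = 45 runs;
# a small made-up subset stands in for the full ensemble here.
ensemble_low_flows = [95.0, 102.0, 88.0, 110.0, 79.0, 125.0, 91.0, 84.0, 99.0]

threshold = 0.15  # flag runs departing >15% from the historical flow
flagged = [q for q in ensemble_low_flows
           if abs(q - historical_low_flow) / historical_low_flow > threshold]

# Call the basin "at risk" when a majority of runs is flagged.
at_risk = len(flagged) > len(ensemble_low_flows) / 2
print(f"{len(flagged)}/{len(ensemble_low_flows)} runs flagged; at risk: {at_risk}")
```

The same comparison, repeated per basin and per warming scenario, is the kind of screening that lets an analysis rank 321 basins by likelihood of ecological change.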
Rivers projected to be at most risk include the Amazon and Parana in South America, the Limpopo and Orange rivers in southern Africa and Australia's Darling River.
High-latitude northern hemisphere basins were found to be less at risk, although the team say this risk may be underestimated because the effects of melting permafrost (ground that remains continuously frozen) are not represented in most global hydrological models.
Across the globe, the larger the temperature increase, the greater the risk of change, particularly for low flows at the highest level of warming.
Lead author Professor Julian Thompson (UCL Geography) said that "climate change is expected to lead to an intensification of the hydrological cycle. Yet the climate change signal and its consequences vary throughout the world as revealed by our analysis. In some parts of the world there is large uncertainty."
Co-author Professor Simon Gosling (School of Geography, University of Nottingham) added that "the risks for some rivers are particularly high if the 1.5°C and 2°C goals of the UN Paris Climate Agreement are not met. This study's results highlight the need for ambitious reductions in global greenhouse gas emissions to avoid ecological degradation of some of the world's largest rivers."
Professor Thompson explained that aquatic ecosystems underpin numerous ecosystem services that benefit human communities, whether that is for food, water supply, or water purification.
"If a river's regime changes, the associated ecosystem service will change" he said. "Natural aquatic ecosystems have processes that can improve water quality—reduced river flow impacts their ability to dilute pollutants. Higher flows might be more associated with changes in sediment loads, which could put water purification under stress. The life cycles of aquatic animals, in particular many fish species, are synchronized with seasonal variations in river flow so that changes in discharge could increase pressure on fisheries, many of which are already stressed."
The team have begun to expand their global analysis to include the impacts of human activities such as dams and water diversions. This will enable an attribution of future risks of ecological change between human interventions and climate change.
Professor Thompson noted that sometimes infrastructure, such as dams, could be repurposed to recreate more natural river regimes and mitigate climate change impacts, for example using artificial flood releases to mimic natural flows and maintain downstream environments, an option explored for African floodplains as part of earlier research by UCL Geography.
He added that "people would need to consider how you do tradeoffs between the environment, agriculture and other human demands for water. As the climate warms, you need more water for irrigation, but you may have less water in the river. How do you prioritize competing demands for a finite resource? It will require a lot of political will, underpinned by sound science."
More information: J. R. Thompson et al, "Increasing Risk of Ecological Change to Major Rivers of the World With Global Warming," Earth's Future (2021). DOI: 10.1029/2021EF002048
Since the first reported pangolin seizure in Nigeria in 2010, the country has seen an explosion in the black market for the world’s most trafficked mammal – becoming Africa’s hub for the criminal export of pangolin products to East Asia.
Use of pangolin scales in traditional Chinese medicines has resulted in Asian species declining dramatically this century.
Now, a team of conservationists led by the University of Cambridge has produced the first data-driven study quantifying Nigeria-linked seizures of pangolin product, in order to gauge the size of this illicit trade.
Just those shipments intercepted and reported by authorities between 2010 and September 2021 amounted to 190,407 kilos of pangolin scales taken from at least 799,343 but potentially up to almost a million dead creatures.
This figure is close to recent estimates for the entire global pangolin trade since 2000 – suggesting levels of trafficking are far greater than previously thought.
Some seizures occurred in ports such as Hong Kong after leaving African shores. Researchers traced cargo from countries such as Cameroon and Gabon that was destined for Asian nations including China and Cambodia – sometimes travelling via France and Holland. All had been funnelled through Nigeria.
Of the 77 seizures analysed in the new study, 26 were uncovered alongside thousands of kilos of ivory – indicating that organised networks of pangolin traffickers are piggybacking on long-established ivory-smuggling connections.
Despite recent improvements and some dedicated officers, overall enforcement in Nigeria is lax and corruption endemic, say researchers. Total prosecutions for pangolin trafficking in Nigeria amount to just four – all in the last year.
As such, seized shipments are likely to represent a small fraction of the pangolin product now moved through Nigeria. The study, published in Biological Conservation, cites experts suggesting that detected wildlife seizures are anywhere from 30% to just 2% of the overall illegal trade.
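A back-of-the-envelope calculation shows what those detection-rate figures imply for the total trade. The detection rates are the ones the study cites; treating the seizure total as a fixed fraction of total trade is a simplifying assumption for illustration.

```python
# Scale up the reported seizure total by assumed detection rates
# to bound the possible size of the overall illegal trade.

seized_kg = 190_407  # scales in Nigeria-linked seizures, 2010 to Sept 2021

# Experts cited in the study suggest detected seizures represent
# anywhere from 30% down to just 2% of the overall illegal trade.
for detection_rate in (0.30, 0.02):
    implied_total_kg = seized_kg / detection_rate
    print(f"detection rate {detection_rate:.0%}: "
          f"implied total ~ {implied_total_kg / 1000:,.0f} tonnes")
```

Under these assumptions the true trade would sit somewhere between roughly 600 tonnes and 9,500 tonnes of scales, which is why the authors describe previous estimates as a gross underestimation.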
“The figures in our research suggest there has been a gross underestimation of the scale of pangolin trafficking in Nigeria and indeed Africa as a whole, which could translate into mismatched anti-trafficking policies,” said lead researcher Charles Emogor from Cambridge’s Department of Zoology.
As well as a false belief in the curative power of their scales, eating pangolin meat is considered a status symbol in parts of Asia. Pangolin bodies are illegally traded at markets across China, and some studies have implicated sale of the animal’s meat in the origins of the COVID-19 pandemic.
All eight pangolin species – four African, four Asian – are listed as threatened, with three now considered critically endangered. Researchers randomly sampled dozens of sacks impounded by customs, and estimate that some 90% of the scales involved in Nigeria-linked trade are from white-bellied pangolins.
Among the more common African species, although still classed as vulnerable by conservation agencies, white-bellied pangolins are traditionally hunted and sold in local markets. Researchers now fear that international trafficking is driving the butchery of African pangolins to dangerous new heights.
“The levels of extraction hinted at by the hundreds of thousands of animals in seized shipments alone suggest that expanding trafficking networks driven by demand from Asia could ultimately jeopardise the survival of some African pangolin species,” said Emogor, who is also a Wildlife Conservation Society fellow.
Nigeria is signed up to various agreements that prohibit the hunting and commercial trade of pangolins, yet it has been involved in more reported trafficking incidents than any other African country.
Emogor and colleagues combed through the records of several domestic and international agencies as well as conducting interviews with Nigerian customs and intelligence officers working to try and curb wildlife trafficking.
The average mass of reported Nigeria-linked seizures increased steadily from 2010 before jumping sharply around 2017, when Nigeria secured its place as the nucleus of Africa’s pangolin trade, according to researchers. While the country initially acted as a conduit, by 2019 almost all shipments originated in Nigeria.
Pangolin cargo was trafficked via land and air, but the majority – some 65% of all scales – was shipped by sea, with maritime smuggling increasing over the years. Some seizures occurred in warehouses where mode of transport and destination were unknown, but all those taken in transit were likely bound for Asia.
The highest quantity of scales destined for any country or territory was Vietnam (over 64 tonnes), followed by China (over 48 tonnes) and Hong Kong (over 21 tonnes).
Two shipments uncovered this year had claws separated out from scales, suggesting traffickers are catering to shifting demands such as those for pangolin-claw amulets in China.
The researchers call for increased law enforcement efforts and mandatory training in the detection of illegal wildlife products for Nigerian customs officials, particularly at seaports, along with proper seizure documentation by Nigeria and surrounding nations.
“We would like to see a greater emphasis on the prosecution of apprehended traffickers as a deterrent,” added Emogor, who points out that traffickers were rarely arrested during confiscations in Nigeria, and of those that were, the vast majority had cases settled out of court.
CAPTION: Warehouse in Nigeria containing confiscated scales. CREDIT: Charles Emogor
CAPTION: Lead researcher Charles Emogor with a descaled pangolin carcass in his right hand, and the body of a young white-bellied African pangolin in his left. CREDIT: Charles Emogor
CAPTION: Black-bellied pangolin scales confiscated by the Nigerian Customs Service. CREDIT: Charles Emogor
CAPTION: A white-bellied African pangolin in Nigeria's Cross River National Park
The scale of Nigeria's involvement in the trans-national illegal pangolin trade: Temporal and spatial patterns and the effectiveness of wildlife trade regulations
Article publication date: 30-Oct-2021
Underground tests dig into how heat affects salt-bed repository behavior
Study to refine computer models, inform policymakers for future spent nuclear fuel disposal
ALBUQUERQUE, N.M. — Scientists from Sandia, Los Alamos and Lawrence Berkeley national laboratories have just begun the third phase of a years-long experiment to understand how salt and very salty water behave near hot nuclear waste containers in a salt-bed repository.
Salt’s unique physical properties can be used to provide safe disposal of radioactive waste, said Kristopher Kuhlman, a Sandia geoscientist and technical lead for the project. Salt beds remain stable for hundreds of millions of years. Salt heals its own cracks and any openings will slowly creep shut.
For example, the salt at the Waste Isolation Pilot Plant outside Carlsbad, New Mexico — where some of the nation’s Cold War-era nuclear waste is interred — closes on the storage rooms at a rate of a few inches a year, protecting the environment from the waste. However, unlike spent nuclear fuel, the waste interred at WIPP does not produce heat.
The Department of Energy Office of Nuclear Energy’s Spent Fuel and Waste Disposition initiative seeks to provide a sound technical basis for multiple viable disposal options in the U.S., and specifically how heat changes the way liquids and gases move through and interact with salt, Kuhlman said. The understanding gained from this fundamental research will be used to refine conceptual and computer models, eventually informing policymakers about the benefits of disposing of spent nuclear fuel in salt beds. Sandia is the lead laboratory on the project.
“Salt is a viable option for nuclear waste storage because far away from the excavation any openings are healed up,” Kuhlman said. “However, there’s this halo of damaged rock near the excavation. In the past people have avoided predicting the complex interactions within the damaged salt because 30 feet away the salt is a perfect, impermeable barrier. Now, we want to deepen our understanding of the early complexities next to the waste. The more we understand, the more long-term confidence we have in salt repositories.”
Trial-and-error in the first experiment
To understand the behavior of damaged salt when heated, Kuhlman and colleagues have been conducting experiments 2,150 feet underground at WIPP in an experimental area more than 3,200 feet away from ongoing disposal activity. They also monitor the distribution and behavior of brine, the salty water found within the salt bed, left over from a sea that evaporated 250 million years ago. The little brine that is found in WIPP is 10 times saltier than seawater.
“Salt behaves much differently when it’s hot. If you heat up a piece of granite, it isn’t that different,” Kuhlman said. “Hot salt creeps much faster, and if it gets hot enough, the water in brine could boil off leaving a crust of salt on the waste container. Then that steam could move away until it gets cool enough to return to liquid and dissolve salt, possibly forming a complex feedback loop.”
In other words, the scientists are looking at whether the heat from spent nuclear fuel could help enclose waste containers, and even protect them from the corrosion that salty water can cause.
Planning for the experiment’s first phase began in 2017, using existing horizontal holes at WIPP. During this “shakedown” phase, researchers learned what equipment to use in subsequent experiments. For example, the first heater, which worked like a toaster, did not get the nearby salt hot enough to boil brine, said Phil Stauffer, a geoscientist with expertise in combining computer models and real-world experiments who is leading Los Alamos National Laboratory’s contributions. However, the second heater the team tried, an infrared model, was effective; it worked more like the sun.
“When we put the first radiative heater into the first borehole, as part of the shakedown phase, it turns out the air didn’t allow the heat to efficiently move into the rock,” Stauffer said. “Then we switched to an infrared heater, and the heat moved through the air with little energy loss. In the early numerical simulations, naively we just put in heat; we didn’t worry about how the heat got from the heater into the rock.”
How brine and gases move through salt
During the experiment’s second phase, the team drilled two sets of 14 horizontal holes into the side of a hall and inserted more than 100 different sensors into the holes around the central horizontal hole containing the heater. These sensors monitored the sounds, strains, humidity and temperatures as the salt was heated and cooled.
Melissa Mills, a Sandia geochemist, made a special salt-concrete seal for testing the interactions between cement and brine.
Among the sensors used were almost 100 temperature sensors, like those found in home thermostats, so researchers could measure temperature through time at locations around the heater. Yuxin Wu, a geoscientist from Lawrence Berkeley National Laboratory, also installed fiber-optic temperature sensors, strain gauges and electrical resistivity imaging.
Charles Choens, a Sandia geoscientist, used special microphones, called acoustic-emissions sensors, to listen to the “pop” of salt crystals as they expand while heated and contract while cooling, Kuhlman said. The team used these microphones to triangulate the location of the popping salt crystals.
“Those pops are evidence of the transient permeability of the salt bed — the cracks between the salt crystals, which brine can percolate through.” Kuhlman said. “When you heat it up, it closes those little cracks. When the salt is hot, the permeability goes down, but when it cools down, the cracks temporarily open up and the permeability increases.”
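The triangulation of those pops can be illustrated with a toy time-difference-of-arrival search. The sensor positions, the crack location and the sound speed below are invented for illustration; the team's actual acoustic processing is more sophisticated, but the principle of matching arrival-time differences is the same.

```python
# Toy acoustic triangulation: locate a "popping" salt crystal from
# arrival times at several sensors by grid-searching for the point
# whose predicted arrival-time differences best match the observed
# ones. Differences cancel the unknown origin time of the pop.
import math

SPEED = 4500.0  # assumed acoustic speed in salt, m/s (illustrative)
sensors = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]  # positions (m)
true_source = (1.2, 0.7)  # crack location we want to recover

# Synthesize arrival times from the assumed crack location.
arrivals = [math.dist(s, true_source) / SPEED for s in sensors]

best, best_err = None, float("inf")
for ix in range(201):
    for iy in range(201):
        p = (ix / 100, iy / 100)  # 1 cm grid over a 2 m x 2 m region
        pred = [math.dist(s, p) / SPEED for s in sensors]
        err = sum(((pred[i] - pred[0]) - (arrivals[i] - arrivals[0])) ** 2
                  for i in range(1, len(sensors)))
        if err < best_err:
            best, best_err = p, err

print(f"estimated crack location: ({best[0]:.2f}, {best[1]:.2f})")
```

With enough sensors, the mismatch has a single minimum at the true location, which is how arrays of acoustic-emission sensors can map where cracking is occurring in the salt.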
To test the flow of gases through the damaged salt, the researchers injected small amounts of rare gases, such as krypton and sulfur hexafluoride, into one borehole and monitored their emergence in another, Kuhlman said. “When the salt was hot, the gases didn’t go anywhere. When we turned the heat off, the gases permeated the salt and came out in another borehole.”
Similarly, the team injected lab-made brine into one borehole with a small amount of the element rhenium and blue fluorescent dye as “tracers.” The team is monitoring for the emergence of the liquid in other boreholes, which will be sampled at the end of the test.
“The goal with the fluorescent dye — once we drill out post-test samples — is to map where the tracer went,” Mills said. “Obviously, we’ll be able to say that it went from one borehole to the other, if we detect a rhenium signal, but we won’t know the path it took. Also, brine will interact with minerals in the salt, like clay. The fluorescent dye is a visible way to identify where the liquid tracer actually went in the field.”
In the third phase, which began in mid-October, the team will be drilling a new array of nine heated boreholes, building on what they learned in the prior phases of the experiments.
Working in challenging conditions underground
The team has learned a lot from the first two phases of the experiment, including the best heater type, when to drill the boreholes and just how corrosive the brine is, Stauffer and Mills said.
“The first two phases involved a lot of equipment testing; some has failed, and some was sent back to the manufacturer,” Mills said. “We’ve also learned to keep back-up equipment on hand because salt dust and brine destroy equipment. We need to double-seal things because the brine can seep down insulated wire and then equipment dies. It’s been a process to learn how to work in the salt environment.”
Kuhlman agreed. “Many things can go wrong when you take sensitive lab equipment and put it in a salt mine. We went back and read the reports from the WIPP experiments in the ’80s. We want to learn from the past, but sometimes we have had to make our own mistakes.”
The researchers are collaborating with international partners to use the data from this project to improve computer models of the complex chemical, temperature, water-based and physical interactions that take place underground. This will improve future modeling of nuclear waste repositories globally.
Ultimately, the team would like to scale up to larger and longer experiments to obtain data relevant to future salt repositories, said Kuhlman and Stauffer. These data, supplementing already collected data, would inform repository designers and policymakers about the safety of permanently disposing of heat-generating nuclear waste in salt repositories.
“It’s been really intriguing and interesting, for me, to work on a project that is so hands-on,” Mills said. “Getting to design and build the systems and going underground into WIPP has been really rewarding. Doing research in an active mine environment can be a challenge, but I’ve been proud to work down there and implement our ideas.”
CAPTION: Melissa Mills, left, a Sandia National Laboratories geochemist, and Kristopher Kuhlman, a Sandia geoscientist, display salt samples from their Waste Isolation Pilot Plant experimental site. They have just begun the third phase of a yearslong basic science experiment to understand how salt and very salty water behave near hot nuclear waste containers in a salt-bed repository. CREDIT: Photo by Randy Montoya/Sandia National Laboratories
Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.
BUILD AIRSHIPS
Aviation’s present-day contribution to human-induced global warming is 4% and will increase over the next 30 years should pre-Covid growth resume
Major new study reveals that aviation could consume up to one-sixth of the remaining temperature budget to limit warming to 1.5 ˚C
Aviation is responsible for more global warming than its carbon footprint alone implies. According to new research published today, aviation could consume up to one-sixth of the remaining temperature budget for limiting warming to 1.5˚C by 2050. The article, published in Environmental Research Letters, suggests that the industry's emissions must fall each year if the sector is not to increase warming further.
Given that aviation is widely recognized as a sector which is challenging to decarbonise, this research aims to inform the discussion about aviation’s ‘fair share’ of future warming.
The researchers behind the study, based at the University of Oxford, Manchester Metropolitan University, and the NERC National Centre for Earth Observation, developed a simple technique for quantifying the temperature contribution of historical aviation emissions, including both CO2 and non-CO2 impacts[1]. The study also projects future warming due to aviation under a range of possible responses to the climate crisis.
Milan Klöwer, lead author of the study, said: “Our results show that aviation’s contribution to warming so far is approximately 4% and is increasing. COVID reduced the amount people fly, but there is little chance for the aviation industry to meet any climate target if it aims for a return to normal.”
The authors show that the only way to ‘freeze’ the temperature increase from the sector is to cut CO2 emissions by about 2.5% per year; however, there is room for optimism, as they also show that a 90% mix of low-carbon sustainable fuels by 2050 would achieve a similar outcome, with no further temperature increase from the sector. But this relies on a sustainable production chain of low-carbon fuels that does not yet exist, as Milan Klöwer points out: “The aviation industry has to come up with a credible plan for a 1.5˚C world.”
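The arithmetic behind a steady percentage decline can be sketched in a few lines. The normalized units, 30-year horizon and comparison against a constant-emissions path below are illustrative choices, not the paper's actual calculation.

```python
# Compare cumulative CO2 over 30 years for a constant-emissions path
# versus a path declining ~2.5% per year (the rate the authors say
# would freeze aviation's temperature contribution).

e0 = 1.0          # current annual CO2 emissions (normalized units)
years = 30
decline = 0.025   # 2.5% reduction per year

constant_total = e0 * years
declining_total = sum(e0 * (1 - decline) ** t for t in range(years))

print(f"constant path:  {constant_total:.1f} units")
print(f"declining path: {declining_total:.1f} units "
      f"({declining_total / constant_total:.0%} of constant)")
```

Because each year's emissions shrink geometrically, three decades of 2.5% annual cuts remove roughly 30% of the cumulative CO2 relative to business as usual, which is why even a modest-sounding annual decline changes the sector's warming trajectory.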
“Any growth in aviation emissions has a disproportionate impact, causing lots of warming”, says Professor Myles Allen, co-author of the study. “But any decline also has a disproportionate impact in the other direction. So the good news is that we don’t actually need to all stop flying immediately to stop aviation from causing further global warming – but we do clearly need a fundamental change in direction now, and radical innovation in the future.”
Co-Author Professor David Lee, Manchester Metropolitan University, adds, “These are important results that show stylized pathways of how we can get to where we need to be with aviation emissions, robustly showing the different roles of CO2 and non-CO2 impacts. One of the important nuances is that the non-CO2 impacts, like the formation of contrails and cloudiness, have been thought to dominate the total impact: this is true at present, but it’s not widely understood in the stakeholder community that if you take care of CO2, the non-CO2 fraction decreases in importance, even more so with sustainable alternative fuels that generate fewer contrails. This emphasizes the importance of tackling aviation’s CO2 emissions.”
The aviation industry has only recently begun to tackle the warming effect of flying, and this study is timely for quantifying that impact. The solutions discussed in this study, such as moving to alternative fuels, present a clear pathway to minimising warming but these will take time to implement. In the short-term, there are actions that the industry can take right now. Dr Simon Proud, of the National Centre for Earth Observation and RAL Space, suggests, “A ban on fuel tankering - where aircraft carry more fuel than they need, and hence burn extra fuel, to save the cost of refuelling at the destination - would reduce CO2 emissions in Europe alone by almost one million tonnes.” Other solutions, such as more efficient air traffic control and minimising holding patterns at airports would also reduce emissions and help keep future warming minimal.
[1] Aviation contributes to global warming through its CO2 emissions and a range of non-CO2 effects (e.g. emissions of nitrogen oxides, water vapour and particles) that alter the chemical balance of the atmosphere and affect cloudiness, which can increase the net warming signal from aviation, as previously quantified by the authors: https://doi.org/10.1016/j.atmosenv.2020.117834
GLASGOW, Scotland – Senior governmental officials from Scotland and California met today during COP26 to discuss how optical greenhouse-gas-monitoring instruments can play a key role in supporting reduced emissions in cities. California Lieutenant Governor Eleni Kounalakis and Scottish Government Cabinet Secretary for the Constitution, External Affairs and Culture Angus Robertson convened at the University of Strathclyde in Glasgow to learn how real-time monitoring of greenhouse gas (GHG) and pollution emission data can provide local leaders with essential information to support strategic policy decisions.
Over the past two years, scientists and policymakers from Scotland and California have collaborated on an Urban Air Project through the Global Environmental Measurement & Monitoring (GEMM) Initiative, a joint international project of scientific societies Optica (formerly OSA) and the American Geophysical Union (AGU). The GEMM Urban Air Project centers on deploying a dense network of 25 sensors across Glasgow to monitor levels of GHGs, air pollutants and particulate matter in real-time. A similar network has been installed in the San Francisco Bay Area, using the same technology, which was created by University of California, Berkeley (UC Berkeley) Professor Ron Cohen. This international effort is a result of a collaboration between the University of Strathclyde, Glasgow City Council, Stanford University, UC Berkeley, Optica, the AGU, the Met Office and the National Physical Laboratory (NPL).
“Cities can make a substantial impact in GHG emissions reductions since they currently contribute more than 70% of the total. Until now, local governments have had limited access to timely data to help guide their GHG reduction policies,” said Tom Baer, GEMM co-lead and Director of Stanford Photonics Research Center at Stanford University, USA. “With these low-cost, real-time compact optical sensors deployed around the city, for the first time, leaders will have actionable science in hand to support strategic policy decisions.”
The sensor network approach uses inverse modeling based on local atmospheric circulation weather models to pinpoint the location of GHG emissions and air pollution sources in the city, throughout the day and across seasons. This localized modeling can provide first-of-its-kind, day-to-day, neighborhood-scale forecasts for cities and regions, which can be used to model future scenarios and as a way to assess current policies that aim to reduce GHG and air pollution emissions.
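In spirit, the inversion works like the toy linear problem below. The footprint matrix, sensor readings and two-source setup are entirely hypothetical; operational systems solve far larger, regularized versions of this problem using full atmospheric transport models.

```python
# Toy inverse-modeling sketch: given a "footprint" matrix describing
# how strongly each source region influences each sensor (derived
# from a transport model), invert the sensor readings to estimate
# source emission rates.

# footprint[i][j]: sensitivity of sensor i to source region j
footprint = [[0.8, 0.2],
             [0.3, 0.7]]
readings = [10.0, 9.0]  # enhancement observed at each sensor (ppm)

# Solve the 2x2 linear system footprint @ sources = readings by hand.
a, b = footprint[0]
c, d = footprint[1]
det = a * d - b * c
sources = [(d * readings[0] - b * readings[1]) / det,
           (a * readings[1] - c * readings[0]) / det]
print(f"estimated source strengths: {sources[0]:.1f}, {sources[1]:.1f}")
```

Repeating this inversion as winds shift through the day and across seasons is what lets a dense sensor network attribute observed concentrations to specific neighborhoods and sources.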
“Having concrete information is the first step in addressing climate action on a local level,” said Baer. “Today’s important meeting between Lieutenant Governor Kounalakis and Cabinet Secretary for the Constitution, External Affairs and Culture Robertson demonstrates how the GEMM Initiative enables leaders around the globe to take concrete action to address environmental issues in their individual communities.”
Optica (formerly OSA) is dedicated to promoting the generation, application, archiving and dissemination of knowledge in optics and photonics worldwide. Founded in 1916, it is the leading organization for scientists, engineers, business professionals, students and others interested in the science and application of light. Optica’s renowned publications, meetings, online resources and in-person activities fuel discoveries, shape real-life applications and accelerate scientific, technical and educational achievement.
Nobody is currently taking continuous, routine measurements of the particles suspended in America’s air, called aerosols. That is set to change as a new, nationwide monitoring network launches with a site in Riverside, California.
When American scientists want information about these aerosols, they have to collect samples and ship them to a laboratory for analysis. The samples are typically collected only every three to five days, too infrequently to capture air quality events that unfold on shorter timescales.
“You want a real-time look at what’s happening, not a piecemeal puzzle picture,” said Roya Bahreini, UCR professor of atmospheric science and co-leader of the monitoring project.
Airborne particles can affect the climate, Earth’s ecosystems, and human health. Without an understanding of what they are, how often they appear, where they come from, and in what quantities, efforts to mitigate them are less effective.
For these reasons, the National Science Foundation has granted $12 million for the next three years to the Atmospheric Science and mEasurement NeTwork, or ASCENT project, whose principal investigator is Nga-Lee “Sally” Ng, chemical engineering professor at the Georgia Institute of Technology. The network establishes state-of-the-art aerosol monitoring at 12 sites in the U.S., spread among urban and remote environments. Three of the sites are in Southern California.
Locally, Bahreini is overseeing the installation of new monitoring equipment at the South Coast Air Quality Management District’s Rubidoux monitoring site in Riverside, a good spot for gathering data about particulate matter that floats inland from the Los Angeles metro area.
Data collected at Rubidoux, along with data from Pico Rivera and Joshua Tree National Park, will allow scientists to investigate how aerosol properties change as the particles are transported inland. “We know air flows bring pollution inland,” Bahreini said. “That’s why we believe this spot will be interesting for epidemiologists who want to see how this aged air pollution is impacting the health of local people.”
With the increase in Southern California wildfires, phone apps that offer air quality information have seen a surge in popularity. However, Bahreini explains that those services offer an idea of the total concentration of aerosols, rather than specifically what they are made of, their size, or their age.
Some instruments being installed at Rubidoux will offer data about the airborne amounts of sulfate, ammonium, nitrates, chloride, trace metals, and soot, or black carbon. Others will measure the size distribution of various aerosols.
Differently sized aerosols can have different impacts on our health. In addition, size can indicate something about the way the particles are formed.
“Larger-sized particles have been in the atmosphere for a while and accumulated components from other aerosols or condensable gases,” Bahreini said. “If we’re comparing aerosols in Pico Rivera to those in Riverside, we want to know their size. If they’ve grown, what has led to this growth?”
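The kind of site-to-site size comparison Bahreini describes can be illustrated with a number-weighted geometric mean diameter computed from binned size distributions. The bin diameters and counts below are invented for this sketch, not ASCENT data.

```python
import numpy as np

# Hypothetical sketch: compare particle size between an upwind and a
# downwind site using binned number size distributions. All values
# are invented for illustration.

diameters_nm = np.array([20, 40, 80, 160, 320, 640])  # bin-center diameters

# particle counts per bin (arbitrary units) at each site
upwind   = np.array([900, 1200,  800, 300, 100, 20])
downwind = np.array([200,  600, 1100, 900, 400, 80])

def geometric_mean_diameter(d, n):
    """Number-weighted geometric mean diameter of a binned distribution."""
    return np.exp(np.sum(n * np.log(d)) / np.sum(n))

gmd_up = geometric_mean_diameter(diameters_nm, upwind)
gmd_down = geometric_mean_diameter(diameters_nm, downwind)
growth_factor = gmd_down / gmd_up
print(f"upwind GMD: {gmd_up:.0f} nm, downwind GMD: {gmd_down:.0f} nm, "
      f"growth: {growth_factor:.2f}x")
```

A growth factor above one, as in this toy example, would suggest the downwind particles have accumulated material in transit, the kind of signal that prompts the follow-up question of what drove the growth.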
To make the data as widely available as possible, Bahreini will help train officials from the South Coast Air Quality Management District in the use of the new instruments, and a website with the real-time data from all the sites will be publicly accessible.
Ultimately, Bahreini hopes that the ASCENT partnership and establishment of a national aerosol monitoring infrastructure will open pathways for future research by atmospheric chemistry and climate scientists, air quality modelers, and epidemiologists.
“We are much more likely to be able to control what we can understand,” she said. “Data from this network will help us truly understand the influence of infrequent events on our air quality. Long-term trends in the data are also critical for formulating new policies to better protect human health and the climate.”