It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Friday, April 15, 2022
Amplification of light within aerosol particles accelerates photochemical reactions
AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS)
Amplification of light in atmospheric aerosol particles can affect their photochemistry, substantially accelerating sunlight-driven in-particle photochemical reactions, which are major contributors to the degradation and oxidation of matter in these particles. Although the effect had been theorized, the new study provides direct evidence that optical confinement (OC) can create spatial structuring of the light intensity inside the particles and result in variations of photochemical rates. Given the newfound importance of the phenomenon for aerosol particle photochemistry, incorporating the effects of OC into atmospheric aerosol models could improve their ability to predict global chemistry and climate. Although tiny, atmospheric aerosol particles – suspensions of fine solid particles or liquid droplets in air – have a large influence on climate and air quality, and are key components of global atmospheric models. Photochemical reactions driven by aerosol exposure to sunlight are central to atmospheric chemistry, including the creation and control of pollutants and climate-influencing gases. Here, Pablo Corral Arroyo and colleagues report the effect of OC on the photochemical reactions within aerosol particles. While previous research has predicted the effects of OC on aerosol particle photochemistry, direct observation of the phenomenon has remained elusive. Using a combination of x-ray spectromicroscopic imaging and modeling of single iron(III)-citrate particles (a photochemically degrading aerosol), Corral Arroyo et al. show that aerosol particles can act as resonators for incoming solar radiation, resulting in an overall amplification and spatial structuring of light within the particles, thereby accelerating in-particle photochemistry. Using the findings, the authors predict that this could speed up photochemical reactions by a factor of two to three for most classes of atmospheric aerosol particles.
Liquid droplets and very fine particles can trap light – similar to how light can be caught between two mirrors. As a result, the intensity of the light inside them is amplified. This also happens in very fine water droplets and solid particles in our atmosphere, i.e. aerosols. Using modern X-ray microscopy, chemists at ETH Zurich and the Paul Scherrer Institute (PSI) have now investigated how light amplification affects photochemical processes that take place in the aerosols. They were able to demonstrate that light amplification causes these chemical processes to be two to three times faster on average than they would be without this effect.
Using the Swiss Light Source at the PSI, the researchers studied aerosols consisting of tiny particles of iron(III) citrate. Exposure to light reduces this compound to iron(II) citrate. X-ray microscopy makes it possible to distinguish areas within the aerosol particles composed of iron(III) citrate from those made up of iron(II) citrate down to a precision of 25 nanometres. In this way, the scientists were able to observe and map in high resolution the temporal sequence of this photochemical reaction in individual aerosol particles.
Decay upon exposure to light
“For us, iron(III) citrate was a representative compound that was easy to study with our method,” says Pablo Corral Arroyo, a postdoc in the group headed by ETH Professor Ruth Signorell and a lead author of the study. Iron(III) citrate stands in for a whole range of other chemical compounds that can occur in the aerosols of the atmosphere. Many organic and inorganic compounds are light-sensitive, and when exposed to light, they can break down into smaller molecules, which can be gaseous and therefore escape. “The aerosol particles lose mass in this way, changing their properties,” Signorell explains. Among other things, they scatter sunlight differently, which affects weather and climate phenomena. In addition, their characteristics as condensation nuclei in cloud formation change.
As such, the results also have an effect on climate research. “Current computer models of global atmospheric chemistry don’t yet take this light amplification effect into account,” ETH Professor Signorell says. The researchers suggest incorporating the effect into these models in future.
Non-uniform reaction times in the particles
Now precisely mapped and quantified, the light amplification in the particles comes about through resonance effects. The light intensity is greatest on the side of the particle opposite the one the light is shining on. “In this hotspot, photochemical reactions are up to ten times faster than they would be without the resonance effect,” says Corral Arroyo. Averaged over the entire particle, this gives an acceleration by the above-mentioned factor of two to three. Photochemical reactions in the atmosphere usually last several hours or even days.
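The step from a roughly tenfold hotspot to a two-to-threefold particle average can be illustrated with a back-of-the-envelope Monte Carlo sketch. All numbers below (hotspot position, width, and peak enhancement) are illustrative assumptions, not the paper's actual near-field model; the point is only that a strong hotspot occupying a modest fraction of the particle volume averages out to a factor of a few.

```python
import numpy as np

# Toy model: a spherical particle whose photochemical rate is enhanced
# up to ~10x in a "hotspot" near the side opposite the illumination,
# and close to 1x elsewhere. Hotspot geometry is an assumption.
rng = np.random.default_rng(0)
n = 200_000

# Sample points uniformly inside a unit sphere (the particle).
pts = rng.normal(size=(n, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
pts *= rng.random(n)[:, None] ** (1 / 3)  # uniform radial density

# Assumed enhancement profile: Gaussian hotspot centred near z = +0.8
# (the shadow side), peaking at 10x over a 1x background.
hotspot = np.array([0.0, 0.0, 0.8])
d2 = np.sum((pts - hotspot) ** 2, axis=1)
enhancement = 1.0 + 9.0 * np.exp(-d2 / (2 * 0.5 ** 2))

mean = enhancement.mean()
print(f"volume-averaged enhancement ≈ {mean:.1f}x")
```

With these assumed numbers the volume average comes out to a factor of a few, consistent with a localized ~10x peak producing a 2–3x particle-wide acceleration.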
Using the data from their experiment, the researchers were able to create a computer model to estimate the effect on a range of other photochemical reactions of typical aerosols in the atmosphere. It turned out that the effect pertains not just to iron(III) citrate particles, but to all aerosols – particles or droplets – made of compounds that can react with light. These reactions, too, are two to three times faster on average.
AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS)
In a Policy Forum, Anna Lewis and colleagues argue that, for researchers and others who want to invoke genetic ancestry, there is a scientific and ethical imperative to move away from continental ancestry categories and to instead embrace a view of genetic ancestry that reflects continuous variation and historical depth. Such change is a “prerequisite to any research that looks for connections between genetics and health disparities,” the authors say. A continued reliance on continental ancestry categories risks exacerbating medical stereotypes about individuals and groups, contributing to health disparities rather than addressing them, and perpetuating (mis)understandings of race as biological, they add. Many research institutions are reconsidering their use of race as a biological variable and, instead, turning to concepts from genetics to capture differences between groups of humans. Genetic ancestry – most often described in terms of continents as meaningful groupings – is one of the main proposed alternatives. However, the increasing dominance of continental ancestry categories, such as “African ancestry” or “European ancestry,” has raised ethical problems. Here, Lewis et al. highlight the ethical concerns surrounding the continued use of continental ancestry to group individuals. They argue instead for the widespread adoption of a more complex approach to genetic ancestry – a multidimensional, continuous, and category-free concept that reflects our historical depth and the total spectrum of human variation. Lewis et al. provide a roadmap to achieving this goal, a journey that will require systems-level change, including new tools, methodologies, and data structures, and a fuller understanding of how and why different fields use and apply the concept of ancestry.
“Science is reductive, and a model that uses simple continental categories has been useful in starting the process of understanding human genetic diversity,” write Lewis et al. “But all models have their legitimate domains of application and limits, and a much more complex set of models should now be the norm across a wide variety of use cases.”
AMERICAN ASSOCIATION FOR THE ADVANCEMENT OF SCIENCE (AAAS)
Declining nitrogen availability in many terrestrial ecosystems has widespread consequences for biodiversity and ecosystem functioning worldwide. In a Review, Rachel Mason and colleagues discuss the extent and consequences of nitrogen (N) decline, the human factors potentially driving it, and what might be done to help mitigate its damaging effects. “Akin to trends in atmospheric [carbon dioxide] and global temperatures, large-scale declines in N availability are likely to present long-term challenges that will require informed management and policy actions in the coming decade,” write Mason et al. Nitrogen is essential to life on Earth – it’s a key component of the proteins required to support the growth of plants and the animals that feed on them. Therefore, N availability in the environment has a strong influence on the structure and function of many ecosystems. While humans have more than doubled the global supply of reactive N through industrial and agricultural activities over the last century, much of this input has occurred in urban and agrarian areas. For areas not subject to anthropogenic N enrichment, however, a growing body of research suggests that N availability is declining in many terrestrial ecosystems worldwide. According to the authors, multiple environmental changes may be driving these declines, including elevated atmospheric carbon dioxide, rising global temperatures, and altered precipitation and ecosystem disturbance regimes. Reduced N availability can lead to changes in primary productivity in these ecosystems, impacting the diets of herbivores such as insects, whose reduced survival can have farther-reaching impacts at higher trophic levels. Mason et al. highlight several ways that N decline can be monitored and mitigated but note that continued research is needed to inform management actions.
“Given the potential implications of declining N availability for food webs, carbon sequestration, and other ecosystem functions and services, it is important that research, management, and policy actions be taken before the consequences of declining N availability become more severe,” write Mason et al.
UNIVERSITY OF MARYLAND CENTER FOR ENVIRONMENTAL SCIENCE
ANNAPOLIS, MD (April 15, 2022)—Since the mid-20th century, research and discussion have focused on the negative effects of excess nitrogen on terrestrial and aquatic ecosystems. However, new evidence indicates that the world is now experiencing a dual trajectory in nitrogen availability, with many areas experiencing a hockey-stick-shaped decline in the availability of nitrogen. In a new review paper in the journal Science, researchers describe the causes of these declines and their consequences for how ecosystems function.
“There is both too much nitrogen and too little nitrogen on Earth at the same time,” said Rachel Mason, lead author on the paper and former postdoctoral scholar at the National Socio-environmental Synthesis Center.
Over the last century, humans have more than doubled the total global supply of reactive nitrogen through industrial and agricultural activities. This nitrogen becomes concentrated in streams, inland lakes, and coastal bodies of water, sometimes resulting in eutrophication, low-oxygen dead zones, and harmful algal blooms. These negative impacts of excess nitrogen have led scientists to study nitrogen as a pollutant. However, rising carbon dioxide and other global changes have increased demand for nitrogen by plants and microbes. In many areas of the world that are not subject to excessive inputs of nitrogen from people, long-term records demonstrate that nitrogen availability is declining, with important consequences for plant and animal growth.
Nitrogen is an essential element in proteins and as such its availability is critical to the growth of plants and the animals that eat them. Gardens, forests, and fisheries are almost all more productive when they are fertilized with moderate amounts of nitrogen. If plant nitrogen becomes less available, plants grow more slowly and their leaves are less nutritious to insects, potentially reducing growth and reproduction, not only of insects, but also the birds and bats that feed on them.
“When nitrogen is less available, every living thing holds on to the element for longer, slowing the flow of nitrogen from one organism to another through the food chain. This is why we can say that the nitrogen cycle is slowing down,” said Andrew Elmore, senior author on the paper and a professor of landscape ecology at the University of Maryland Center for Environmental Science and at the National Socio-environmental Synthesis Center.
Researchers reviewed long-term, global and regional studies and found evidence of declining nitrogen availability. For example, grasslands in central North America have been experiencing declining nitrogen availability for a hundred years, and cattle grazing these areas have had less protein in their diets over time. Meanwhile, many forests in North America and Europe have been experiencing nutritional declines for several decades or longer.
These declines are likely caused by multiple environmental changes, one being elevated atmospheric carbon dioxide levels. Atmospheric carbon dioxide has reached its highest level in millions of years, and terrestrial plants are exposed to about 50% more of this essential resource than just 150 years ago. Elevated atmospheric carbon dioxide fertilizes plants, allowing faster growth, but diluting plant nitrogen in the process, leading to a cascade of effects that lower the availability of nitrogen. On top of increasing atmospheric carbon dioxide, warming and disturbances, including wildfire, can also reduce nitrogen availability over time.
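The "about 50% more" figure follows directly from the widely cited CO2 benchmarks of roughly 280 parts per million before industrialization and roughly 420 ppm in the early 2020s (both endpoints are approximate):

```python
# Sanity check on the "about 50% more CO2" figure, using the widely
# cited benchmarks of ~280 ppm (pre-industrial, 150+ years ago) and
# ~420 ppm (early 2020s). Both values are approximate.
preindustrial_ppm = 280
modern_ppm = 420
increase = (modern_ppm - preindustrial_ppm) / preindustrial_ppm
print(f"relative increase: {increase:.0%}")  # prints "relative increase: 50%"
```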
Declining nitrogen availability is also likely constraining the ability of plants to remove carbon dioxide from the atmosphere. Currently global plant biomass stores nearly as much carbon as is contained in the atmosphere, and biomass carbon storage increases each year as carbon dioxide levels increase. However, declining nitrogen availability jeopardizes the annual increase in plant carbon storage by imposing limitations on plant growth. Therefore, climate change models that currently attempt to estimate carbon stored in biomass, including trends over time, need to account for nitrogen availability.
“The strong indications of declining nitrogen availability in many places and contexts is another important reason to rapidly reduce our reliance on fossil fuels,” said Elmore. “Additional management responses that could increase nitrogen availability over large regions are likely to be controversial, but are clearly an important area to be studied.”
In the meantime, the review paper recommends that data be assembled into an annual state-of-the-nitrogen-cycle report, or a global map of changing nitrogen availability, which would represent a comprehensive resource for scientists, managers, and policy-makers.
“Evidence, Causes, and Consequences of Declining Nitrogen Availability in Terrestrial Ecosystems” was published in Science.
UNIVERSITY OF MARYLAND CENTER FOR ENVIRONMENTAL SCIENCE
The University of Maryland Center for Environmental Science (UMCES) is a leading research and educational institution working to understand and manage the world’s resources. From a network of laboratories spanning from the Allegheny Mountains to the Atlantic Ocean, UMCES scientists provide sound advice to help state and national leaders manage the environment and prepare future scientists to meet the global challenges of the 21st century.
IMAGE: CHANGES IN THE NITROGEN CYCLE CAN BE DETECTED BY MONITORING ECOSYSTEM NITROGEN INPUTS, INTERNAL SOIL NITROGEN CYCLING, PLANT NITROGEN STATUS AND NITROGEN LOSSES.
CREDIT: RACHEL MASON
NEW YORK, April 15, 2022 – Since the mid-20th century, research and discussion have focused on the negative effects of excess nitrogen on terrestrial and aquatic ecosystems. However, new evidence indicates that the world is now experiencing a dual trajectory in nitrogen availability. Following years of attention to surplus nitrogen in the environment, our evolving understanding has led to new concerns about nitrogen insufficiency in areas of the world that do not receive significant inputs of nitrogen from human activities. In a new review paper, "Evidence, Causes, and Consequences of Declining Nitrogen Availability in Terrestrial Ecosystems,” in the journal Science, a multi-institutional team of researchers describes the causes of declining nitrogen availability and how it affects ecosystem function.
“There is both too much nitrogen and too little nitrogen on Earth at the same time,” said Rachel Mason, lead author on the paper and former postdoctoral scholar at the National Socio-Environmental Synthesis Center.
Over the last century, humans have more than doubled the global supply of reactive nitrogen through industrial and agricultural activities. This nitrogen becomes concentrated in streams, inland lakes, and coastal bodies of water, sometimes resulting in eutrophication, low-oxygen dead zones, and harmful algal blooms. These negative impacts of excess nitrogen have led scientists to study nitrogen as a pollutant. However, rising carbon dioxide and other global changes have increased demand for nitrogen by plants and microbes, and the research team’s newly published paper demonstrates that nitrogen availability is declining in many regions of the world, with important consequences for plant growth.
“These results show how the world is changing in complex and surprising ways,” said Peter Groffman, a co-author on the paper and a professor with the Advanced Science Research Center at the CUNY Graduate Center’s Environmental Science Initiative. “Our findings show the importance of having long-term data as well as focused synthesis efforts to understand these changes and the implications they have for ecosystem and human health and well-being.”
Researchers reviewed long-term global and regional studies and found evidence of declining nitrogen availability caused by multiple environmental changes, one being elevated atmospheric carbon dioxide levels. Atmospheric carbon dioxide has reached its highest level in millions of years, and terrestrial plants are exposed to about 50% more of this essential resource than just 150 years ago. Elevated atmospheric carbon dioxide fertilizes plants, allowing faster growth but diluting plant nitrogen in the process. These processes have been observed in experiments that artificially elevate carbon dioxide in the air around plants, and there is now evidence that plants in natural settings are responding in the same way.
Nitrogen is an essential element for plants and the animals that eat them. Gardens, forests, and fisheries are all more productive when they are fertilized with nitrogen. If plant nitrogen becomes less available, trees grow more slowly and their leaves are less nutritious to insects, potentially reducing growth and reproduction, not only of insects, but also the birds and bats that feed on them.
“When nitrogen is less available, every living thing holds on to the element for longer, slowing the flow of nitrogen from one organism to another through the food chain. This is why we can say that the nitrogen cycle is seizing up,” said Andrew Elmore, senior author on the paper, and a professor of landscape ecology at the University of Maryland Center for Environmental Science and at the National Socio-Environmental Synthesis Center.
On top of increasing atmospheric carbon dioxide, rising global temperatures also affect plant and microbial processes associated with nitrogen supply and demand. Warming often improves conditions for growth, which can result in longer growing seasons, leading plant nitrogen demand to exceed the supply available in soils. Disturbances, including wildfires, can also remove nitrogen from systems and reduce availability over time.
Nitrogen is an essential element for plant growth and its declining availability has the potential to constrain the ability of plants to remove carbon dioxide from the atmosphere. Currently, global plant biomass stores nearly as much carbon as is contained in the atmosphere, and biomass carbon storage increases each year. To the extent plant storage of carbon reduces atmospheric carbon dioxide, it contributes to reductions in the global warming potential of the atmosphere. However, declining nitrogen availability jeopardizes the annual increase in plant carbon storage by imposing limitations on plant growth. Therefore, climate change models that attempt to estimate carbon stored in biomass, including trends over time, need to account for nitrogen availability.
“Despite strong indications of declining nitrogen availability in many places and contexts, spatial and temporal patterns are not yet well enough understood to efficiently direct global management efforts,” said Elmore. In the future, these data could be assembled into an annual state of the nitrogen cycle report or a global map of changing nitrogen availability that would represent a comprehensive resource for scientists, managers, and policy-makers.
About the Advanced Science Research Center
The Advanced Science Research Center at the CUNY Graduate Center (CUNY ASRC) is a world-leading center of scientific excellence that elevates STEM inquiry and education at CUNY and beyond. The CUNY ASRC’s research initiatives span five distinctive, but broadly interconnected disciplines: nanoscience, photonics, neuroscience, structural biology, and environmental sciences. The center promotes a collaborative, interdisciplinary research culture where renowned and emerging scientists advance their discoveries using state-of-the-art equipment and cutting-edge core facilities.
About The Graduate Center of The City University of New York
The CUNY Graduate Center is a leader in public graduate education devoted to enhancing the public good through pioneering research, serious learning, and reasoned debate. The Graduate Center offers ambitious students nearly 50 doctoral and master’s programs of the highest caliber, taught by top faculty from throughout CUNY — the nation’s largest urban public university. Through its nearly 40 centers, institutes, initiatives, and the Advanced Science Research Center, the Graduate Center influences public policy and discourse and shapes innovation. The Graduate Center’s extensive public programs make it a home for culture and conversation.
As biotechnology advances, the risk of accidental or deliberate misuse of biological research like viral engineering is increasing. At the same time, “open science” practices like the public sharing of research data and protocols are becoming widespread. An article published April 14th in the open access journal PLOS Biology by James Smith and Jonas Sandbrink at the University of Oxford, UK, examines how open science practices and the risks of misuse interact, and proposes solutions to the problems identified.
The authors grapple with a critically important issue that emerged with the advent of nuclear physics: how the scientific community should react when two values – security and transparency – are in conflict. They argue that in the context of viral engineering, open code, data, and materials may increase the risk of the release of enhanced pathogens. Openly available machine learning models could reduce the amount of time needed in the laboratory and make pathogen engineering easier.
To mitigate such catastrophic misuse, mechanisms that ensure responsible access to relevant dangerous research materials need to be explored. In particular, to prevent the misuse of computational tools, controlling access to software and data may be necessary.
Preprints, which have become widely used during the pandemic, make preventing the spread of risky information at the publication stage difficult. In response, the authors argue that oversight needs to take place earlier in the research lifecycle. Lastly, Smith and Sandbrink highlight that research preregistration, a practice promoted by the open science community to increase research quality, may harbor an opportunity to review and mitigate research risks.
“In the face of increasingly accessible methods for the creation of possible pandemic pathogens, the scientific community needs to take steps to mitigate catastrophic misuse,” say Smith and Sandbrink. “Risk mitigation measures need to be fused into practices developed to ensure open, high-quality, and reproducible scientific research. To make progress on this important issue, open science and biosecurity experts need to work together to develop mechanisms to ensure responsible research with maximal societal benefit.”
The authors propose several of those mechanisms, and hope that the research will spur innovation in this critically important yet critically neglected area. They show that science cannot be just open or closed: there are intermediate states that need to be explored, and difficult trade-offs touching on core scientific values may be needed. “In contrast to the strong narrative towards open science that has emerged in recent years, maximizing societal benefit of scientific work may sometimes mean preventing, rather than encouraging, its spread,” they conclude.
Citation: Smith JA, Sandbrink JB (2022) Biosecurity in an age of open science. PLoS Biol 20(4): e3001600. https://doi.org/10.1371/journal.pbio.3001600
Author Countries: United Kingdom
Funding: JAS received support from the Effective Altruism Funds programme via the Long Term Future Fund (https://funds.effectivealtruism.org/funds/far-future). JAS’s postdoctoral position is funded by the Oxford National Institute for Health Research Biomedical Research Centre (https://oxfordbrc.nihr.ac.uk/). JBS's doctoral research is funded by Open Philanthropy (https://www.openphilanthropy.org/). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: I have read the journal’s policy and the authors of this manuscript have the following competing interests: in 2017 James Smith was a consultant for Biolacuna Ltd. He is currently a consultant to Alvea LLC.
For climate change mitigation, bipartisan politics can work
CU Boulder study: When Democrats and Republicans unite, more climate bills pass
In an increasingly polarized nation, cooperation across party lines is key to sustained climate mitigation in the United States, according to a new CIRES study. To sustain climate progress over decades, bipartisan cooperation on solutions like renewable energy or emissions reduction will be necessary, the authors say.
“In the long run, climate change mitigation will only be successful with public and political unity behind it,” said Renae Marshall, a former CIRES researcher, now a Ph.D. student at the University of California Santa Barbara, and lead author of the study, published in the current issue of Climatic Change. “Our study found that bipartisanship can help create working climate mitigation strategies in state-level contexts.”
Marshall and her coauthor Matt Burgess, CIRES Fellow and CU Boulder Assistant Professor in Environmental Studies and Economics, analyzed 418 state-government enacted bills and 450 failed bills aimed at reducing emissions from 2015 to 2020, to identify the political contexts in which they were passed or defeated.
The duo found that while two-thirds of the climate-related bills passed in Democrat-controlled legislatures, one-third passed in Republican-controlled legislatures.
Additionally, about a third of all analyzed bills had cosponsors from both major parties, suggesting there are still opportunities for bipartisanship.
Marshall and Burgess found that bipartisan or Republican-led bills favored financial incentives for renewable energy, as well as legislation that expands consumer choices—whereas Democrat-led bills favored those that restricted choice, such as mandatory renewable energy and emissions standards. “Key bipartisan opportunities at the state level are policies that not only provide financial incentives (such as renewable energy system tax credits), but also have an element of expanding opportunities for businesses and consumers to take part in the renewable transition (creating new consumer protections and financing options or allowing new sources of energy to participate in the marketplace),” said Marshall.
For example, Georgia found success when the 2015 Solar Power Free-Market Finance Act lifted restrictions that had previously kept the solar market from growing, by allowing individuals and businesses to participate in lease financing agreements.
More bipartisan bills were proposed in ‘divided’ states (like Kentucky or New Hampshire) compared to Democrat- or Republican-dominated states, the team said, suggesting that equal representation on both sides of the political spectrum creates a better environment for cooperation on climate bills.
“The more polarized we get, the more of a barrier there is. Working together across party lines is the solution,” said Marshall. “Bipartisanship opens up new opportunities to find common ground and dive into sustained climate initiatives.”
The study grew out of Marshall’s undergraduate Honors thesis in Environmental Studies at CU Boulder, which earned her recognition as the College of Arts and Sciences Outstanding Graduate in 2021.
“Bipartisanship has its challenges—but it’s worth it,” Burgess said. “Not only will more successful bipartisan bills increase the number of climate mitigation strategies put in place, but bipartisanship in climate decisions might actually help shrink the political polarization in our country as a whole. Previous research has found that working towards shared goals reduces inter-group conflict.”
Historical data shows minorities have long faced obstacles to getting the critical health care services they need. When COVID-19 arrived two years ago, telemedicine emerged with the promise of better access to care through virtual delivery of clinical services and consultations.
But according to a new study led by the University of Houston College of Medicine and published in the Journal of General Internal Medicine, the rapid implementation of telemedicine didn’t bridge the gap as much as people had hoped.
“We found that racial and ethnic disparities persisted,” said lead study author Omolola Adepoju, a clinical associate professor at the UH College of Medicine and director of research at the Humana Integrated Health Sciences Institute at UH. “This suggests that the promise of the positive impact of telemedicine on health care use and health outcomes could elude underserved populations.”
Adepoju partnered with Lone Star Circle of Care, a federally qualified health center (FQHC) that caters to indigent, uninsured and underinsured, mostly minority populations, to examine what was driving those disparities. The research team examined electronic medical records from 55 individual clinics in 6 different counties in Texas.
“Our main finding was African Americans were 35% less likely to use telemedicine compared to whites,” Adepoju said. “And Hispanics were 51% less likely to use it.”
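Figures like “35% less likely” typically come from odds ratios estimated in a regression model. As a purely illustrative sketch, with made-up counts rather than the study’s data, here is how a 2x2 usage table maps onto an odds ratio of about 0.65:

```python
# Illustrative only: how a "35% less likely" figure maps onto an
# odds ratio. The counts below are hypothetical, not from the study.
def odds(used, not_used):
    return used / not_used

# Hypothetical 2x2 table of telemedicine use by group:
group_a_used, group_a_not = 130, 870   # comparison group
group_b_used, group_b_not = 89, 911    # group with lower uptake

odds_ratio = odds(group_b_used, group_b_not) / odds(group_a_used, group_a_not)
print(f"odds ratio ≈ {odds_ratio:.2f}")  # ~0.65, i.e. "about 35% less likely"
```

In the actual study the ratios would be adjusted for covariates such as age, insurance status, and visit type, so raw 2x2 counts like these are only a simplification.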
The reason, the study found, was a huge digital divide.
“The people who really need to access their primary care providers might be cut out [of telemedicine] because they don’t have the technology or might not know how to use it,” Adepoju said.
According to Adepoju, only one in four families earning $30,000 or less have smart devices, such as a phone, tablet, or laptop, compared to nearly three in four families earning $100,000 or more. And only 66% of African American and 61% of Hispanic households have access to broadband internet compared to 79% of white households.
The study also found that individuals younger than 18 years and older adults were less likely to have a telemedicine visit when compared to non-elderly adults, as were those covered by Medicaid or uninsured.
Another factor that played a role was how far someone lived from a clinic.
“We observed a dose-response to geographic distance so that the further a patient lived, the higher the likelihood of telemedicine use,” Adepoju said. “The type of visit, whether for an acute or non-acute condition, was also associated with telemedicine use. Non-acute visits were more likely to be conducted via telemedicine.”
Despite the recent easing of COVID-19 restrictions and people returning to more in-person care, telemedicine is here to stay. The hope, according to Adepoju, is that minorities will be better educated and equipped to take advantage of it.
But they’ll need someone who can walk them through it to ensure their appointments are meaningful.
“Clinics will need a technology support system,” she said. “A staff that conducts pre-visit device and connectivity testing with patients can be instrumental to helping patients maximize telemedicine as an access to care option.”
JOURNAL
Journal of General Internal Medicine
METHOD OF RESEARCH
Observational study
SUBJECT OF RESEARCH
People
ARTICLE TITLE
Utilization Gaps During the COVID-19 Pandemic: Racial and Ethnic Disparities in Telemedicine Uptake in Federally Qualified Health Center Clinics
Going beyond the limit: WVU researcher develops novel exposure assessment statistical methods for Deepwater Horizon oil spill study
IMAGE: THE 2010 DEEPWATER HORIZON OIL SPILL INVOLVED OVER 9,000 VESSELS DEPLOYED IN THE GULF OF MEXICO WATERS ACROSS ALABAMA, FLORIDA, LOUISIANA AND MISSISSIPPI AND TENS OF THOUSANDS OF WORKERS ON THE WATER AND ON LAND.
CREDIT: SUBMITTED PHOTO/NIEHS
Nearly 12 years after the Deepwater Horizon oil spill, scientists are still examining the potential health effects on workers and volunteers who experienced oil-related exposures.
To help shape future prevention efforts, one West Virginia University researcher – Caroline Groth, assistant professor in the School of Public Health’s Department of Epidemiology and Biostatistics – has developed novel statistical methods for assessing airborne exposure. Working with collaborators from multiple institutions, Groth has made it possible for researchers to characterize oil spill exposures in greater detail than has ever been done before.
Because few Ph.D. biostatisticians work in occupational health, appropriate statistical methodologies were lacking for assessing inhalation exposures in the GuLF STUDY, launched by the National Institute of Environmental Health Sciences shortly after the Deepwater Horizon oil spill. The study, the largest ever conducted following an oil spill, examines the health of people involved in the response and clean-up efforts. Groth was part of the exposure assessment team, led by Patricia Stewart and Mark Stenzel, tasked with characterizing worker exposures.
Groth’s statistical methods, which she began developing in 2012, laid the framework for a crucial step: determining whether exposures from the oil spill and clean-up work are associated with health outcomes. The clean-up involved over 9,000 vessels deployed in the Gulf of Mexico waters across Alabama, Florida, Louisiana and Mississippi, and tens of thousands of workers on the water and on land.
The Deepwater Horizon oil spill is considered the largest marine oil spill in the history of the U.S.
“Workers were exposed differently based on their activities, time of exposure, etc., and our research team’s goal was to develop exposure estimates for each of those scenarios and then link them to the participants’ work history through an ‘exposure matrix,’” Groth said.
These methods make it possible for other researchers to estimate individuals’ levels of exposure and link them to their health outcomes.
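The exposure-matrix idea Groth describes can be illustrated with a simple lookup: each combination of activity, time period and location maps to an exposure estimate, and a worker's cumulative exposure is the sum over their recorded tasks. This is an illustrative sketch only, not GuLF STUDY code; the scenario keys, exposure values and field names below are invented for the example.

```python
# Hypothetical sketch of linking a job-exposure matrix (JEM) to work histories.
# All scenario keys and exposure values are invented for illustration.

# Exposure matrix: (activity, period, location) -> estimated exposure level (ppm)
exposure_matrix = {
    ("vessel_cleanup", "2010-05", "LA"): 0.42,
    ("vessel_cleanup", "2010-06", "LA"): 0.31,
    ("shore_cleanup",  "2010-05", "FL"): 0.12,
}

def cumulative_exposure(work_history):
    """Sum estimated exposure (ppm-days) over a worker's recorded tasks."""
    total = 0.0
    for task in work_history:
        key = (task["activity"], task["period"], task["location"])
        level = exposure_matrix.get(key, 0.0)  # unmatched scenarios contribute 0
        total += level * task["days"]
    return total

# One worker's (invented) task history
history = [
    {"activity": "vessel_cleanup", "period": "2010-05", "location": "LA", "days": 10},
    {"activity": "shore_cleanup",  "period": "2010-05", "location": "FL", "days": 5},
]
print(cumulative_exposure(history))  # 0.42*10 + 0.12*5 = 4.8
```

The real study's matrix covers many more exposure scenarios and agents, but the linkage step — joining a participant's work history to scenario-level estimates — follows this same pattern.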
Additionally, Groth uncovered a new way of accounting for exposures that instruments cannot detect. The threshold below which an instrument cannot detect an exposure is referred to as the LOD, or limit of detection. Groth’s methods go beyond that limit, accounting for the uncertainty in exposure measurements below the LOD.
“Basically, what happens is the instrument reports undetectable, or ‘zero,’” Groth explained. “Previously, less reliable approaches were likely used, such as replacing the measurement with a single substitute value or forecasting. Those approaches do not consider actual variability in the data, which, if not accounted for, can lead to inaccurate exposure estimates. However, we know with certainty they cannot be ‘zero.’
“We know it’s between that threshold and zero, and there is likely variability in these measurements that we should account for. Our methods allow us to account for this variability and get a quantitative estimate of concentration.”
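A standard way to formalize this idea is to treat non-detects as left-censored observations: each one contributes the probability of falling below the LOD rather than a substituted value, and a distribution is then fit by maximum likelihood. The sketch below is illustrative only and is not the GuLF STUDY method; the lognormal assumption, function name and starting values are choices made for this example.

```python
import math
import numpy as np
from scipy import optimize, stats

def censored_lognormal_mle(detects, n_censored, lod):
    """Fit a lognormal by maximum likelihood when n_censored readings fell below the LOD.

    Detected values contribute their lognormal density; each non-detect
    contributes the probability mass below the LOD, rather than a
    substituted constant. Returns (mu, sigma) on the log scale.
    """
    log_detects = np.log(np.asarray(detects, dtype=float))

    def negloglik(params):
        mu, log_sigma = params
        sigma = math.exp(log_sigma)  # parameterize sigma > 0
        # Lognormal log-density of each detect: normal logpdf of log(x) minus log(x)
        ll = stats.norm.logpdf(log_detects, mu, sigma).sum() - log_detects.sum()
        # Each non-detect contributes log P(X < LOD)
        ll += n_censored * stats.norm.logcdf(math.log(lod), mu, sigma)
        return -ll

    res = optimize.minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
    mu, sigma = res.x[0], math.exp(res.x[1])
    return mu, sigma
```

Unlike substituting LOD/2 (or zero) for every non-detect, this treatment preserves the variability below the threshold that Groth describes, so the estimated concentration distribution is not artificially narrowed.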
Her findings, along with her team’s, were recently published in the Annals of Work Exposures and Health. Dale Sandler, chief of the Epidemiology Branch and senior investigator at the National Institute of Environmental Health Sciences, said the efforts of Groth – who served as primary author for two of the manuscripts and co-author for eight published manuscripts – and her colleagues have opened new doors.
“The Gulf Long-term Follow-up Study is larger and more long-term than research on other oil spills, but the major defining feature of the study is the level of detail on potential oil-spill exposures and the extensive efforts made to characterize the exposures of those who helped to clean up following this environmental and potential public health disaster,” Sandler, principal investigator of the study, said. “Dr. Groth, who has played a key role in characterizing the chemical exposures of persons participating in the GuLF Study, and her colleagues have allowed us to characterize respiratory exposures to a broad class of chemicals resulting from the oil spill.”
Research continues, with plans to follow these workers for additional health effects and to determine whether any exposures were associated with detrimental health outcomes. Both Groth and Sandler see the effort as an important step toward identifying factors that contribute to long-term safety and health.
Sandler added, “This will help us identify links between specific exposures and health effects and could help us identify targets for future prevention efforts.”