Thursday, November 19, 2020

 3D-printed, lifelike heart models could help train tomorrow's surgeons (video)

AMERICAN CHEMICAL SOCIETY

Research News

IMAGE: Researchers have developed a way to 3D print a full-size model of a patient's own heart.

CREDIT: AMERICAN CHEMICAL SOCIETY

Full-size, realistic models of human organs could help surgeons train and practice before they cut into a patient. However, it's been challenging to make inexpensive models of a size, complexity and material that simulate human organs. Now, researchers reporting in ACS Biomaterials Science & Engineering have developed a way to 3D print a full-size model of a patient's own heart. A video shows how they made the 3D organ.

For complex heart surgeries, having a chance to plan and practice on a realistic model could help surgeons anticipate problems, leading to more successful outcomes. Current 3D printing techniques have been used to make full-size organ models, but the materials generally don't replicate the feel or mechanical properties of natural tissue. And soft, tissue-like materials, such as silicone rubbers, often collapse when 3D printed in air, making it difficult to reproduce large, complex structures. Eman Mirdamadi, Adam Feinberg and colleagues recently developed a technique, called freeform reversible embedding of suspended hydrogels (FRESH), which involves 3D printing soft biomaterials within a gelatin bath to support delicate structures that would otherwise collapse in air. However, the technique was previously limited to small objects, so the researchers wanted to adapt it to full-size organs.

The team's first step was to show that alginate, an inexpensive material made from seaweed, has similar material and mechanical properties as cardiac tissue. Next, the researchers placed sutures in a piece of alginate, which held even when stretched -- suggesting that surgeons could practice stitching up a heart model made from the material. In preparation for making the heart model, the team modified their FRESH 3D printer to make larger objects. They used this device and magnetic resonance imaging (known as MRI) scans from a patient to model and print a full-size adult human heart, as well as a section of coronary artery that they could fill with simulated blood. The heart model was structurally accurate, reproducible and could be handled outside of the gelatin bath. The method could also be applied to printing other realistic organ models, such as kidneys or liver, the researchers say.

The authors acknowledge funding from the Office of Naval Research, the U.S. Food & Drug Administration and the National Institutes of Health.


The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS' mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and its people. The Society is a global leader in providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a specialist in scientific information solutions (including SciFinder® and STN®), its CAS division powers global research, discovery and innovation. ACS' main offices are in Washington, D.C., and Columbus, Ohio.

3D bioprinted heart provides new tool for surgeons

COLLEGE OF ENGINEERING, CARNEGIE MELLON UNIVERSITY

Research News

IMAGE: A 3D bioprinted heart model developed by Adam Feinberg and his team.

CREDIT: CARNEGIE MELLON UNIVERSITY COLLEGE OF ENGINEERING

Professor of Biomedical Engineering Adam Feinberg and his team have created the first full-size 3D bioprinted human heart model using their Freeform Reversible Embedding of Suspended Hydrogels (FRESH) technique. Showcased in a recent video by the American Chemical Society and created from MRI data using a specially built 3D printer, the model mimics the elasticity of cardiac tissue and sutures realistically. This milestone represents the culmination of two years of research, holding both immediate promise for surgeons and clinicians and long-term implications for the future of bioengineered organ research.

The FRESH technique of 3D bioprinting was invented in Feinberg's lab to address an unmet need: 3D-printed soft polymers lack the rigidity to stand unsupported in a normal print. FRESH 3D printing uses a needle to inject bioink into a bath of soft hydrogel, which supports the object as it prints. Once the print is finished, a simple application of heat melts the hydrogel away, leaving only the 3D bioprinted object.

While Feinberg's lab has proven both the versatility and the fidelity of the FRESH technique, the major obstacle to this milestone was printing a human heart at full scale. Doing so required building a custom 3D printer that could hold a gel support bath large enough to print at the desired size, as well as minor software changes to maintain the speed and fidelity of the print.

Major hospitals often have facilities for 3D printing models of a patient's body to help surgeons educate patients and plan the actual procedure; however, these models are typically made of hard plastic or rubber. Feinberg's team's heart is made from a soft natural polymer called alginate, giving it properties similar to real cardiac tissue. For surgeons, this enables the creation of models that can be cut, sutured and manipulated much like a real heart. Feinberg's immediate goal is to begin working with surgeons and clinicians to fine-tune the technique and ensure it is ready for the hospital setting.

"We can now build a model that not only allows for visual planning, but allows for physical practice," says Feinberg. "The surgeon can manipulate it and have it actually respond like real tissue, so that when they get into the operating site they've got an additional layer of realistic practice in that setting."

This paper represents another important marker on the long path to bioengineering a functional human organ. Soft, biocompatible scaffolds like that created by Feinberg's group may one day provide the structure onto which cells adhere and form an organ system, placing biomedicine one step closer to the ability to repair or replace full human organs.

"While major hurdles still exist in bioprinting a full-sized functional human heart, we are proud to help establish its foundational groundwork using the FRESH platform while showing immediate applications for realistic surgical simulation," added Eman Mirdamadi, lead author on the publication.

Published in ACS Biomaterials Science & Engineering, the paper was co-authored by Feinberg's students Joshua W. Tashman, Daniel J. Shiwarski, Rachelle N. Palchesko, and former student Eman Mirdamadi.

CAPTION

Modeling incorporates imaging data into the final 3D printed object.


CAPTION

A needle prints the alginate into a hydrogel bath, which is later melted away to leave the finished model.

Study identifies reasons for soaring nuclear plant cost overruns in the US

Analysis points to ways engineering strategies could be reimagined to minimize delays and other unanticipated expenses

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Research News

A new analysis by MIT researchers details many of the underlying issues that have caused cost overruns on new nuclear power plants in the U.S., which have soared ever higher over the last five decades. The new findings may help the designers of new plants build in resilience to the factors that tend to cause these overruns, thus helping to bring down the costs of such plants.

Many analysts believe nuclear power will play an essential part in reducing global emissions of greenhouse gases, and finding ways to curb these rising costs could be an important step toward encouraging the construction of new plants, the researchers say. The findings are being published in the journal Joule, in a paper by MIT professors Jessika Trancik and Jacopo Buongiorno, along with former students Philip Eash-Gates SM '19, Magdalena Klemun PhD '20, Goksin Kavlak PhD '18, and Research Scientist James McNerney.

Among the surprising findings in the study, which covered 50 years of U.S. nuclear power plant construction data, was that, contrary to expectations, building subsequent plants based on an existing design actually costs more, not less, than building the initial plant.

The authors also found that while changes in safety regulations could account for some of the excess costs, that was only one of numerous factors contributing to the overages.

"It's a known fact that costs have been rising in the U.S. and in a number of other locations, but what was not known is why and what to do about it," says Trancik, who is an associate professor of energy studies in MIT's Institute for Data, Systems and Society. The main lesson to be learned, she says, is that "we need to be rethinking our approach to engineering design."

Part of that rethinking, she says, is to pay close attention to the details of what has caused past plant construction costs to spiral out of control, and to design plants in a way that minimizes the likelihood of such factors arising. This requires new methods and theories of technological innovation and change, which the team has been advancing over the past two decades.

For example, many of the excess costs were associated with delays caused by the need to make last-minute design changes based on particular conditions at the construction site or other local circumstances, so if more components of the plant, or even the entire plant, could be built offsite under controlled factory conditions, such extra costs could be substantially cut.

Specific design changes to the containment buildings surrounding the reactor could also help to reduce costs significantly, Trancik says. For example, substituting some new kinds of concrete in the massive structures could reduce the overall amount of the material needed, and thus slash the onsite construction time as well as the material costs.

Many of the reasons behind the cost increases, Trancik says, "suggest that there's a lack of resilience, in the process of constructing these plants, to variable construction conditions." Those variations can come from safety regulations that are changing over time, but there are other reasons as well. "All of this points to the fact that there is a path forward to increasing resilience that involves understanding the mechanisms behind why costs increased in the first place."

Say overall construction costs are very sensitive to upfront design costs, for example: "If you're having to go back and redo the design because of something about a particular site or a changing safety regulation, then if you build into your design that you have all of these different possibilities based on these things that could happen," that can protect against the need for such last-minute redesign work.

"These are soft costs contributions," Trancik says, which have not tended to be prioritized in the typical design process. "They're not hardware costs, they are changes to the environment in which the construction is happening. ... If you build that in to your engineering models and your engineering design process, then you may be able to avoid the cost increases in the future."

One approach, which would involve designing nuclear plants that could be built in factories and trucked to the site, has been advocated by many nuclear engineers for years. For example, rather than today's huge nuclear plants, modular and smaller reactors could be completely self-contained and delivered to their final site with the nuclear fuel already installed. Numerous such plants could be ganged together to provide output comparable to that of larger plants, or they could be distributed more widely to reduce the need for long-distance transmission of the power. Alternatively, a larger plant could be designed to be assembled on site from an array of smaller factory-built subassemblies.

"This relationship between the hardware design and the soft costs really needs to be brought into the engineering design process," she says, "but it's not going to happen without a concerted effort, and without being informed by modeling that accounts for these potential ballooning soft costs."

Trancik says that while some of the steps to control costs involve increased use of automated processes, these need to be considered in a societal context. "Many of these involve human jobs and it is important, especially in this time, where there's such a need to create high-quality sustained jobs for people, this should also factor into the engineering design process. So it's not that you need to look only at costs." But the kind of analysis the team used, she says, can still be useful. "You can also look at the benefit of a technology in terms of jobs, and this approach to mechanistic modeling can allow you to do that."

The methodology the team used to analyze the causes of cost overruns could potentially also be applied to other large, capital-intensive construction projects, Trancik says, where similar kinds of cost overruns often occur.

"One way to think about it as you're bringing more of the entire construction process into manufacturing plants, that can be much more standardized." That kind of increased standardization is part of what has led, for example, to a 95 percent cost reduction in solar panels and in lithium-ion batteries over the last few decades, she says. "We can think of it as making these larger projects more similar to those manufacturing processes."

Buongiorno adds that "only by reducing the cost of new plants can we expect nuclear energy to play a pivotal role in the upcoming energy transformation."

###

The work was supported by the David and Lucile Packard Foundation and the MIT Energy Initiative.

Written by David L. Chandler, MIT News Office

Mattress flammability standard is a lifesaver, NIST report finds

NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY (NIST)

Research News

IMAGE: The test setup described in 16 CFR Part 1633 entails applying two gas burner heads to the side and top of a mattress for up to 30 minutes.

CREDIT: NIST/B. HAYES

No matter how soft and cozy, beds that have gone up in flames are a source of some of the deadliest fires in the U.S. As large pieces of furniture loaded with combustible cushioning materials, beds are substantial fuel sources for home fires. Once ignited, mattress fires can grow quickly, creating life-threatening situations in bedrooms or entire houses within minutes.

A 2007 standard for mattress flammability from the Consumer Product Safety Commission (CPSC), known as 16 CFR Part 1633, sought to curb the danger of bed fires sparked by flames, which caused an estimated 95 deaths annually from 2002-2005. But because of how infrequently consumers replace mattresses, the researchers who helped to develop the standard spent years in the dark about whether the safety requirements were living up to expectations.

Now, enough data has accumulated for researchers at the National Institute of Standards and Technology (NIST) to estimate that the standard prevented 65 deaths from bed fires annually in 2015 and 2016. That number is expected to rise as more mattresses are replaced with the newer, standard-compliant models.

"What we've got here is a clear case of fire researchers, manufacturers and regulators all working together, getting the science right, getting commercially acceptable versions of the mattresses right and getting the regulation right," said NIST research scientist Richard Gann. "It all came together, and as a result we have a real success story for the country."

Long before 2007, other standards were in place to crack down on cigarettes, a leading cause of bed fires, but they left the grave threat of flaming ignition sources, like lighters, matches or pieces of burning furniture, largely unaddressed.

To close that gap, the International Sleep Products Association (ISPA) -- the trade association for the mattress industry -- approached NIST about laying the groundwork for a new mattress flammability standard that would eliminate, or at least greatly reduce, the casualties from bed fires.

Gann and his colleagues leapt at the opportunity and set out to devise a realistic and practical way for manufacturers to test mattress flammability.

Since bed fires typically start with the ignition of blankets, sheets and other bedclothing items, the researchers aimed to replicate the danger they posed to mattresses. Gann and his team assembled several sets of bedclothes, set them ablaze and gauged the heat release rate (HRR) -- an indicator of how intensely something burns, measured in watts -- of each.

They used the HRR data to create a special test apparatus composed of twin propane burners that could mimic fires generated by an off-the-shelf set of bedclothes. With the burners, manufacturers could test their mattresses against conditions similar to real-world bedroom fires.

While the researchers developed this new test method, manufacturers experimented with fire-resistant fabrics -- such as those used in firefighter uniforms -- and implemented them into prototypes to lower their HRR.

But how much lower would the HRR of a mattress need to be? The limit had to be low enough to ensure that burning mattresses would not spark a "flashover," wherein a fire makes a room so hot that all other combustible items in it -- chairs, clothes, etc. -- suddenly and simultaneously ignite, Gann said.

To find the limit, they measured how much heat it would take to ignite small pieces of material, each representing an item commonly found in bedrooms, such as wooden furniture or softer items like upholstery or curtains. The researchers then burned both prototype and commercially available mattresses, measuring the flow of heat to several spots around the room.

With the two data sets, the team discovered that mattresses with a peak HRR of about 600 kilowatts (kW) or more would produce enough heat to reliably ignite soft materials almost anywhere in an ordinary bedroom. While the commercial king- and twin-sized beds they tested had peak HRRs far above this value, one prototype fared much better.

"The manufacturers made some prototypes and they worked. When they sent them here, we tested them," Gann said. "Four megawatts. One megawatt. And then down to 400 kilowatts for a king-size bed. In the world of fire safety, that's a game changer."

When mattresses burn below 400 kW, the odds of flashover decrease substantially, the researchers found. To keep mattresses well away from this threshold, the CPSC's standard requires that a mattress maintain an HRR under 200 kW after ignition by the burners that simulate burning bedclothes.

By the time the standard became effective on July 1, 2007, mattresses that met the new requirements were widely available. But did this change actually translate into lives saved? If so, how many lives was it saving? Gann was eager to know, but when he set out two years later to find the answers, he learned that there was a colossal roadblock in the way.

Mattresses remain with their original owners for 10 to 12 years on average. And after that, they are often passed on to children or get refurbished and find a new life back on the market, Gann said. This meant it would take years before enough standard-compliant mattresses found their way into homes. With so little data available at the time, Gann had to wait this one out.

Returning to the issue 10 years later, now with a wealth of information available about fire incidents (fires, injuries and deaths) from the National Fire Incident Reporting System and mattress sales from ISPA, Gann brought aboard NIST economists Stanley Gilbert and Dave Butry, who have developed statistical methods to finally put numbers to the standard's effects.

One of their approaches was to compare the total number of incidents caused by bed fires in 2005 and 2006 combined to the number in 2015 and 2016. They didn't just look at the raw values, though. If other fire-influencing factors -- like the number of homes with smoke alarms -- were not identical between the two time periods, the comparison could be unfair.

To isolate the effect of the standard from other factors, Gilbert and Butry compared the outcomes of bed fires to upholstered furniture fires, as the combustible materials in both types of fires are similar. Because the standard is exclusively about mattresses, any spike or dip that only appeared in the bed fire numbers, but not upholstered furniture fires, would probably have been driven by the standard.

The researchers crunched the numbers and were pleased to identify several strong indicators suggesting that the standard was doing its job and doing it well. They found that, relative to upholstered furniture fires, the number of bed fires from 2015 and 2016 combined was 12% lower than in 2005 and 2006. In those 10 years, injuries decreased by 34% and, much to the delight of the researchers, deaths plummeted by 82%.
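As a rough illustration of the comparison described above, the sketch below nets the change in bed-fire outcomes against the change in the upholstered-furniture control category. All counts are hypothetical placeholders, not the actual National Fire Incident Reporting System figures, and the calculation is only a simplified stand-in for the economists' statistical methods.

# A minimal sketch of the relative-change comparison, using made-up incident
# counts (NOT the real fire statistics). Upholstered-furniture fires serve as
# a control for factors, such as smoke-alarm adoption, that changed between
# the two periods but are unrelated to the mattress standard.

def relative_change(before, after):
    """Fractional change from the 'before' period to the 'after' period."""
    return (after - before) / before

# Hypothetical counts for each combined two-year period.
bed_deaths_2005_06, bed_deaths_2015_16 = 100.0, 30.0      # placeholder values
furn_deaths_2005_06, furn_deaths_2015_16 = 200.0, 120.0   # placeholder values

bed_change = relative_change(bed_deaths_2005_06, bed_deaths_2015_16)
furn_change = relative_change(furn_deaths_2005_06, furn_deaths_2015_16)

# Change in bed-fire deaths net of the control trend: strongly negative values
# after removing the control trend point toward an effect of the standard.
net_of_control = (1 + bed_change) / (1 + furn_change) - 1
print(f"bed fires: {bed_change:+.0%}, furniture fires: {furn_change:+.0%}, "
      f"net of control: {net_of_control:+.0%}")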

Evidence mounted further in support of the standard as the researchers examined the mattress sales data alongside fire incidents.

The researchers used the sales data to create mathematical models that could estimate how many pre-standard mattresses were being replaced with new ones. The models point to the standard as the likely source of the benefits, as the mattress replacements and reductions in casualties closely mirrored each other throughout the years.
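The release does not spell out those models, but a minimal sketch of the general idea, assuming a constant retirement rate consistent with the 10-to-12-year average service life mentioned earlier (an assumption made here for illustration, not taken from the paper), might look like this:

import math

# Hypothetical mattress-replacement model: assume mattresses are retired at a
# constant rate implied by an ~11-year average service life (the release cites
# 10 to 12 years). A simplification for illustration, not the authors' model.
AVERAGE_LIFETIME_YEARS = 11.0
RETIREMENT_RATE = 1.0 / AVERAGE_LIFETIME_YEARS   # fraction of the stock replaced per year

def share_post_standard(year, standard_year=2007):
    """Estimated share of in-use mattresses bought after the standard took effect."""
    years_since = max(0.0, year - standard_year)
    # With constant-rate replacement, the pre-standard stock decays exponentially.
    return 1.0 - math.exp(-RETIREMENT_RATE * years_since)

for year in (2009, 2012, 2016, 2020):
    print(year, f"{share_post_standard(year):.0%}")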

"We used several different approaches to look at the data, and they all pointed to the same conclusion; the standard saves lives," Gilbert said.

Ryan Trainer, president of ISPA, which was involved in developing and implementing 16 CFR Part 1633, also voiced appreciation that the standard has borne fruit.

"The mattress industry has collaborated with NIST and CPSC to develop a standard that is based on sound science, reflects real world risks, improves safety and is practical for manufacturers to adopt," Trainer said. "We are gratified that NIST's analysis of national fire statistics shows that since Part 1633 was implemented, the number of bed fires ignited by open-flame heat sources, and especially the deaths and injuries from those fires, have dropped so significantly."

###

Volcanic eruptions have more effect in summer

KING ABDULLAH UNIVERSITY OF SCIENCE & TECHNOLOGY (KAUST)

Research News

Detailed modeling of the effect of volcanic eruptions on the El Niño Southern Oscillation (ENSO) has shown that the climate response to these events depends on the timing of the eruption and the preceding conditions. The research, led by KAUST researchers Evgeniya Predybaylo and Georgiy Stenchikov, settles a long-standing debate about the role of volcanic eruptions in global climate perturbations.

"The ENSO is a feature of the tropical Pacific Ocean climate, with patterns of temperature, precipitation and wind that oscillate between warmer El Niño and cooler La Niña phases every two to seven years," explains Predybaylo. "Due to the vast size of the tropical Pacific, the ENSO controls the climate in many other parts of the globe and is responsible for droughts, floods, hurricanes, heat waves and other severe weather events. To evaluate these risks, it is essential to have proper projections and predictions of future ENSO behavior."

Climate modeling indicates that the ENSO is very sensitive to external perturbations, such as increased carbon dioxide in the atmosphere or volcanic eruptions. Even though major volcanic eruptions, like the Mount Pinatubo eruption in 1991, are known to have caused widespread cooling due to the reflection of solar radiation, such effects have been difficult to prove by modeling.

"There was previously no modeling consensus on how the Pacific Ocean responds to such climatologically large volcanic eruptions, with climate models predicting diverse and often contradictory responses," says Sergey Osipov from the research team.

Because the tropical Pacific climate is itself highly variable, the modeling needs to be performed carefully to separate the eruption-driven ocean response from random variations. This requires a large number of climate simulations using a model that can simulate both the radiative impact of volcanic eruptions and a realistic ENSO cycle. To achieve this, the team collaborated with Andrew Wittenberg from Princeton University, US, to run the CM2.1 climate model using KAUST's supercomputer.
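In practice, separating the forced response from internal variability means comparing a large ensemble of eruption simulations against a matching no-eruption control ensemble. The sketch below illustrates that logic with synthetic numbers only; it does not use CM2.1 output, and the imposed 0.3 K shift is purely hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a large ensemble experiment (NOT CM2.1 output): each
# member yields a post-eruption ENSO index, e.g. a Nino-3.4 anomaly in kelvin.
# Internal variability is large, so the forced signal only emerges by averaging
# many members and comparing against a no-eruption control ensemble.
n_members = 3000
control = rng.normal(loc=0.0, scale=0.8, size=n_members)    # no eruption
eruption = rng.normal(loc=0.3, scale=0.8, size=n_members)   # hypothetical forced shift

forced_response = eruption.mean() - control.mean()

# Bootstrap a confidence interval on the ensemble-mean difference.
boot = [
    rng.choice(eruption, n_members).mean() - rng.choice(control, n_members).mean()
    for _ in range(2000)
]
low, high = np.percentile(boot, [2.5, 97.5])
print(f"forced ENSO response ~ {forced_response:.2f} K (95% CI {low:.2f} to {high:.2f})")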

"After running more than 6,000 climate simulations covering nearly 20,000 model years and analyzing the data," says Predybaylo, "we found that the ENSO response to stratospheric volcanic eruptions strongly depends on the seasonal timing of the eruption and the state of the atmosphere and ocean in the Pacific at the time."

In particular, the research showed that even very large eruptions seem to have little discernible effect on the ENSO in winter or spring, while summer eruptions almost always produce a strong climate response.

"The principles and techniques developed in our study could also be applied to various types of observational data and multimodel studies of future climate change, including the effects of global warming," says Predybaylo.


CAPTION

The Butterfly Effect: KAUST's model shows how volcanic eruptions can disrupt global climate by affecting the El Niño Southern Oscillation.


CAPTION

The team has developed a simulation of the Mount Pinatubo eruption in 1991. The blue shading represents sulfur dioxide, the white shading represents sulfate aerosols and the orange area represents volcanic ash.

CREDIT

© 2020 KAUST

Does air pollution affect mental health later in life?

WILEY

Research News

In a study of women aged 80 years and older, living in locations with higher exposures to air pollution was associated with increased depressive symptoms. The findings are published in the Journal of the American Geriatrics Society.

When looking at individual air pollutants, a team led by investigators from the University of Southern California found that long-term exposure to nitrogen dioxide or fine particulate air pollution was associated with increased depressive symptoms, though the effect was small. Results also suggested that depressive symptoms might play a role in linking long-term air pollution exposure to memory decline more than 10 years after the exposure.

"This is the first study showing how air pollution exposures affect depressive symptoms as well as the interrelationship between the symptoms and subsequent memory decline that had not been found in older people aged less than 80 years," said lead author Andrew Petkus, PhD.

Senior author Jiu-Chiuan Chen, MD, ScD, added, "We know late-life exposures to ambient air pollutants accelerate brain aging and increase the dementia risk, but our new findings suggest the oldest-old populations may respond to air pollution neurotoxicity in a different way that needs to be investigated further."

Does air pollution increase women's risk of dementia?

Study finds high levels associated with brain shrinkage patterns common in Alzheimer's

AMERICAN ACADEMY OF NEUROLOGY

Research News

MINNEAPOLIS - Older women who live in locations with higher levels of air pollution may have more brain shrinkage, the kind seen in Alzheimer's disease, than women who live in locations with lower levels, according to a new study published in the November 18, 2020, online issue of Neurology®, the medical journal of the American Academy of Neurology. The study looked at fine particle pollution and found that breathing in high levels of this kind of air pollution was linked to shrinkage in the areas of the brain vulnerable to Alzheimer's disease.

Fine particle pollution consists of microscopic particles of chemicals, smoke, dust and other pollutants suspended in the air. They are no larger than 2.5 micrometers, 30 times smaller than the width of a human hair.

"Smaller brain volume is a known risk factor for dementia and Alzheimer's disease, but whether air pollution alters brain structure is still being researched," said study author Diana Younan, Ph.D., of the University of Southern California in Los Angeles. "Our study found that women in their 70s and 80s who were exposed to the higher levels of air pollution had an increased risk of brain changes linked to Alzheimer's disease over five years. Our research suggests these toxins may disrupt brain structure or connections in the brain's nerve cell network, contributing to the progression toward the disease."

The study involved 712 women with an average age of 78 who did not have dementia at the start of the study. Participants provided health histories as well as information on race/ethnicity, education, employment, alcohol use, smoking and physical activity. All women received MRI brain scans at the start of the study and five years later.

Researchers used the residential addresses of each participant to determine their average exposures to air pollution in the three years before the first MRI scan. They then divided participants into four equal groups. The lowest group was exposed to an average of 7 to 10 micrograms of fine particle pollution per cubic meter of air (μg/m3). The highest group was exposed to an average of 13 to 19 μg/m3. The U.S. Environmental Protection Agency (EPA) considers average yearly exposures up to 12 μg/m3 to be safe.

Researchers used a machine learning tool to measure signs of Alzheimer's disease in the brain. The tool had been trained on brain scans of people with Alzheimer's disease to identify patterns of brain shrinkage specific to an increased risk of the disease.

Participants' MRI brain scans at the start of the study and five years later were assigned scores based on how similar they were to Alzheimer's disease patterns identified by the machine learning tool, specifically brain changes in regions found to be vulnerable to Alzheimer's disease. Scores ranged from zero to one, with higher scores showing more brain changes. Overall, the women's scores changed from 0.28 at the start of the study to 0.44 five years later.

For each 3 μg/m3 increase in air pollution exposure levels, researchers found a broader range of scores between the two scans and an average increase of 0.03, showing a greater extent of brain shrinkage over five years, which was equivalent to a 24% increased risk of Alzheimer's disease. The increases remained the same even after adjusting for age, education, employment, cardiovascular disease, high blood pressure, physical activity and other factors that could affect brain shrinkage.
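To put those per-increment figures in concrete terms, the short calculation below scales them to a hypothetical contrast between a low-exposure and a high-exposure participant. The two exposure values are examples chosen from within the study's quartile ranges, and the assumption that the 24% risk increase compounds multiplicatively across 3 μg/m3 increments is a simplification made here, not a statement from the study.

# Illustrative arithmetic based on the per-increment figures reported above:
# a 0.03 score increase and roughly 24% higher Alzheimer's risk per 3 ug/m3.
# The exposure values below are hypothetical examples within the quartile ranges.
SCORE_INCREASE_PER_3UG = 0.03
RISK_RATIO_PER_3UG = 1.24

low_exposure = 8.0    # ug/m3, within the lowest quartile (7-10)
high_exposure = 16.0  # ug/m3, within the highest quartile (13-19)

increments = (high_exposure - low_exposure) / 3.0
extra_score_change = increments * SCORE_INCREASE_PER_3UG
# Assumes multiplicative compounding of the per-increment risk ratio.
risk_ratio = RISK_RATIO_PER_3UG ** increments

print(f"{increments:.1f} increments of 3 ug/m3")
print(f"additional five-year score change: {extra_score_change:.2f}")
print(f"relative risk vs. the low-exposure example: {risk_ratio:.1f}x")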

"Our findings have important public health implications, because not only did we find brain shrinkage in women exposed to higher levels of air pollution, we also found it in women exposed to air pollution levels lower than those the EPA considers safe," said Younan. "While more research is needed, federal efforts to tighten air pollution exposure standards in the future may help reduce the risk of Alzheimer's disease in our older populations."

Limitations of the study include that it only looked at the brains of older women, so results may not be the same for men or younger women. It also examined only regional fine particle pollution, not other sources of pollution such as traffic emissions. Researchers were also not able to estimate participants' exposure to fine particle pollution in middle-age and young adulthood due to nationwide data not being available for those years.

###

The study was supported by the National Institute on Aging.

Learn more about dementia at BrainandLife.org, home of the American Academy of Neurology's free patient and caregiver magazine focused on the intersection of neurologic disease and brain health.


The American Academy of Neurology is the world's largest association of neurologists and neuroscience professionals, with over 36,000 members. The AAN is dedicated to promoting the highest quality patient-centered neurologic care. A neurologist is a doctor with specialized training in diagnosing, treating and managing disorders of the brain and nervous system such as Alzheimer's disease, stroke, migraine, multiple sclerosis, concussion, Parkinson's disease and epilepsy.

For more information about the American Academy of Neurology, visit AAN.com.


 

Which particulate air pollution poses the greatest health risk?

PAUL SCHERRER INSTITUTE

Research News

Researchers at the Paul Scherrer Institute PSI, together with colleagues from several other European institutions, have investigated whether particulate matter from certain sources can be especially harmful to human health. They found evidence that the amount of particulate matter alone is not the greatest health risk. Rather, it could be the so-called oxidative potential that makes particulate pollution so harmful. They are publishing their results today in the scientific journal Nature.

Particulate matter is one of the greatest health risks stemming from air pollution and, according to several studies, it is responsible for several million deaths each year. This means that poor air quality and particulate matter are among the five most important health risk factors, alongside high blood pressure, smoking, diabetes, and obesity. What makes particulate pollution so dangerous, however, is not yet precisely known. Together with an international collaborative team, researchers at the Paul Scherrer Institute PSI have now found out that the amount of particulate pollution is not the only decisive factor when it comes to health risks.

Oxidative potential of particulate matter as a health risk

"In this study we were primarily interested in two points", says Kaspar Dällenbach from the gas-phase and aerosol chemistry research group at PSI. "First, which sources in Europe are responsible for the so-called oxidative potential of particulate matter (also known as aerosols) and, second, whether the health risk from this particulate matter is caused by its oxidative potential."

Here the term "oxidative potential" refers to the ability of particulate matter to reduce the amount of antioxidants, which can lead to damage in cells and tissues of the human body. In a first step, the researchers exposed cells from the human airways, so-called bronchial epithelial cells, to particulate samples and tested their biological reaction. When these cells are under stress, they give off a signalling substance for the immune system, which initiates inflammatory reactions in the body. The researchers were able to show that particulate matter with an elevated oxidative potential intensifies the cells' inflammatory reaction. This suggests that the oxidative potential determines how harmful the particulate matter is. The causal connection between elevated oxidative potential and a danger to health still has not been definitely established, according to Dällenbach. "But the study is another clear indication that this connection actually does exist."

A partner study led by the University of Bern showed that the cells of patients who suffer from a special pre-existing illness, cystic fibrosis, exhibit a weakened defense against particulate matter. While in healthy cells an antioxidant defense mechanism was able to stop the progression of the inflammatory reaction, the defense capacity in sick cells was insufficient. This led to increased cell mortality.

Where do particulate matter and its oxidative potential come from?

In addition, the researchers collected particulate samples at various locations in Switzerland. Using a mass spectrometry technique developed at PSI, they analysed the composition of the particulate matter. The chemical profile obtained in this way for each particulate sample indicates the sources from which it originates. Furthermore, colleagues in Grenoble determined the oxidative potential of the same samples in order to get an indication of the danger to human health. With the help of detailed analyses and statistical methods, the researchers then determined the oxidative potential for all relevant emission sources. On the basis of these experimental data, they used a computer model to calculate the locations in Europe with the highest oxidative potential due to particulate matter throughout the year, and they identified mainly metropolitan areas such as the French capital Paris and the Po Valley in northern Italy as critical regions.

"Our results show that the oxidative potential of particulate matter and the amount of particulate matter are not determined by the same sources", Dällenbach sums up. The largest portion of particulate matter consists of mineral dust and so-called secondary inorganic aerosols, such as ammonium nitrate and sulphate. The oxidative potential of particulate matter, on the other hand, is primarily determined by so-called anthropogenic secondary organic aerosols, which come mainly from wood combustion, and by metal emissions from brake and tire wear in road traffic. The researchers found not only that the population in urban areas is exposed to a higher amount of particulate matter, but also that this particulate matter in these regions has a higher oxidative potential and is therefore more harmful to health than particulate pollution in rural areas. "Our results show that regulating the amount of particulates alone might not be effective", says Dällenbach. In addition, the study by the University of Bern suggests that population groups with pre-existing illnesses could especially benefit from appropriate measures to reduce particulate matter pollution.

###

Text: Paul Scherrer Institute/Sebastian Jutzi

About PSI

The Paul Scherrer Institute PSI develops, builds and operates large, complex research facilities and makes them available to the national and international research community. The institute's own key research priorities are in the fields of matter and materials, energy and environment and human health. PSI is committed to the training of future generations. Therefore about one quarter of our staff are post-docs, post-graduates or apprentices. Altogether PSI employs 2100 people, thus being the largest research institute in Switzerland. The annual budget amounts to approximately CHF 407 million. PSI is part of the ETH Domain, with the other members being the two Swiss Federal Institutes of Technology, ETH Zurich and EPFL Lausanne, as well as Eawag (Swiss Federal Institute of Aquatic Science and Technology), Empa (Swiss Federal Laboratories for Materials Science and Technology) and WSL (Swiss Federal Institute for Forest, Snow and Landscape Research).

Original publication

Kaspar Rudolf Daellenbach et al., Sources of particulate matter air pollution and its oxidative potential in Europe. Nature, 18.11.2020. DOI: 10.1038/s41586-020-2902-8

Original publication of the partner study

Z. Leni et al., Oxidative stress-induced inflammation in susceptible airways by anthropogenic aerosol. PLOS ONE, 19.11.2020. DOI: 10.1371/journal.pone.0233425