Sunday, November 05, 2023

 

Damaging thunderstorm winds increasing in central U.S.


Analysis shows impact of climate change on outflow from thunderstorms


Peer-Reviewed Publication

NATIONAL CENTER FOR ATMOSPHERIC RESEARCH/UNIVERSITY CORPORATION FOR ATMOSPHERIC RESEARCH




Destructive winds that flow out of thunderstorms in the central United States are becoming more widespread with warming temperatures, according to new research by the U.S. National Science Foundation (NSF) National Center for Atmospheric Research (NCAR).

The new study, published this week in Nature Climate Change, shows that the central U.S. experienced a fivefold increase in the geographic area affected by damaging thunderstorm straight-line winds in the past 40 years. The research uses a combination of meteorological observations, very high-resolution computer modeling, and analyses of fundamental physical laws to estimate the changes in the winds, which are so short-lived and localized that they often are not picked up by weather stations.

The work was funded by NSF, which is NCAR’s sponsor, and by the MIT Climate Grand Challenge on Weather and Climate Extremes.

“Thunderstorms are causing more and more of these extreme wind events,” said NCAR scientist Andreas Prein, the author of the new study. “These gusts that suddenly go from no wind at all to gusts of 60 to 80 miles per hour can have very damaging impacts on buildings, power grids, and even human safety.”
 
Capturing small-scale events
 
Straight-line winds are caused by powerful downdrafts that flow from the base of thunderstorms. The National Weather Service classifies such winds as damaging if they exceed 50 knots, or about 57 miles per hour. The winds likely cause about $2.5 billion in damage annually in the U.S., based on insurance industry estimates. In 2020, a particularly powerful derecho — a widespread, straight-line windstorm associated with fast-moving thunderstorms — caused an estimated $11 billion in damage in the Midwest.

Scientists have long been interested in the impact of climate change on straight-line winds. Until now, however, simulations of climate conditions run on computer models have been too coarse to capture such brief and small-scale events. Further clouding the picture, weather observations appear to show that there are more periods of little to no wind worldwide (a phenomenon known as global stilling), even though, paradoxically, maximum wind speeds can rise simultaneously.

To determine if damaging straight-line winds are becoming more widespread, Prein turned to a high-resolution computer model simulation that NCAR scientists recently produced in collaboration with the U.S. Geological Survey. The advanced simulation is named CONUS404 because it simulates climate and hydrological conditions at a resolution of 4 kilometers (2.5 miles) across the continental United States, or CONUS, over the past 40-plus years.

Prein focused on summertime conditions in the central U.S., a global hotspot for straight-line winds. The high-resolution modeling gave him a much more fine-grained picture of the winds than sparse atmospheric observations could, allowing him to expand his analysis from 95 weather stations to 109,387 points in the simulation. The simulation showed that the area affected by straight-line winds has increased by about 4.8 times over the last 40 years.
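As a concrete illustration of that kind of tally, the sketch below counts how much of a model grid is touched by damaging gusts in one summer season. It uses synthetic data in place of CONUS404 output and a simple once-per-season exceedance criterion, so it is only a rough sketch of the idea, not the study's actual method.

```python
import numpy as np

# Illustrative sketch only (synthetic data standing in for model output): count the share
# of a model grid touched by damaging straight-line winds during one 92-day summer season.

KNOT_TO_MS = 0.514444                  # 1 knot in metres per second
DAMAGING_GUST_MS = 50 * KNOT_TO_MS     # NWS damaging-wind threshold (~25.7 m/s, ~57 mph)

rng = np.random.default_rng(0)
# Hypothetical daily maximum surface gusts in m/s, shape (days, ny, nx)
daily_max_gust = rng.gamma(shape=2.0, scale=3.0, size=(92, 200, 300))

# A grid cell counts as "affected" if its gust exceeds the threshold at least once in the season
affected = (daily_max_gust >= DAMAGING_GUST_MS).any(axis=0)
print(f"Fraction of grid cells affected this season: {affected.mean():.1%}")
```

Comparing such seasonal fractions across the decades of a long simulation is one straightforward way to express how the affected area has grown.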

Prein verified the accuracy of the simulation by comparing it with measurements of selected winds in the past, such as the 2020 derecho. His analysis showed that the CONUS404 simulations were reliably capturing straight-line winds, as opposed to previous, coarser simulations that failed to capture many such events.

This left the question of whether climate change could be responsible for the increase in winds. Prein approached this question by analyzing the thermodynamics of straight-line winds and how actual wind events such as the 2020 derecho would have been affected by different atmospheric conditions based on first-order physical principles.

Straight-line winds result when rain and hail at high altitudes evaporate and cool the ambient air, which then plummets and, at the surface, spawns intense winds that rush outward. Prein’s calculations of this process showed that climate change is likely altering the picture by increasing the temperature difference between the cool air in downdrafts and the warm surrounding air. This larger temperature difference lets the cold air descend even faster, making it more likely for a thunderstorm to generate damaging winds.
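For a rough sense of why a larger temperature contrast matters, simple parcel theory offers a back-of-the-envelope check (this is not the analysis used in the study): air that is ΔT colder than its surroundings over a depth H can accelerate downward to a peak speed of roughly w = sqrt(2·g·H·ΔT/T).

```python
import math

def downdraft_speed(delta_t_k, depth_m, env_temp_k=300.0, g=9.81):
    """Parcel-theory estimate of peak downdraft speed (m/s) for air that is delta_t_k kelvin
    colder than its environment over a depth of depth_m metres. Entrainment, precipitation
    loading and pressure perturbations are all ignored."""
    return math.sqrt(2.0 * g * depth_m * delta_t_k / env_temp_k)

# Illustrative numbers only: a slightly larger temperature deficit noticeably raises the estimate
for delta_t in (5.0, 6.0, 7.0):
    w = downdraft_speed(delta_t, depth_m=3000.0)
    print(f"deficit {delta_t:.0f} K -> w ~ {w:.0f} m/s (~{w * 2.237:.0f} mph)")
```

With these made-up numbers, an extra degree or two of cooling pushes the estimated gusts from around 70 mph toward the 80 mph range quoted above.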

“As these findings show, it is crucial to incorporate the increasing risk of straight line winds when planning for the impacts of climate change so we can ensure the future resiliency of infrastructure to this frequently neglected peril,” Prein said.

This material is based upon work supported by the National Center for Atmospheric Research, a major facility sponsored by the National Science Foundation and managed by the University Corporation for Atmospheric Research. Any opinions, findings and conclusions or recommendations expressed in this material do not necessarily reflect the views of the National Science Foundation.

About the article

Title: “Thunderstorm straight line winds intensify with climate change”
Author: Andreas F. Prein
Journal: Nature Climate Change

On the web: news.ucar.edu
On X: @NCAR_Science


 

 

Making gluten-free, sorghum-based beers easier to brew and enjoy


Peer-Reviewed Publication

AMERICAN CHEMICAL SOCIETY




Though beer is a popular drink worldwide, it’s usually made from barley, which leaves those with a gluten allergy or intolerance unable to enjoy the frothy beverage. Sorghum, a naturally gluten-free grain, could be an alternative, but complex preparation steps have hampered its widespread adoption by brewers. Now, researchers reporting the molecular basis behind sorghum brewing in ACS’ Journal of Proteome Research have uncovered an enzyme that could improve the future of sorghum-based beers.

Traditionally, beer brewers start with barley grains, which they malt, mash, boil and ferment to create the bubbly beverage. Barley contains gluten — a group of proteins found in several cereal grains. Sorghum, on the other hand, lacks these proteins and behaves differently than barley during brewing. For example, strong molecular bonds make it difficult to release starches from the grains during the mash stage. And fewer enzymes are present in sorghum wort — the liquid extracted from the mashing process — to transform the starches into simple sugars, such as glucose, which ferments into alcohol. Even when brewers adjust the reaction conditions during these steps, the resulting beverages are still less alcoholic than barley-based beers. To help bring the alcohol content up to expected standards, Edward Kerr, Glen Fox and Benjamin Schulz investigated the molecular processes that occur during sorghum brewing and found ways to improve the final product.

The team brewed both barley and sorghum beverages, taking them through malting, mashing and fermentation steps at varying temperatures and lengths of time. At the malting stage, the samples were analyzed via mass spectrometry proteomics, which revealed the presence of many of the same enzymes in barley malt and sorghum malt; those enzymes included amylases that break down starches into maltose. After the malts were steeped in water, the resulting sorghum wort contained less maltose than barley wort, but considerably more glucose. The team attributed these differences to the enzyme compositions: Sorghum wort contains fewer amylase enzymes than barley wort but more α‑glucosidase, an enzyme that breaks down starches into glucose instead. The researchers say that by optimizing brewing parameters to favor the activity of α‑glucosidase, brewers could create sorghum wort with higher concentrations of fermentable glucose, resulting in sorghum-based beers with higher alcohol content and better overall quality.

The authors do not acknowledge a funding source for this study.

 

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS’ mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.


 

Plant populations in Cologne are adapted to their urban environments


Peer-Reviewed Publication

UNIVERSITY OF COLOGNE

Image: Arabidopsis thaliana on a wall in Cologne. Thale cress (Arabidopsis thaliana) populating the streets of Cologne varies greatly in its genetic make-up, allowing the plants to adapt their reproduction to local environmental conditions such as temperature and human disturbances. Credit: Justine Floret





A research team from the Universities of Cologne and Potsdam and the Max Planck Institute for Plant Breeding Research has found that the regional lines of the thale cress (Arabidopsis thaliana), a small ruderal plant that populates the streets of Cologne, vary greatly in typical life-cycle characteristics such as the regulation of flowering and germination. This allows them to adapt their reproduction to local environmental conditions such as temperature and human disturbances. The researchers from Collaborative Research Center / Transregio 341 “Plant Ecological Genetics” found that environmental conditions filter out unsuited lines from a pool of regionally diverse plant lines, so that only the ones with suitable traits manage to survive. “This process, termed ‘environmental filtering’, is well known for driving the establishment or persistence of plant species in a particular location. It is fascinating to see that exactly the same process also works for different lines within a species,” says Anja Linstädter, who has recently left Cologne to become Professor of Biodiversity Research at the University of Potsdam. The study is reported in the article “Environmental filtering of life-history trait diversity in urban populations of Arabidopsis thaliana”, published in the Journal of Ecology.

Arabidopsis thaliana is the most common model organism in plant research, and therefore important for plant biology. Most of the research work on A. thaliana focuses on the descendants of a single individual, a line called Col-0, while the plants growing naturally in the Cologne region represent a multiplicity of lines. “In our research consortium, we work on understanding how findings made in the lab manifest in nature,” explained Professor Dr Juliette de Meaux, spokesperson of CRC TRR 341. “At the beginning, what we asked ourselves was: does the lab line Col-0 resemble lines present naturally in Cologne? But then, we began to realize how much ecological diversity there is in our streets.”

The plants analysed in this study were collected by Dr Gregor Schmitz, the first author of the study, along his way to work. He noticed that A. thaliana was growing naturally in places with very different environmental conditions. These included patches with little water and nutrient supply such as small pavement cracks, but also highly disturbed habitats. When the scientists sequenced the plant genomes, they found that the urban lines were not more related to each other than to lines from a larger region.

To their surprise, the biologists found that there are large differences among A. thaliana populations in Cologne with respect to their life-cycle traits. These differences contribute to the plants’ persistence in habitats that differ mainly in how much they are disturbed by human activities such as weeding or mowing. “In other words, the genetic diversity we find throughout the city is not distributed at random, but matches specific differences in the urban environments,” said Schmitz.

Most plants use the cold to regulate the timing of flowering. In this way, they ensure that flowering does not take place in the middle of winter. In the streets of Cologne, the scientists found A. thaliana lines that use cold to regulate flowering, but also lines that do not use it: they flower very quickly after germinating. The team also discovered some lines whose seeds become dormant if they are exposed to high temperatures for a few days, alongside lines whose seeds do not become dormant when it is hot. “The different lines can thus display very different life cycles,” said de Meaux. “Some are very fast: they need no dormancy and have no requirement for cold before flowering. Others are slower: they have a high capacity to induce dormancy and require cold to flower. Such diversity across such a small area came as a surprise, but the most remarkable thing was to see that it covaried with the gradient of environmental disturbance in our streets.”

The scientists will be further investigating how environmental heterogeneity selects specific genetic variants of urban Arabidopsis thaliana plants in Cologne.

 

How “blue” and “green” appeared in a language that didn’t have words for them


People of a remote Amazonian society who learned Spanish as a second language began to interpret colors in a new way, an MIT study has found


Peer-Reviewed Publication

MASSACHUSETTS INSTITUTE OF TECHNOLOGY




CAMBRIDGE, MA -- The human eye can perceive about 1 million colors, but languages have far fewer words to describe those colors. So-called basic color terms, single color words used frequently by speakers of a given language, are often employed to gauge how languages differ in their handling of color. Languages spoken in industrialized nations such as the United States, for example, tend to have about a dozen basic color terms, while languages spoken by more isolated populations often have fewer.

However, the way that a language divides up color space can be influenced by contact with other languages, according to a new study from MIT.

Among members of the Tsimane’ society, who live in a remote part of the Bolivian Amazon rainforest, the researchers found that those who had learned Spanish as a second language began to classify colors into more words, making color distinctions that are not commonly used by Tsimane’ who are monolingual.

In the most striking finding, Tsimane’ who were bilingual began using two different words to describe blue and green, which monolingual Tsimane’ speakers do not typically do. And, instead of borrowing Spanish words for blue and green, they repurposed words from their own language to describe those colors.

“Learning a second language enables you to understand these concepts that you didn’t have in your first language,” says Edward Gibson, an MIT professor of brain and cognitive sciences and the senior author of the study. “What’s also interesting is they used their own Tsimane’ terms to start dividing up the color space more like Spanish does.”

The researchers also found that the bilingual Tsimane’ became more precise in describing colors such as yellow and red, which monolingual speakers tend to use to encompass many shades beyond what a Spanish or English speaker would include.

“It’s a great example of one of the main benefits of learning a second language, which is that you open a different worldview and different concepts that then you can import to your native language,” says Saima Malik-Moraleda, a graduate student in the Speech and Hearing Bioscience and Technology Program at Harvard University and the lead author of the study.

Kyle Mahowald, an assistant professor of linguistics at the University of Texas at Austin, and Bevil Conway, a senior investigator at the National Eye Institute, are also authors of the paper, which appears this week in Psychological Science.

Dividing up the color space

In English and many other languages of industrialized nations, there are basic color words corresponding to black, white, red, orange, yellow, green, blue, purple, brown, pink, and gray. South American Spanish additionally divides the blue space into light blue (“celeste”) and dark blue (“azul”).

Members of Tsimane’ society consistently use only three color words, which correspond to black, white, and red. There are also a handful of words that encompass many shades of yellow or brown, as well as two words that are used interchangeably to mean either green or blue. However, these words are not used by everyone in the population.

Several years ago, Gibson and others reported that in a study of more than 100 languages, including Tsimane’, speakers tend to divide the “warm” part of the color spectrum into more color words than the “cooler” regions, which include blue and green. In the Tsimane’ language, two words, “shandyes” and “yushñus,” are used interchangeably for any hue that falls within blue or green.

As a follow-up to that study, Malik-Moraleda wanted to explore whether learning a second language would have any effect on how the Tsimane’ use color words. Today, many Tsimane’ learn Bolivian Spanish as a second language.

Working with monolingual and bilingual members of the Tsimane’, the researchers asked people to perform two different tasks. For the bilingual population, they asked them to do the tasks twice, once in Tsimane’ and once in Spanish.

In the first task, the researchers showed the subjects 84 chips of different colors, one by one, and asked them what word they would use to describe the color. In the second task, the subjects were shown the entire set of chips and asked to group the chips by color word.

The researchers found that when performing this task in Spanish, the bilingual Tsimane’ classified colors into the traditional color words of the Spanish language. Additionally, the bilingual speakers were much more precise about naming colors when they performed the task in their native language.

“Remarkably, the bilinguals really divide up the space much more than the monolinguals, in spite of the fact that they’re still primarily Tsimane’ speakers,” Gibson says.

Strikingly, the bilingual Tsimane’ also began using separate words for blue and green, even though their native language does not distinguish those colors. Bilingual Tsimane’ speakers began to use “yushñus” exclusively to describe blue, and “shandyes” exclusively to describe green.

Borrowing concepts

The findings suggest that contact between languages can influence how people think about concepts such as color, the researchers say.

“It does seem like the concepts are being borrowed from Spanish,” Gibson says. “The bilingual speakers learn a different way to divide up the color space, which is pretty useful if you’re dealing with the industrialized world. It’s useful to be able to label colors that way, and somehow they import some of that into the Tsimane’ meaning space.”

While the researchers observed that the distinctions between blue and green appeared only in Tsimane’ who had learned Spanish, they say it’s possible that this usage could spread within the population so that monolingual Tsimane’ also start to use it. Another possibility, which they believe is more likely, is that more of the population will become bilingual, as they have more contact with the Spanish-speaking villages nearby.

“Over time, these populations tend to learn whatever the dominant outside language is because it’s valuable for getting jobs where you earn money,” Gibson says.

The researchers now hope to study whether other concepts, such as frames of reference for time, may spread from Spanish to Tsimane’ speakers who become bilingual. Malik-Moraleda also hopes to see if the color language findings from this study could be replicated in other remote populations, specifically, in the Gujjar, a nomadic community living in the Himalayan mountains in Kashmir.

###

The research was funded by a La Caixa Fellowship, the Dingwall Foundation, the Intramural Research Program of the National Eye Institute, and the National Science Foundation CompCog Program.

 

AS ABOVE, SO BELOW

Exploding stars


Search for witnesses of near-Earth astrophysical events

Peer-Reviewed Publication

HELMHOLTZ-ZENTRUM DRESDEN-ROSSENDORF

Image: Prof. Anton Wallner, Head of the HZDR Department "Accelerator Mass Spectrometry and Isotope Research", at the AMS facility of the Australian National University (ANU) in Canberra. HZDR physicist Prof. Anton Wallner is a specialist in the search for interstellar matter using accelerator mass spectrometry (AMS). Wallner and his colleagues in Australia are currently on the hunt for further cosmic isotopes: in Canberra he is searching for Fe-60 atoms, in Sydney for Pu-244 atoms. To this end, he has received a number of lunar samples from the US space agency NASA. Credit: ANU




When massive stars or other stellar objects explode in the Earth's cosmic neighborhood, ejected debris can also reach our solar system. Traces of such events are found on Earth or the Moon and can be detected using accelerator mass spectrometry, or AMS for short. An overview of this exciting research is provided in the scientific journal Annual Review of Nuclear and Particle Science (DOI: 10.1146/annurev-nucl-011823-045541) by Prof. Anton Wallner of the Helmholtz-Zentrum Dresden-Rossendorf (HZDR), who soon plans to decisively advance this promising branch of research with the new, ultrasensitive AMS facility "HAMSTER."

In their paper, HZDR physicist Anton Wallner and colleague Prof. Brian D. Fields from the University of Illinois in Urbana, USA, provide an overview of near-Earth cosmic explosions, with a particular focus on events that occurred around three million and seven million years ago.

“Fortunately, these events were still far enough away, so they probably did not significantly impact the Earth's climate or have major effects on the biosphere. However, things get really uncomfortable when cosmic explosions occur at a distance of 30 light-years or less,” Wallner explains. Converted into the astronomical unit of distance, the parsec, this corresponds to roughly nine parsecs or less.

Once massive stars have burned up all their fuel, their cores collapse into an ultra-dense neutron star or a black hole, while at the same time, hot gas is ejected outward at high velocity. A large part of the gas and dust finely dispersed between the stars is carried away by an expanding shock wave. Like a giant balloon with bumps and dents, this envelope also sweeps up any material already present in space. After many thousands of years, the remnants of a supernova have expanded to a diameter of several tens of parsecs, spreading out ever more slowly until the motion finally ceases.

A nearby explosion has the potential to severely disrupt the Earth's biosphere and cause a mass extinction similar to the asteroid impact 66 million years ago. The dinosaurs and many other animal species fell victim to that event. "If we consider the time period since the solar system's formation, which spans billions of years, very close cosmic explosions cannot be ruled out," Wallner emphasizes.

However, supernovae occur only in very massive stars with more than eight to ten times the mass of our sun. Such stars are rare. One of the closest candidates of this size is the red supergiant Betelgeuse in the constellation Orion, located at a safe distance of about 150 parsecs from our solar system.

Production of interstellar isotopes

Many new atoms are generated during cosmic explosions, or shortly before and during a supernova – among them a number of radioactive atoms. Wallner is particularly interested in the radioactive iron isotope with atomic mass 60, called iron-60 for short, which has a half-life of 2.6 million years: after that time, about half of the atoms have turned into a stable nickel isotope. Therefore, all of the iron-60 that was present at the Earth's formation some 4,500 million years ago has long since disappeared.

“Iron-60 is extremely rare on Earth because, by natural means, it is not produced in any significant amount. However, it is produced in large quantities just before a supernova takes place. If this isotope now turns up in sediments from the ocean floor or in material from the surface of the Moon, it probably came from a supernova or another similar process in space that took place near Earth only a few million years ago,” Wallner summarizes.

The same applies to the plutonium isotope with atomic mass 244. However, this plutonium-244 is more likely generated by the collision of neutron stars than by supernovae. Thus, it is an indicator of the nucleosynthesis of heavy elements. Plutonium-244 has a half-life of 80 million years: after that period, about half of the atoms have turned into other elements. The slowly decaying plutonium-244 is therefore, in addition to iron-60, another indicator of galactic events and of the production of new elements over the last few million years.
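The arithmetic behind both lifetimes is the ordinary exponential-decay law; the minimal sketch below simply plugs the half-lives quoted above into it, with illustrative sample ages.

```python
# Minimal sketch of the decay arithmetic, using the half-lives quoted above
# (2.6 million years for iron-60, 80 million years for plutonium-244).

def fraction_remaining(age_myr, half_life_myr):
    """Fraction of the original atoms still present after age_myr million years."""
    return 0.5 ** (age_myr / half_life_myr)

print(fraction_remaining(2.6, 2.6))    # iron-60 after one half-life: 0.5
print(fraction_remaining(4500, 2.6))   # primordial iron-60 today: effectively zero (underflows to 0.0)
print(fraction_remaining(3, 80))       # plutonium-244 from an event ~3 million years ago: ~0.97
```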

"Exactly how often, where, and under what conditions these heavy elements are produced is currently the subject of intense scientific debate. Plutonium-244 also requires explosive events and, according to theory, is produced similarly to the elements gold or platinum, which have always occurred naturally on Earth but consist of stable atoms today," Wallner explains.

Dust particles as cosmic cargo vessels

But how do these isotopes get to Earth in the first place? The iron-60 atoms ejected by the supernova like to congregate in dust particles. So do the plutonium-244 isotopes, which were possibly created in other events and swept up by the supernova's expanding envelope. After cosmic explosions at a distance of more than ten but less than 150 parsecs, according to theory, the solar wind and the magnetic field of the heliosphere prevent individual atoms from reaching the Earth. However, the iron-60 and plutonium-244 atoms trapped in dust particles continue to fly toward the Earth and the Moon, where they can eventually trickle down to the surface.

Even with a supernova occurring within the so-called "kill radius" of less than ten parsecs, not even a microgram of matter from the envelope will land on each square centimeter. In fact, only very few iron-60 atoms per square centimeter reach the Earth each year. This poses an enormous challenge to "investigators" like physicist Anton Wallner: within a one-gram sediment sample, perhaps a few thousand iron-60 atoms are distributed like needles in a haystack among billions of billions of the ubiquitous, stable iron atoms with atomic mass 56. On top of that, even the most sensitive measurement method detects only about every five-thousandth particle, i.e., at most a few iron-60 atoms in a typical measurement sample.
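A back-of-the-envelope version of that needle-in-a-haystack estimate, using the figures from this paragraph together with an assumed amount of processed sediment, looks like this:

```python
# Rough estimate only; the per-gram atom count and the sample mass are assumed,
# illustrative values taken from the description above, not measured ones.

fe60_atoms_per_gram = 3_000          # "perhaps a few thousand" iron-60 atoms per gram of sediment
sample_mass_g = 5                    # assumed mass of sediment processed for one measurement
detection_efficiency = 1 / 5_000     # "only every five thousandth particle" is detected

expected_detections = fe60_atoms_per_gram * sample_mass_g * detection_efficiency
print(f"Expected iron-60 detections: {expected_detections:.1f}")   # about 3 atoms
```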

Such extremely low concentrations can only be determined with accelerator mass spectrometry (AMS). One of these facilities, the Dresden AMS (DREAMS), is located at the HZDR; it will soon be joined by the Helmholtz Accelerator Mass Spectrometer Tracing Environmental Radionuclides (HAMSTER). Since AMS facilities around the globe are designed differently, the various facilities can complement each other in the search for rare isotopes from supernova explosions.

20 years for just one thousand iron-60 atoms

Isotopes of the same element but with a different mass, like the naturally occurring iron-56, are removed with mass filters. Atoms of other elements with the same mass as the target isotope iron-60, for example the naturally occurring nickel-60, also interfere. Even after very complex chemical preparation of the samples, these interfering atoms are still billions of times more abundant than iron-60 and must be separated in a special accelerator facility using nuclear physics methods.

In the end, perhaps five individual iron-60 atoms are identified in a measuring process that lasts several hours. Pioneering work on iron-60 detection was conducted at TU Munich. At present, however, the facility at the Australian National University in Canberra is the only one worldwide that is sensitive enough to perform such measurements.

In total, only about one thousand iron-60 atoms have been measured over the past 20 years. For interstellar plutonium-244, which occurs in concentrations more than 10,000 times lower, only data on individual atoms were available for a long time. Only recently has it become possible to detect about a hundred plutonium-244 atoms at a specialized facility in Sydney, one similar to the HAMSTER facility currently under development at the HZDR.

However, only certain samples are suitable for investigation: those that act as archives, preserving these atoms from space for millions of years. Samples from the Earth's surface, for example, are rapidly "diluted" by geological processes. Sediments and crusts from the deep sea, which slowly form undisturbed on the ocean floor, are ideal. Samples from the lunar surface are also suitable, because disruptive processes there are hardly a problem.

On a research trip lasting until the beginning of November 2023, Wallner and his colleagues will hunt for further cosmic isotopes at particularly suitable AMS facilities in the Australian cities of Canberra (iron-60) and Sydney (plutonium-244). For this purpose, he has received a number of lunar samples from the U.S. space agency NASA.

"Parallel measurements are also taking place at HZDR. These unique samples will allow us to gain new insights into supernova explosions near Earth, but also into the heaviest elements in our galaxy which are formed through these and other processes," Wallner is certain.


Image: Ferromanganese crust from the Pacific Ocean

 

Practicing mindfulness can help people make heart-healthy eating choices


A study led by Brown University researchers found that participants in a mindfulness-based blood pressure reduction program improved health behaviors that lower blood pressure


Peer-Reviewed Publication

BROWN UNIVERSITY




PROVIDENCE, R.I. [Brown University] — Practicing mindfulness focused on healthy eating can be good for the heart, a new study shows, because it improves self-awareness and helps people stick to a heart-healthy diet.

When people who had elevated blood pressure participated in an eight-week mindfulness-based blood pressure reduction program for the study, they significantly improved their scores on measures of self-awareness and adherence to a heart-healthy diet compared to a control group. The results were published in JAMA Network Open.

“Participants in the program showed significant improvement in adherence to a heart-healthy diet, which is one of the biggest drivers of blood pressure, as well as significant improvements in self-awareness, which appears to influence healthy eating habits,” said lead study author Eric B. Loucks, an associate professor of epidemiology, behavioral and social sciences, and director of the Mindfulness Center at Brown University.

Loucks said the study helps explain the mechanism by which a customized mindfulness training program adapted toward improving diet can affect blood pressure.

“Improvements in our self-awareness, of how different foods make us feel, of how our body feels in general, as well as our thoughts, emotions and physical sensations around eating healthy as well as unhealthy food, can influence people’s dietary choices,” he said.

High blood pressure, a major cause of cardiovascular disease, is the single most important risk factor for early death worldwide, leading to an estimated 10.8 million avoidable deaths every year, according to a recent report by the World Health Organization. The important thing to note about those avoidable deaths, Loucks said, is that there is ample research supporting effective strategies to control and prevent hypertension.

“Almost everyone has the power to control blood pressure through changes in diet and physical activity, adherence to antihypertensive medications, minimizing alcohol intake and monitoring stress reactivity,” he said.

A heart-focused mindfulness program

The mindfulness-based blood pressure reduction (MB-BP) program used in the study, which Loucks developed in 2014, trains participants in skills such as meditation, yoga, self-awareness, attention control and emotion regulation. What makes the program unique, he said, is that participants learn how to direct those skills toward behaviors known to lower blood pressure.

The MB-BP plan consisted of a group orientation session, eight 2.5-hour weekly group sessions and one day-long retreat, as well as recommended home practice for 45 minutes, six days a week. The program was led by trained instructors with expertise in cardiovascular disease etiology, treatment and prevention. Classes were held in Providence, R.I., at Brown University and at a health center in a lower-income, urban neighborhood.

The study compared two groups, totaling 201 participants. The 101 people in the test group took part in the 8-week MB-BP program, which included personalized feedback and education about hypertension risk factors; mindfulness training in relation to hypertension risk factors (including mindful eating); and behavior change support. The “usual care” control group received educational brochures on controlling high blood pressure. Both groups received a home blood-pressure monitoring device with usage training, as well as options for referral to primary care physicians.

The researchers focused on participant adherence to the DASH (Dietary Approaches to Stop Hypertension) program, a balanced eating plan rich in fruits, vegetables, whole grains and low-fat dairy, intended to create a heart-healthy eating style for life. Despite its effectiveness, adherence to the DASH diet is typically low.

After six months, the mindfulness group showed a 0.34-point improvement in the DASH diet score. Loucks explained that an effect of this size is roughly equivalent to a participant shifting from a vegetable intake approaching recommended levels (2-3 servings) to recommended levels (at least 4 servings), or making a similar shift in another component of the DASH score. The control group showed a 0.04-point decline in its DASH diet score.

The mindfulness group also showed a 0.71-point improvement in its average score for interoceptive awareness (the process of sensing and interpreting signals from one's own body) compared with six months earlier, outperforming the control group by a significant 0.54 points.

The authors said the trial results offer evidence that an adapted mindfulness training program for participants with high blood pressure that targets diet and self-awareness significantly improves both.

“The program gives participants the tools to make heart-healthy diet changes that can lower their blood pressure and decrease their risk of cardiovascular disease,” Loucks said.

The researchers are studying different “doses” of the program (for example, shorter program lengths, fewer sessions), as well as factors influencing the implementation of the MB-BP plan in a real-world setting — including eligibility for health insurance coverage, accessibility for different patient groups and flexibility for physicians.

Additional contributors from Brown University included Frances Saadeh, Matthew Scarpaci, Jeffrey Proulx, Roee Gutman and Willoughby Britton. The study was supported by the National Institutes of Health Science of Behavior Change Common Fund Program through an award administered by the National Center for Complementary and Integrative Health (UH2AT009145, UH3AT009145).