Saturday, November 26, 2022

Great potential for aquifer thermal energy storage systems

Low-temperature aquifer thermal energy storage systems enable climate-friendly heating and cooling – KIT study reveals potential for Germany

Peer-Reviewed Publication

KARLSRUHER INSTITUT FÜR TECHNOLOGIE (KIT)

Aquifer thermal energy storage systems 

IMAGE: COOLING IN SUMMER (LEFT) AND HEATING IN WINTER: AQUIFER THERMAL ENERGY STORAGE SYSTEMS, I.E. UNDERGROUND WATER-BEARING LAYERS, ARE SUITED FOR THIS PURPOSE. (GRAPHICS: RUBEN STEMMLE, AGW/KIT)

CREDIT: RUBEN STEMMLE, AGW/KIT

Aquifer thermal energy storage systems can contribute substantially to climate-friendly heating and cooling of buildings: Heated water is stored underground and pumped up when needed. Researchers at Karlsruhe Institute of Technology (KIT) have now found that low-temperature aquifer thermal energy storage has great potential in Germany. This potential is expected to grow in the future due to climate change. The study includes the most detailed map to date of potential aquifer storage systems in Germany. The results are reported in Geothermal Energy. (DOI: 10.1186/s40517-022-00234-2)

More than 30 percent of the energy currently consumed in Germany is used for heating and cooling buildings. Decarbonizing this sector could therefore deliver major greenhouse gas emission reductions and contribute substantially to climate protection. Aquifer thermal energy storage systems, i.e. water-bearing layers in the underground, are well suited for the seasonal storage and flexible use of heat and cold. Water has a high capacity for storing thermal energy, and the surrounding rocks have an insulating effect. Underground aquifer thermal energy storage systems are accessed by boreholes and used to store heat from solar thermal plants or waste heat from industrial facilities. If required, the heat can be pumped up again. Such storage systems can be combined perfectly with heat networks and heat pumps. Near-surface low-temperature aquifer thermal energy storage systems (LT-ATES) have proved to be particularly efficient: as the water temperature is not much higher than that of the environment, little heat is lost during storage.
 

More than Half of the German Territory Is Well or Very Well Suited

Researchers from KIT’s Institute of Applied Geosciences (AGW) and the Sustainable Geoenergy Junior Research Group have now identified the regions suited for low-temperature aquifer thermal energy storage in Germany. “Criteria for an efficient LT-ATES operation include favorable hydrogeological conditions, such as the productivity of groundwater resources and groundwater flow velocity,” Ruben Stemmle explains. The member of AGW’s Engineering Geology Group and first author of the study adds: “Moreover, energy consumption for heating and cooling must be balanced. It can be approximated by the ratio of heating and cooling degree days.”
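The heating-to-cooling balance Stemmle mentions can be approximated from daily mean temperatures via degree days. The sketch below is purely illustrative: the base temperatures, the function name, and the toy data are assumptions, not values from the study.

```python
# Illustrative degree-day calculation. A heating degree day accrues when the
# daily mean temperature falls below a heating base; a cooling degree day
# accrues when it rises above a cooling base. The bases here (15 degC heating,
# 18.3 degC cooling) are common conventions, not the study's thresholds.

def degree_day_ratio(daily_mean_temps, heating_base=15.0, cooling_base=18.3):
    """Return (heating degree days, cooling degree days, their ratio)."""
    hdd = sum(max(heating_base - t, 0.0) for t in daily_mean_temps)
    cdd = sum(max(t - cooling_base, 0.0) for t in daily_mean_temps)
    ratio = hdd / cdd if cdd else float("inf")
    return hdd, cdd, ratio

# A toy year: 90 cold days, 90 mild days, 90 warm days, 95 cool days.
temps = [0.0] * 90 + [10.0] * 90 + [25.0] * 90 + [12.0] * 95
hdd, cdd, ratio = degree_day_ratio(temps)
```

A ratio well above 1 indicates a heating-dominated climate; values near 1 indicate the balanced heating and cooling demand that favors efficient LT-ATES operation.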

The researchers combined hydrogeological and climate criteria in a spatial analysis. They found that 54 percent of the German territory will be very well or well suited for LT-ATES in the coming decades. These potentials are largely concentrated in the North German Basin, the Upper Rhine Graben, and the South German Molasse Basin. The corresponding map was generated with the help of a geographic information system (GIS) and a multi-criteria decision analysis.
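A GIS multi-criteria decision analysis of this kind can be sketched as a weighted overlay: each map cell gets a rating per criterion, and the ratings are combined into one suitability score. The criteria names, weights, and rating scale below are invented for illustration and are not taken from the study.

```python
# Illustrative weighted-overlay suitability scoring. Each cell of a raster
# map is rated 1 (poor) to 5 (very good) per criterion; the weighted sum
# gives the cell's overall LT-ATES suitability. Weights are hypothetical.

weights = {
    "groundwater_productivity": 0.4,
    "flow_velocity": 0.3,
    "heating_cooling_balance": 0.3,
}

def suitability(ratings, weights):
    """Weighted sum of per-criterion ratings for one map cell."""
    return sum(weights[k] * ratings[k] for k in weights)

cell = {
    "groundwater_productivity": 5,  # highly productive aquifer
    "flow_velocity": 4,             # low groundwater flow velocity
    "heating_cooling_balance": 3,   # moderately balanced demand
}
score = suitability(cell, weights)  # 0.4*5 + 0.3*4 + 0.3*3 = 4.1
```

In a real GIS workflow this scoring would run over every raster cell, with exclusion masks (e.g. water protection zones) applied afterwards.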
 

Climate Change Will Enhance the Potential of Aquifer Storage 

According to the study, the areas well or very well suited for LT-ATES will presumably increase by 13 percent for the period from 2071 to 2100. The large increase in very well-suited regions is attributed to a growing cooling demand in the future, i.e. it will be due to climate change. However, the use of aquifer storage systems is largely restricted in water protection zones, which reduces the very well or well-suited areas by around eleven percent. “Still, our study reveals that Germany has a high potential for seasonal heat and cold storage in aquifers,” Stemmle says.
 

Original Publication (Open Access)

Ruben Stemmle, Vanessa Hammer, Philipp Blum and Kathrin Menberg: Potential of low‑temperature aquifer thermal energy storage (LT‑ATES) in Germany. Geothermal Energy, 2022. DOI: 10.1186/s40517-022-00234-2 

https://geothermal-energy-journal.springeropen.com/articles/10.1186/s40517-022-00234-2
 

More about the KIT Energy Center: https://www.energie.kit.edu
 

Contact for this press release:

Dr. Martin Heidelberger, Press Officer, Phone: +49 721 608-41169, martin heidelberger∂kit edu
 

Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility, and information. For this, about 9,800 employees cooperate in a broad range of disciplines in the natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 22,300 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.

This press release is available on the internet at http://www.kit.edu/kit/english/press_releases.php

High potential for the use of low-temperature aquifer thermal energy storage (IMAGE)

KARLSRUHER INSTITUT FÜR TECHNOLOGIE (KIT)

Breaking nitrogen while generating methane

Insights into a “hot” microbe that can grow on nitrogen while producing methane

Peer-Reviewed Publication

MAX PLANCK INSTITUTE FOR MARINE MICROBIOLOGY

The cultures of microbes 

IMAGE: CLOSE VIEW OF METHANOTHERMOCOCCUS THERMOLITHOTROPHICUS CULTURE UNDER THE MICROSCOPE (LEFT) AND IN A CULTURE FLASK (MIDDLE AND RIGHT). THESE CELLS GROW ON NITROGEN GAS AS THE ONLY SOURCE OF NITROGEN. IF THERE IS NO NITROGEN GAS PRESENT, NOTHING GROWS (RIGHT).

CREDIT: MAX PLANCK INSTITUTE FOR MARINE MICROBIOLOGY

Carbon and nitrogen are essential elements of life. Some organisms occupy key positions in the cycling of both of them – among them Methanothermococcus thermolithotrophicus. Behind the complicated name hides a complicated microbe. M. thermolithotrophicus is a marine heat-loving methanogen. It lives in ocean sediments, from sandy coasts and salty marshes to the deep sea, preferably at temperatures around 65 °C. It is able to turn nitrogen (N2) and carbon dioxide (CO2) into ammonia (NH3) and methane (CH4) by using hydrogen (H2). Both products, ammonia and methane, are very interesting for biotechnological applications in fertilizer and biofuel production.

Tristan Wagner and Nevena Maslać from the Max Planck Institute for Marine Microbiology have now managed to grow this microbe in a fermenter – a challenging endeavour. “It is very complicated to provide the perfect conditions for this microbe to thrive while fixing N2 – high temperatures, no oxygen and keeping an eye on hydrogen and carbon dioxide levels”, says Maslać, who carried out the research as part of her PhD project. “But with some ingenuity and perseverance, we managed to make them thrive in our lab and reach the highest cell densities reported so far.” Once the cultures were up and running, the scientists were able to investigate the physiology of the microbe in detail, and later deepened their study by looking at how the microbe’s metabolism adapts to N2 fixation. “In close collaboration with our colleagues Chandni Sidhu and Hanno Teeling, we combined physiological tests and differential transcriptomics, which allowed us to dig deeper into the metabolism of M. thermolithotrophicus”, Maslać explains.

As improbable as a bumblebee

The metabolic abilities of M. thermolithotrophicus are puzzling: These microbes use methanogenesis, a metabolism that originated on the early anoxic Earth, to acquire their cellular energy. Compared to humans, who use oxygen to transform glucose into carbon dioxide, methanogens obtain only a very limited amount of energy from methanogenesis. Paradoxically, fixing nitrogen requires gigantic amounts of energy, which would exhaust them. “They are a bit like bumblebees, which are theoretically too heavy to fly but obviously do so nevertheless”, says senior author Tristan Wagner, group leader of the Max Planck Research Group Microbial Metabolism. “Despite such energy limitation, these fascinating microbes have even been found to be the prime nitrogen fixers in some environments.”

A robust nitrogenase

The enzyme that organisms use to fix nitrogen is called nitrogenase. The most common nitrogenases require molybdenum to perform the reaction. Molybdenum nitrogenase is well studied in bacteria living as symbionts in plant roots; their nitrogenase can be inhibited by tungstate. Surprisingly, the Bremen scientists found that M. thermolithotrophicus is not disturbed by tungstate while growing on N2. “Our microbe was only dependent on molybdenum to fix N2 and not bothered by tungstate, which implies an adaptation of metal-acquisition systems, making it even more robust for different potential applications”, says Maslać.

Rethinking ammonia production

Nitrogen fixation, i.e., gaining nitrogen from N2, is the major process for bringing nitrogen into the biological cycle. In industrial fertilizer production this is carried out via the Haber-Bosch process, which artificially fixes nitrogen with hydrogen to produce ammonia under high temperatures and pressures. It is used to produce most of the world’s ammonia, an essential fertilizer for sustaining global agriculture. The Haber-Bosch process is extremely energy-demanding: it consumes 2% of the world’s energy output while releasing up to 1.4% of global carbon emissions. Thus, people are looking for more sustainable alternatives to produce ammonia. “The process used by M. thermolithotrophicus shows that out there in the microbial world there are still solutions that might allow for a more efficient production of ammonia, and that they can even be combined with biofuel production through methane”, says Wagner. “With this study, we understood that under N2-fixing conditions, the methanogen sacrifices its production of proteins to favor nitrogen capture, a particularly smart strategy of energy reallocation”, Wagner sums up. “Our next step will be to move into the molecular details of the process and the enzymes involved, as well as to look into other parts of the organism’s metabolism.”

Three days to help save our coastal habitats

A global gathering of marine scientists has convened a three-day symposium to work out how we can maximise the many life- and planet-protecting services we as humans receive from our coastal habitats

Meeting Announcement

UNIVERSITY OF PORTSMOUTH


The loss of biodiversity and effects of climate change are impacting the health of the whole planet. Climate change, pollution, overfishing and shipping are just some of the issues that have led to a worrying decline in the health of the seas that surround us. 

On 22 November scientists will gather at a specially convened meeting in London, brought together by the University of Portsmouth and the Zoological Society of London, to take a new seascape approach to restoring our coastal resources and habitats. 

For the first two days, experts working in restoration and research across different coastal habitats will share knowledge and expertise to create a holistic picture of the situation, joining the dots between shared challenges and novel research findings. On the third day, scientists will bring together the evidence for how habitats are connected and what benefits this brings to solving the biodiversity and climate crises. Their findings and possible solutions will be published in an open access paper. 

Conference organiser, Dr Joanne Preston from the School of Biological Science at the University of Portsmouth, says:  “The aim is to supercharge coastal restoration across temperate regions. By bringing together these experts we hope to move current conversations away from single species restoration. By taking a systems level approach that considers all the interactions and feedbacks that occur between healthy coastal habitats, we can move towards a seascape approach to nature restoration.

“For total habitat restoration we need to work collaboratively and think about the whole system as one seascape. Only then, can we start to understand more about the connectivity between different habitats and species such as saltmarsh, oysters, seagrass and kelp – all of which are in critical decline, and find ways to restore the entire system rather than single entities.”

The event will bring experts together to unlock the evidence needed to drive a joined-up approach to habitat restoration. This is important because in marine environments everything is connected by the three-dimensional body of seawater. The connectivity of these habitats is vital for restoring food webs and repairing degraded and fragmented individual habitats.

Coastal environments have been badly damaged with nearly all oyster reefs destroyed and over half the seagrass and saltmarsh beds gone. Experts hope that by trying to understand how each environment is connected they can plan to improve resilience. As well as individual habitats recovering there will be a knock-on benefit to the entire seascape - such as increasing fish populations, nitrate removal, biodiversity, carbon drawdown, and a decrease in erosion.

The Symposium is being held in person at the Zoological Society of London on 22-24 November 2022. 

More details can be found here.  


Satellites cast critical eye on coastal dead zones

Peer-Reviewed Publication

MICHIGAN STATE UNIVERSITY

Algae blooms begin hypoxia cycle 

IMAGE: ALGAE BLOOM ON A CHINESE BEACH IS REPRESENTATIVE OF THE EXPANSIVE ALGAE BLOOMS WHICH ULTIMATELY LEACH OXYGEN FROM THE WATER, CREATING DEAD ZONES.

CREDIT: RUISHAN CHEN, MICHIGAN STATE UNIVERSITY

A dead zone in the ocean is as bad as it sounds. Being clueless about a dead zone’s scope and path is worse. Scientists at Michigan State University (MSU) have developed a bird’s-eye method to predict where, when and how long dead zones persist across large coastal regions.

“Understanding where these dead zones are and how they may change over time is the first crucial step to mitigating these critical problems. But that is not easy with traditional methods, especially for large-scale monitoring efforts,” said Yingjie Li, who did the work as a PhD student at MSU’s Center for Systems Integration and Sustainability (CSIS). He is currently a postdoctoral researcher at Stanford University.

Dead zones – technically known as hypoxia – are water bodies degraded to the point where aquatic life cannot survive because of low oxygen levels. They are a problem mainly in coastal areas where fertilizer runoff feeds algae blooms, which then die, sink to the bottom and decay. That decay consumes the oxygen dissolved in the water, suffocating fish and the other organisms that make up vibrant waters.

Dead zones can be hard to identify and track and have usually been observed through water sampling. But as reported in Remote Sensing of Environment, the scientists have figured out a novel way to use satellite views to understand what’s happening deep below the ocean’s surface. They used the Gulf of Mexico at the mouth of the Mississippi River as a demonstration site.

The group supplemented water sampling data with different ways of using satellite views over time. In addition to predicting the size of hypoxic zones, the study provides information in greater detail on where, when, and how long hypoxic zones persist, and enables modeling them in near-real time.

Since 1995, at least 500 coastal dead zones have been reported, covering a combined area larger than the United Kingdom and endangering fisheries, recreation and the overall health of the seas. Climate change is likely to exacerbate hypoxia.

The group notes the need to initiate a global coast observatory network to synthesize and share data for better understanding, predicting and communicating the changing coasts. Currently, such data is difficult to come by. And the stakes are high, as fertilizer applied to a field can become runoff that reaches a body of water miles away. The group points out that the telecoupling framework, which enables understanding of human and natural interactions near and far, would be useful for seeing the big picture of the problem.

“Damages to our coastal waters are a telecoupling problem that spans far beyond the dead zones – distant places that apply excessive fertilizers for food production and even more distant places that demand food. Thus, it’s critical we take a holistic view while employing new methods to gain a true understanding,” said Jianguo “Jack” Liu, MSU Rachel Carson Chair in Sustainability and CSIS director.

Besides Li and Liu, “Satellite prediction of coastal hypoxia in the northern Gulf of Mexico” was written by Drs. Samuel Robinson and Lan Nguyen from the University of Calgary. The work was funded by the National Science Foundation and the Environmental Science and Policy Summer Research Fellowship.

This map shows hypoxia recurrence measured by the percentage of time with hypoxia detected during the summertime in 2014.


CREDIT

Michigan State University Center for Systems Integration and Sustainability.

Development of the next generation of microscopes

Peer-Reviewed Publication

UIT THE ARCTIC UNIVERSITY OF NORWAY

Florian Ströhl 

IMAGE: FLORIAN STRÖHL INITIALLY GOT THE IDEA A FEW YEARS AGO. "I BUILT THE VERY FIRST PROTOTYPE BY HAND. IT TOOK FOREVER, SINCE I HAD TO DO EVERYTHING MANUALLY. THEREFORE, I WAS VERY RELIEVED WHEN IT YIELDED RESULTS AND PROVIDED GREAT IMAGES," SAYS STRÖHL.

CREDIT: KJETIL RYDLAND/UIT

To observe living cells through a microscope, a sample is usually squeezed onto a glass slide. It then lies there calmly and the cells are observable. The disadvantage is that this limits how the cells behave and it only produces two-dimensional images. 

Researchers from UiT The Arctic University of Norway and the University Hospital of North Norway (UNN) have now developed what they refer to as the next generation of microscopes. The new technology can take pictures of much larger samples than before, while the cells live and move in a more natural environment. 

A major development

The technology provides 3D images in which researchers can study the smallest details from several angles, sorted into different layers, with every layer in focus.

3D microscopes do already exist, but they work slowly and give poorer results. The most common type records pixel after pixel in series, which are then assembled into a 3D image. This takes time, and often they can't handle more than 1-5 shots a minute, which is not very practical if what you're photographing moves. 

“With our technology, we can manage around 100 full frames per second.  And we believe it is possible to increase this number. This is just what we have demonstrated with our prototype," says Florian Ströhl, researcher at UiT. 

The new microscope is a so-called multifocus microscope, which provides completely clear images, sorted into different layers, where you can study the cells from all angles.

“It’s a big deal. The fact that we manage to get all this in one take, it is a huge development," says Ströhl. 

Can see behind objects

Ströhl explains that this is not 3D in the form most of us know it. 
While a traditional 3D image lets you perceive some kind of depth, the new technology also lets you see behind objects.

Ströhl uses an example where you see a jungle scene in 3D at the cinema.

“In a normal 3D image, you can see that the forest has a depth, that some leaves and trees are closer than others. With the same technology used in our new 3D microscope, you are also able to see the tiger hiding behind the bushes. You are able to see and study several layers independently," says Ströhl. 

Granted, you do not use a microscope to look for tigers in the jungle, but for researchers this can be an important tool when looking for answers in the minutest details. 

Studying heart cells – while they beat

Ströhl has collaborated with researchers and doctors from the University Hospital of North Norway (UNN) in the development of this technology. 

Among other things, they work to understand and develop better treatment methods for various heart diseases. 

Studying a living human heart is challenging, both for technical reasons and not least for ethical reasons. Thus, researchers have used stem cells that are manipulated so that they mimic heart cells. 
In this way, they can grow organic tissue that behaves as it would in a human heart, and they can study and test this tissue to understand more about what is happening.

This tissue is almost like a small lump of live meat, about 1 cm in size. It makes for a very demanding test situation: the heart cells beat and are in constant motion, and the sample is too large to study with traditional microscopes.
The new microscope handles this well.

“You have this pumping lump of meat in a bowl, which you want to take microscope pictures of. You want to view the very smallest parts of it, and you want super high resolution. We have achieved this with the new microscope," says Ströhl. 

Formula 1 division

Kenneth Bowitz Larsen heads a large laboratory with advanced microscopes that are used by all the research groups at the Faculty of Health at UiT. 
He has tested this new microscope, and is optimistic.

"The concept is brilliant, the microscope they have built does things that the commercial systems do not," Larsen explains. 
The laboratory he heads mainly uses commercial microscopes from suppliers such as Zeiss, Nikon, etc. 

“Then we also collaborate with research groups like the one Florian Ströhl represents. They build microscopes and test optical concepts, they are in a way like the formula 1 division of microscopy," Larsen says.
Larsen has great faith in the new microscope Ströhl has created.

The commercial microscopes must be usable for all kinds of possible samples, while the microscope Ströhl has developed is more tailored to a specific task. 

“It is very light-sensitive, and it can depict the specimen at various focal depths. It can work its way through the sample and you can view both high and low. And it happens so fast that it can practically be seen in real time. It's an extremely fast microscope," Larsen says.

According to Larsen, the tests so far show that this works well, and he believes this type of microscope can eventually be used on all types of samples where you look at living things that move.

He also sees another advantage with the speed of this microscope.

“Bright lights are not kind to cells. Since this microscope is so fast, it exposes the cells to much shorter illumination and is therefore more gentle," he explains.

The technology is patented

The prototype of the microscope works and is operational. The researchers are currently working on creating an upgraded version that is easier to use, so that more people are able to operate and use the microscope. 

The researchers have also applied for a patent and are also looking for industrial partners who will develop this into a microscope that will be available for sale.

In the meantime, the prototype will be made available to local partners who can benefit from the new technology.

"We will also offer it to others in Norway, if they have particularly demanding samples that they want examined," says Ströhl.


Lung cancer screening dramatically increases long-term survival rate

Reports and Proceedings

THE MOUNT SINAI HOSPITAL / MOUNT SINAI SCHOOL OF MEDICINE

New York, NY (November 22, 2022) — Diagnosing early-stage lung cancer with low-dose computed tomography (CT) screening drastically improves the survival rate of cancer patients over a 20-year period, according to a large-scale international study being presented by Mount Sinai researchers at the annual meeting of the Radiological Society of North America.

The results show that patients diagnosed with lung cancer at an early stage via CT screening have a 20-year survival rate of 80 percent. The average five-year survival rate for all lung cancer patients is 18.6 percent because only 16 percent of lung cancers are diagnosed at an early stage. More than half of people with lung cancer die within one year of being diagnosed, making it the leading cause of cancer deaths. By the time symptoms appear, it is often too late.

The findings are the latest to demonstrate the importance of routine and early screening in detecting cancers when they are small enough to be cured by surgical removal. Unfortunately, fewer than 6 percent of the people eligible for screening get it.

“While screening doesn’t prevent cancers from occurring, it is an important tool in identifying lung cancers in their early stage when they can be surgically removed,” said the study’s lead author, Claudia Henschke, PhD, MD, Professor of Diagnostic, Molecular and Interventional Radiology and Director of the Early Lung and Cardiac Action Program at the Icahn School of Medicine at Mount Sinai in New York. “Ultimately, anyone interested in being screened needs to know that if they are unfortunate enough to develop lung cancer, it can be cured if found early.”

While treatments of more advanced cancers with targeted therapy and immunotherapy have come a long way, the best tool against lung cancer deaths is early diagnosis through low-dose CT screening before symptoms occur, according to the study authors.

“Symptoms occur mainly in late-stage lung cancer,” Dr. Henschke said. “Thus, the best way to find early-stage lung cancer is by enrolling in an annual screening program.”

The U.S. Preventive Services Task Force recommends annual lung cancer screening with low-dose CT in adults aged 50 to 80 years who have a 20 pack-year smoking history, which equals at least a pack a day for 20 years, and who currently smoke or have quit within the past 15 years.
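The pack-year arithmetic behind the recommendation is straightforward. Below is a minimal sketch of the eligibility rule as stated above; the helper name and argument names are illustrative, not from the task force text.

```python
# Sketch of the USPSTF screening criteria as described in this release:
# age 50-80, at least a 20 pack-year smoking history, and currently
# smoking or having quit within the past 15 years.

def eligible_for_ct_screening(age, packs_per_day, years_smoked,
                              years_since_quit=0):
    """Return True if the stated screening criteria are met.

    A pack-year is one pack a day for one year, so one pack a day
    for 20 years equals 20 pack-years.
    """
    pack_years = packs_per_day * years_smoked
    return (50 <= age <= 80
            and pack_years >= 20
            and years_since_quit <= 15)

print(eligible_for_ct_screening(age=62, packs_per_day=1.0, years_smoked=20))  # True
print(eligible_for_ct_screening(age=45, packs_per_day=2.0, years_smoked=20))  # False: too young
```

Note that half a pack a day for 20 years yields only 10 pack-years, which falls short of the threshold even within the eligible age range.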

The study tracked the 20-year survival rate of 1,285 patients who were screened in the International Early Lung Cancer Action Program (I-ELCAP) and who were later diagnosed with early-stage lung cancer. While the overall survival of the participants was 80 percent, the survival rate for the 139 participants with nonsolid cancerous lung nodules and the 155 participants with nodules that had a partly solid consistency was 100 percent. For the 991 participants with solid nodules, the survival rate was 73 percent. For participants with Stage 1A cancers that measured 10 mm or less, the 20-year survival rate was 92 percent.

Dr. Henschke and colleagues have been studying the effectiveness of cancer detection with low-dose CT screening for years. The efforts of the researchers to advance CT screening for early lung disease led to the creation of the I-ELCAP. Started in 1992, this multi-institution, multi-national research program has enrolled more than 87,000 participants from over 80 institutions.

In 2006, the researchers identified a 10-year survival rate of 80 percent for the patients whose cancer was identified by CT screening. For this study, they looked at 20-year survival rates.

“What we present here is the 20-year follow-up on participants in our screening program who were diagnosed with lung cancer and subsequently treated,” Dr. Henschke said. “The key finding is that even after this long a time interval they are not dying of their lung cancer. And even if new lung cancers were found over time, as long as they continued with annual screening, they would be OK.” 

Co-authors are David F. Yankelevitz, MD, Director of the Lung Biopsy Service at Icahn Mount Sinai; Daniel M. Libby, MD, Professor of Medicine at Weill Cornell Medical Center; James Smith, MD, Clinical Professor of Medicine at Weill Cornell Medical Center; Mark Pasmantier, MD, Weill Cornell Medical Center, and Rowena Yip, MPH, Senior Biostatistician at the I-ELCAP at Icahn Mount Sinai.  

 

About the Mount Sinai Health System

Mount Sinai Health System is one of the largest academic medical systems in the New York metro area, with more than 43,000 employees working across eight hospitals, over 400 outpatient practices, nearly 300 labs, a school of nursing, and a leading school of medicine and graduate education. Mount Sinai advances health for all people, everywhere, by taking on the most complex health care challenges of our time — discovering and applying new scientific learning and knowledge; developing safer, more effective treatments; educating the next generation of medical leaders and innovators; and supporting local communities by delivering high-quality care to all who need it.

Through the integration of its hospitals, labs, and schools, Mount Sinai offers comprehensive health care solutions from birth through geriatrics, leveraging innovative approaches such as artificial intelligence and informatics while keeping patients’ medical and emotional needs at the center of all treatment. The Health System includes approximately 7,300 primary and specialty care physicians; 13 joint-venture outpatient surgery centers throughout the five boroughs of New York City, Westchester, Long Island, and Florida; and more than 30 affiliated community health centers. We are consistently ranked by U.S. News & World Report's Best Hospitals, receiving high "Honor Roll" status, and are highly ranked: No. 1 in Geriatrics and top 20 in Cardiology/Heart Surgery, Diabetes/Endocrinology, Gastroenterology/GI Surgery, Neurology/Neurosurgery, Orthopedics, Pulmonology/Lung Surgery, Rehabilitation, and Urology. New York Eye and Ear Infirmary of Mount Sinai is ranked No. 12 in Ophthalmology. U.S. News & World Report’s “Best Children’s Hospitals” ranks Mount Sinai Kravis Children's Hospital among the country’s best in several pediatric specialties. The Icahn School of Medicine at Mount Sinai is one of three medical schools that have earned distinction by multiple indicators: It is consistently ranked in the top 20 by U.S. News & World Report's "Best Medical Schools," aligned with a U.S. News & World Report "Honor Roll" Hospital, and top 20 in the nation for National Institutes of Health funding and top 5 in the nation for numerous basic and clinical research areas. Newsweek’s “The World’s Best Smart Hospitals” ranks The Mount Sinai Hospital as No. 1 in New York and in the top five globally, and Mount Sinai Morningside in the top 20 globally.

For more information, visit https://www.mountsinai.org or find Mount Sinai on Facebook, Twitter, and YouTube.

Tufts University researchers find link between foods scored higher by new nutrient profiling system and better long-term health outcomes


Diets deemed better by Friedman School’s Food Compass associated with lower risk for disease and death

Peer-Reviewed Publication

TUFTS UNIVERSITY

The idea that what we eat directly affects our health is ancient; Hippocrates recognized this as far back as 400 B.C. But identifying healthier foods in the supermarket aisle and on restaurant menus is increasingly challenging. Now, researchers at the Friedman School of Nutrition Science and Policy at Tufts have shown that a holistic food profiling system, Food Compass, identifies foods associated with better overall health and lower risk of mortality.

In a paper published in Nature Communications on November 22, researchers assessed whether adults who ate more foods with higher Food Compass scores had better long-term health outcomes and found that they did.

Introduced in 2021, Food Compass provides a holistic measure of the overall nutritional value of a food, beverage, or mixed meal. It scores nine domains of each item, such as nutrient ratios, food-based ingredients, vitamins, minerals, extent of processing, and additives. Based on scores of 10,000 commonly consumed products in the U.S., researchers recommend that foods scoring 70 or above be encouraged; that foods scoring 31-69 be eaten in moderation; and that anything scoring 30 or below be consumed sparingly. For this new study, Food Compass was used to score a person's entire diet, based on the Food Compass scores of all the foods and beverages they regularly consume.
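The three consumption tiers above amount to a simple classification rule. The sketch below illustrates it; the function name `recommend` and the example scores are invented for illustration and are not part of the published Food Compass algorithm:

```python
def recommend(score: float) -> str:
    """Map a Food Compass score (1-100) to the study's consumption tiers."""
    if score >= 70:
        return "encourage"   # foods to encourage
    elif score >= 31:
        return "moderate"    # eat in moderation
    else:
        return "sparing"     # consume sparingly

# Hypothetical scores for three products
print([recommend(s) for s in (85, 50, 20)])  # → ['encourage', 'moderate', 'sparing']
```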

“A nutrient profiling system is intended to be an objective measure of how healthy a food is. If it’s achieving its purpose, then individuals who eat more foods with higher scores should have better health,” said Meghan O’Hearn, a doctoral candidate at the Friedman School and the study’s lead author.

For this validation study, researchers used nationally representative dietary records and health data from 47,999 U.S. adults aged 20-85 who were enrolled between 1999 and 2018 in the National Health and Nutrition Examination Survey (NHANES). Deaths were determined through linkage with the National Death Index (NDI).

Overall, researchers found that the mean Food Compass score for the diets of the nearly 50,000 subjects was only 35.5 out of 100, well below ideal. “One of the most alarming discoveries was just how poor the national average diet is,” said O’Hearn. “This is a call for actions to improve diet quality in the United States.”

When people’s Food Compass diet scores were assessed against health outcomes, multiple significant relationships were seen, even after adjusting for other risk factors like age, sex, race, ethnicity, education, income, smoking, alcohol intake, physical activity, and diabetes status. A higher Food Compass diet score was associated with lower blood pressure, blood sugar, blood cholesterol, body mass index, and hemoglobin A1c levels, and with lower prevalence of metabolic syndrome and cancer. A higher Food Compass diet score was also associated with lower risk of mortality: for each 10-point increase, there was a 7 percent lower risk of death from all causes.
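If the 7 percent reduction per 10 points is treated as a multiplicative hazard ratio of 0.93 (an assumption for illustration; the paper's exact statistical model may differ), the association compounds over larger score differences:

```python
def mortality_risk_ratio(score_increase: float, hr_per_10: float = 0.93) -> float:
    """Relative all-cause mortality risk for a given Food Compass diet-score
    increase, assuming the per-10-point hazard ratio applies multiplicatively."""
    return hr_per_10 ** (score_increase / 10)

# A 10-point increase corresponds to 7% lower risk;
# a 30-point increase compounds to roughly 20% lower.
print(round(1 - mortality_risk_ratio(10), 3))  # → 0.07
print(round(1 - mortality_risk_ratio(30), 3))  # → 0.196
```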

“When searching for healthy foods and drinks, it can be a bit of a wild west,” said Dariush Mozaffarian, Jean Mayer Professor of Nutrition and dean for policy at the Friedman School. “Our findings support the validity of Food Compass as a tool to guide consumer decisions, as well as industry reformulations and public health strategies to identify and encourage healthier foods and beverages.”

Compared to existing nutrient profiling systems, Food Compass provides a more innovative and comprehensive assessment of nutritional quality, researchers say. For example, rather than measuring levels of dietary fats, sodium, or fiber in isolation, it takes a more nuanced and holistic view, evaluating the ratio of saturated to unsaturated fat; sodium to potassium; and carbohydrate to fiber.
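As a sketch of what ratio-based scoring looks like in practice (the field names and serving values below are invented for illustration; the real Food Compass weighting is considerably more involved):

```python
def nutrient_ratios(food: dict) -> dict:
    """Compute the three ratio-style inputs the article mentions.
    Keys are hypothetical; amounts are per serving (g or mg)."""
    return {
        "sat_to_unsat_fat": food["sat_fat_g"] / food["unsat_fat_g"],
        "sodium_to_potassium": food["sodium_mg"] / food["potassium_mg"],
        "carb_to_fiber": food["carbs_g"] / food["fiber_g"],
    }

# Hypothetical whole-grain cereal serving
cereal = {"sat_fat_g": 0.5, "unsat_fat_g": 2.0,
          "sodium_mg": 150, "potassium_mg": 300,
          "carbs_g": 40, "fiber_g": 8}
print(nutrient_ratios(cereal))
# → {'sat_to_unsat_fat': 0.25, 'sodium_to_potassium': 0.5, 'carb_to_fiber': 5.0}
```

Lower values on each ratio favor the healthier side of the pairing (unsaturated fat, potassium, fiber), which is the nuance the paragraph contrasts with measuring each nutrient in isolation.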

Food Compass also boosts scores for ingredients shown to have protective effects on health, like fruits, non-starchy vegetables, beans and legumes, whole grains, nuts and seeds, seafood, yogurt, and plant oils; and lowers scores for less healthful ingredients like refined grains, red and processed meat, and ultra-processed foods and additives.

Researchers designed Food Compass with the ever-evolving field of nutrition science in mind, and their multidisciplinary team, comprising researchers with expertise in epidemiology, medicine, economics, and biomolecular nutrition, will continue to evaluate and adapt the tool based on the most cutting-edge nutrition research.

“We know Food Compass is not perfect,” said Mozaffarian. “But, it provides a more comprehensive, holistic rating of a food’s nutritional value than existing systems, and these new findings support its validity by showing it predicts better health.”

These findings are timely given the release of the new U.S. National Strategy on Hunger, Nutrition and Health. One pillar of this strategy is to “empower all consumers to make and have access to healthy choices” through measures such as updating food labeling and making it easier to interpret, creating healthier food environments, and creating a healthier food supply.

“This study further validates Food Compass as a useful tool for defining healthy foods. We hope the Food Compass algorithm—publicly available to all—can help guide front-of-pack labeling; procurement choices in workplace, hospital, and school cafeterias; incentive programs for healthier eating in healthcare and federal nutrition programs; industry reformulations; and government policies around food,” said O’Hearn.  

Researchers plan to work on a simplified version that requires fewer nutrient inputs, as well as versions tailored to specific conditions such as diabetes and pregnancy or to other nations’ populations. The research team is also interested in adding Food Compass domains based on other aspects of foods, such as environmental sustainability, social justice, or animal welfare.

“We look forward to continuing to find ways to improve the Food Compass system, and to get it to more users to help clear up confusion about healthier choices,” said Mozaffarian.

Research reported in this article was supported by the National Institutes of Health’s National Heart, Lung, and Blood Institute under award number 2R01HL115189 and Vail Innovative Global Research. Complete information on authors, funders, and conflicts of interest is available in the published paper.

The content is solely the responsibility of the authors and does not necessarily represent the official views of the funders.