Monday, October 24, 2022

Liver cancer cases and deaths projected to rise by more than 55% by 2040


To avoid this increase, countries must achieve at least a 3% annual decrease in liver cancer incidence and mortality rates, according to a new report published in the Journal of Hepatology.


Peer-Reviewed Publication

ELSEVIER

Global burden of primary liver cancer 

IMAGE: MAIN FINDINGS ON THE GLOBAL BURDEN OF LIVER CANCER IN 2020 AND PREDICTIONS TO 2040.

CREDIT: JOURNAL OF HEPATOLOGY

Amsterdam, October 6, 2022 – A new analysis reveals that primary liver cancer was among the top three causes of cancer death in 46 countries in 2020 and the number of people diagnosed with or dying from primary liver cancer per year could rise by more than 55% by 2040. Investigators call for efforts to control the disease to be prioritized in a new study in the Journal of Hepatology, published by Elsevier.

“Liver cancer causes a huge burden of disease globally each year,” commented senior author Isabelle Soerjomataram, MD, PhD, International Agency for Research on Cancer (IARC/WHO), Cancer Surveillance Branch, Lyon, France. “It is also largely preventable if control efforts are prioritized — major risk factors include hepatitis B virus, hepatitis C virus, alcohol consumption, excess body weight, and metabolic conditions including type 2 diabetes.”

“In light of the availability of new and improved global cancer incidence and mortality estimates, we wanted to provide the most up-to-date assessment of the burden of liver cancer and develop an essential tool for national liver cancer control planning,” explained lead author Harriet Rumgay, PhD candidate, International Agency for Research on Cancer (IARC/WHO), Cancer Surveillance Branch, Lyon, France. “In this analysis we describe where liver cancer ranks among all cancer types for cancer diagnoses and deaths in nations across the world. We also present predictions of the future liver cancer burden to 2040.”

Investigators extracted data on primary liver cancer cases and deaths from the International Agency for Research on Cancer’s GLOBOCAN 2020 database, which produces cancer incidence and mortality estimates for 36 cancer types in 185 countries worldwide. The predicted change in the number of cancer cases or deaths by the year 2040 was estimated using population projections produced by the United Nations.

Results showed that in 2020, an estimated 905,700 individuals were diagnosed with liver cancer and 830,200 died from liver cancer globally. According to these data, liver cancer is now among the top three causes of cancer death in 46 countries and is among the top five causes of cancer death in nearly 100 countries including several high-income countries.

Liver cancer incidence and mortality rates were highest in Eastern Asia, Northern Africa, and South-Eastern Asia. Investigators predict the annual number of new cases and deaths from liver cancer will rise by more than 55% over the next 20 years, assuming current rates do not change. The predicted rise in cases will increase the need for resources to manage care of liver cancer patients.

The researchers were alarmed to find that the number of cases and deaths from liver cancer will continue to increase year on year. They caution that in order to avoid this rise in cases and deaths, countries across the world must achieve at least a 3% annual decrease in liver cancer incidence and mortality rates through preventive measures.
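The arithmetic behind these projections can be sketched in a few lines. This is a back-of-envelope illustration, not the report's actual method: only the 2020 case count and the greater-than-55% rise come from the article, and the simple closed-form threshold below ignores country-level differences.

```python
# Back-of-envelope illustration of the projection logic described above.
# Only the 2020 baseline and the >55% rise come from the article; the
# threshold calculation is a simplification, not the report's method.

cases_2020 = 905_700        # GLOBOCAN 2020 estimate of new liver cancer cases
growth_factor = 1.55        # >55% rise by 2040 if current rates hold

# With rates held constant, case counts scale with projected population change:
cases_2040 = cases_2020 * growth_factor          # roughly 1.4 million

# If rates instead fell by a fraction r every year, the demographic growth
# would be offset once (1 - r)**20 * growth_factor <= 1:
r_offset = 1 - growth_factor ** (-1 / 20)        # ~2.2% per year in this toy model
print(f"{cases_2040:,.0f} projected cases; rate decline needed: {r_offset:.1%}")
```

Because population growth and ageing vary by country, country-specific calculations of this kind can demand steeper declines than this global toy figure, consistent with the report's call for a decrease of at least 3% per year.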

These estimates provide a snapshot of the global burden of liver cancer and demonstrate the importance of improving and reinforcing liver cancer prevention measures.

“We are at a turning point in liver cancer prevention as successes in hepatitis B virus and hepatitis C virus control efforts will be reflected in rates of liver cancer in the next few decades,” noted Dr. Soerjomataram. “These efforts must be sustained and reinforced especially considering the disruption caused by the COVID-19 pandemic on certain hepatitis B and C virus control efforts.”

The authors call for public health officials to prepare for the predicted increase in demand for resources to manage the care of liver cancer patients throughout the cancer pathway, including improved access to palliative care. They also call for current liver cancer prevention measures to be reinforced: immunization, testing, and treatment for hepatitis B virus; population-wide testing and treatment for hepatitis C virus infection; reduction of population alcohol consumption; and curbing the rise in diabetes and obesity prevalence.

“The number of people diagnosed with or dying from liver cancer per year could increase by nearly 500,000 cases or deaths by 2040 unless we achieve a substantial decrease in liver cancer rates through primary prevention,” concluded Dr. Soerjomataram.

 

New research identifies lack of appropriate control tools for many major infectious diseases of animals

New research published in The Lancet Planetary Health has identified a lack of appropriate control tools for many infectious diseases of animals that can have a significant impact upon the UN Sustainable Development Goals (SDGs).

Peer-Reviewed Publication

CABI

International efforts should focus on developing control tools for a range of priority infectious diseases of animals, including Nipah virus infection, African swine fever, foot and mouth disease and bovine tuberculosis, scientists say, but progress is needed across a wide range of zoonotic, endemic and epidemic (including pandemic) diseases to secure a healthy planet for humans, animals and the environment.

The study, led by Dr Johannes Charlier, project manager of DISCONTOOLS, and including an international team of animal health experts, assessed the current state of available control tools for 53 major infectious diseases of animals.

The researchers found that while easy-to-use and accurate diagnostics are available for many animal diseases, there is an urgent need for the development of stable and durable diagnostics that can differentiate infected animals from vaccinated animals and assess other disease characteristics, such as transmissibility and impact on animal productivity and welfare.

They add that there is also a pressing need to exploit rapid technological advances and to make diagnostics widely available and affordable. The scientists call for further research to improve the convenience of use and duration of immunity, and to establish performant marker vaccines.

The research highlights that the largest gap in animal pharmaceuticals is the threat of pathogens developing resistance to available drugs – particularly for bacterial and parasitic (protozoal, helminth and arthropod) pathogens.

Dr Charlier and his fellow researchers propose five research priorities for animal health that will help deliver a sustainable and healthy planet. They are vaccinology, antimicrobial resistance, climate mitigation and adaptation, digital health and epidemic preparedness.

Dr Charlier said, “Animal health is a prerequisite for global health, economic development, food security, food quality and poverty reduction while mitigating against climate change and biodiversity loss.

“If we are to achieve the SDGs, further research into appropriate control tools is needed to reduce the burden of animal diseases, including zoonoses, and to manage emerging diseases, pandemic threats and antimicrobial and antiparasitic resistance.”

The scientists used DISCONTOOLS – an open-access database and key resource for the STAR-IDAZ International Research Consortium, as well as for other funders of animal health research including trusts and industry bodies – to assess the current state of appropriate control tools for 53 major infectious diseases of animals.

DISCONTOOLS identifies the gaps in knowledge needing to be addressed to speed up the development of new DISease CONtrol TOOLS (diagnostics, vaccines and pharmaceuticals) and reduce the burden of animal diseases. This delivers benefits in terms of animal health and welfare, public health and a safe and secure food supply.

The DISCONTOOLS resource was then used to prioritise the list of infectious animal diseases where appropriate control tools are lacking and where addressing this need would have the greatest impact towards achieving the relevant SDGs.

Dr Charlier added, “To achieve maximal impact it is important to devote appropriate attention to epidemic, zoonotic and endemic diseases alike. While epidemic diseases attract a lot of attention because of their sudden and devastating impact, the huge impact of more chronic diseases is less visible and hence often forgotten.

“Prevention of these diseases will not only require development of new technologies, but also sustained investment in diagnostic networks and research infrastructures, supply chains, capacity building, and international, trans-sectoral coordination.”

Roxane Feller, secretary general of AnimalhealthEurope (the trade association of the animal medicines industry) and management board member of DISCONTOOLS, supports the study and added, “The potential for transfer of infectious diseases between animals and people is a One Health challenge recognised at the highest level, signalling that it is high time for all of us to move from firefighting to fire prevention.

“The impacts of animal disease stretch even further beyond public health, from devastating socio-economic effects for those who rely on livestock for income, to negative environmental effects through feed used and emissions created with no food output.

“Through public and private investment in innovative early research, the animal health industry as a whole can focus on unlocking the secrets needed to develop new generations of vaccines, diagnostics and other therapies to prevent animal disease and avoid these negative effects.”

Alex Morrow from STAR-IDAZ IRC, said, “Animal diseases are, in most cases, global problems and so need a focused global approach to understand and control them. To speed up the innovation pipeline from basic science to the required products it is important to work together internationally and along the research pipeline focusing resources in a coordinated way on the critical knowledge gaps and identified product needs: we can’t all do everything.”

Mother’s ultra-processed food intake linked to obesity risk in children


Dietary guidelines should be refined and financial and social barriers removed to improve nutrition for women of childbearing age, say researchers

Peer-Reviewed Publication

BMJ

A mother’s consumption of ultra-processed foods appears to be linked to an increased risk of overweight or obesity in her offspring, irrespective of other lifestyle risk factors, suggests a US study published by The BMJ today.

The researchers say further study is needed to confirm these findings and to understand the factors that might be responsible. 

But they suggest that mothers might benefit from limiting their intake of ultra-processed foods, and that dietary guidelines should be refined and financial and social barriers removed to improve nutrition for women of childbearing age and reduce childhood obesity.

According to the World Health Organization, 39 million children were overweight or obese in 2020, leading to increased risks of heart disease, diabetes, cancers, and early death. 

Ultra-processed foods, such as packaged baked goods and snacks, fizzy drinks and sugary cereals, are commonly found in modern Western style diets and are associated with weight gain in adults. But it’s unclear whether there’s a link between a mother’s consumption of ultra-processed foods and her offspring’s body weight.

To explore this further, the researchers drew on data for 19,958 children born to 14,553 mothers (45% boys, aged 7-17 years at study enrollment) from the Nurses’ Health Study II (NHS II) and the Growing Up Today Study (GUTS I and II) in the United States.

The NHS II is an ongoing study tracking the health and lifestyles of 116,429 US female registered nurses aged 25-42 in 1989. From 1991, participants reported what they ate and drank, using validated food frequency questionnaires every four years.

The GUTS I study began in 1996 when 16,882 children (aged 8-15 years) of NHS II participants completed an initial health and lifestyle questionnaire and were monitored every year between 1997 and 2001, and every two years thereafter.

In 2004, 10,918 children (aged 7-17 years) of NHS II participants joined the extended GUTS II study and were followed up in 2006, 2008, and 2011, and every two years thereafter.

A range of other potentially influential factors, known to be strongly correlated with childhood obesity, were also taken into account. These included mother's weight (BMI), physical activity, smoking, living status (with partner or not), and partner’s education, as well as children’s ultra-processed food consumption, physical activity, and sedentary time.

Overall, 2471 (12%) children developed overweight or obesity during an average follow-up period of 4 years.

The results show that a mother’s ultra-processed food consumption was associated with an increased risk of overweight or obesity in her offspring. For example, a 26% higher risk was seen in the group with the highest maternal ultra-processed food consumption (12.1 servings/day) versus the lowest consumption group (3.4 servings/day).

In a separate analysis of 2790 mothers and 2925 children with information on diet from 3 months pre-conception to delivery (peripregnancy), the researchers found that peripregnancy ultra-processed food intake was not significantly associated with an increased risk of offspring overweight or obesity.

This is an observational study, so it can’t establish cause, and the researchers acknowledge that some of the observed risk may be due to other unmeasured factors and that self-reported diet and weight measures might be subject to misreporting.

Other important limitations include the fact that some offspring participants were lost to follow-up, which resulted in a few of the analyses being underpowered, particularly those related to peripregnancy intake, and that mothers were predominantly white and from similar social and economic backgrounds, so the results may not apply to other groups.

Nevertheless, the study used data from several large ongoing studies with detailed dietary assessments over a relatively long period, and further analysis produced consistent associations, suggesting that the results are robust.

The researchers suggest no clear mechanism underlying these associations and say the area warrants further investigation.

Nevertheless, these data “support the importance of refining dietary recommendations and the development of programs to improve nutrition for women of reproductive age to promote offspring health,” they conclude.

 

Less than a third of FDA regulatory actions backed by research or public assessments

Regulators must be fully transparent about drug safety to ensure public trust in medicines, say experts

Peer-Reviewed Publication

BMJ

Less than a third of regulatory actions taken by the US Food and Drug Administration (FDA) are corroborated by published research findings or public assessments, finds a study published by The BMJ today.

The researchers say their findings, based on analysis of drug safety signals identified by the FDA from 2008 to 2019, suggest either that the FDA is taking regulatory actions based on evidence not made publicly available, or that more comprehensive safety evaluations might be needed when potential safety signals are identified.

Monitoring the safety of a medicine once it is available to patients (known as post-marketing pharmacovigilance) is an essential part of drug regulation.

The FDA receives more than two million adverse event reports annually through its Adverse Event Reporting System (FAERS) and reviews all potential safety signals to determine whether regulatory action is needed.

In 2007, the FDA Amendments Act required the FDA to publish quarterly reports of safety signals from FAERS, providing an opportunity to examine them to better understand this pharmacovigilance system.

A team of US researchers therefore decided to analyse safety signals identified within the FAERS database. They asked how often these signals resulted in regulatory actions and whether they were corroborated by additional research.

They found that from 2008 to 2019, 603 potential safety signals identified from the FAERS were reported by the FDA, of which about 70% were resolved, and nearly 80% led to regulatory action, most often changes to drug labeling. 

In a separate in-depth analysis of 82 potential safety signals reported in 2014-15, at least one relevant study was found in the literature for about 75% of the signals but most of these studies were case reports or case series. 

However, less than a third (30%) of regulatory actions were corroborated by at least one relevant published research study, and none of the regulatory actions were corroborated by a public assessment reported by the Sentinel Initiative.

These are observational findings, and the researchers acknowledge some important limitations. For example, they did not evaluate regulatory actions taken in other countries in response to these safety signals, which might have informed the FDA’s actions, nor could they consider unpublished studies or other data accessible to the agency but not publicly available.

Nevertheless, they say these findings “highlight the continued need for rigorous post-market safety studies to strengthen the quality of evidence available at the time of regulatory action, as well as the importance of ongoing efforts to leverage real world data sources to evaluate and resolve signals identified from the FAERS and support FDA regulatory decisions.”

In a linked editorial, experts argue that regulators should publish all evidence underlying their responses to drug safety signals to reduce harm and ensure public trust in medicines.

The covid-19 pandemic has exposed the tension between regulatory decision making and the public’s right to know about serious risks associated with medical interventions, they write. This same tension exists more broadly in medicine safety.

“Safety signals are an important step, but radical transparency about available evidence and the basis for regulatory judgments is needed to reduce harm caused by medicines, as is adequate follow-up to ensure safer use,” they conclude.

‘Authentic leaders’ could inspire athletes to curb aggression - study


Peer-Reviewed Publication

UNIVERSITY OF BIRMINGHAM

Sports coaches who display ‘authentic leadership’ qualities could find their athletes are less likely to act aggressively towards competitors, a new study reveals.

Researchers found such leadership could also enhance sport enjoyment and commitment – both vital qualities in sport as they can influence athletes’ continued participation, which tends to decline as sportspeople get older.

Publishing their findings today in Sport, Exercise, and Performance Psychology, experts from the Universities of Birmingham and Suffolk reveal that athletes training with coaches who display the attributes of an ‘authentic leader’ are less likely to act aggressively toward other players by committing intentional fouls and risking injuring their opponents.

Authentic leadership comprises four components:

  • Self-awareness – showing an understanding of one’s strengths and weaknesses and being aware of one’s impact on others;
  • Relational transparency – expressing one’s true thoughts and feelings, while minimising the expression of inappropriate emotions;
  • Balanced processing of information - considering objectively all relevant information, including their followers’ perspectives, before making a decision; and
  • Internalised moral perspective - exhibiting behaviours that are in line with one’s high moral standards, rather than being influenced by external pressures, thereby behaving ethically in one’s interactions with others.

Co-author Professor Maria Kavussanu, from the University of Birmingham, commented: “Coaches are vital in influencing athletes’ development and must be encouraged to show high authentic leadership - being open with their athletes and including them in decision making, whilst behaving ethically, admitting to their mistakes, and speaking honestly.

“Our study demonstrates that if a coach displays the attributes of an authentic leader this could have a positive impact on their athletes - increasing athletes’ trust, commitment, and enjoyment, and decreasing aggression.

“Sport enjoyment is particularly important for continued participation in sport, which tends to decline with age. As such, coaches who display authentic behaviours can increase their athletes’ enjoyment, with significant positive implications for athletes’ physical and mental well-being.”

In the first study of its kind, a total of 129 participants (76 of whom were women) took part. All were sport science students at a British university and amateur athletes competing at a regional level. Using an experimental vignette methodology, researchers examined the effects of authentic leadership on athletes’ trust, enjoyment, commitment, and a range of morally relevant variables - aggression, cheating, and guilt for cheating and aggression.

Co-author Ella Malloy, from the University of Suffolk, commented: “When a coach demonstrates the attributes of an authentic leader, athletes are more likely to trust the coach and want to continue competing for them. In contrast, a coach displaying the behaviours of a non-authentic leader could diminish trust, enjoyment, and commitment among the athletes who train under them.”

ENDS

For more information and an embargoed copy of the research paper, please contact Tony Moran, International Communications Manager, University of Birmingham on +44 (0)782 783 2312 or t.moran@bham.ac.uk . For out-of-hours enquiries, please call +44 (0) 7789 921 165.

Note for Editors

  • The University of Birmingham is ranked amongst the world’s top 100 institutions. Its work brings people from across the world to Birmingham, including researchers, teachers, and more than 6,500 international students from over 150 countries.
  • ‘The Effects of Authentic Leadership on Athlete Outcomes: An Experimental Study’ by Ella Malloy, Maria Kavussanu, and Thomas Mackman is published in Sport, Exercise, and Performance Psychology.

The costs of caring for a graying population

Researchers from the University of Tsukuba uncover intriguing differences in spending on long-term care between Japan's municipalities

Peer-Reviewed Publication

UNIVERSITY OF TSUKUBA

Tsukuba, Japan—With the "graying population" phenomenon becoming widespread, many countries are facing the challenge of caring for their elderly population. In Japan, the country with the oldest population, a universal long-term care (LTC) insurance system was established in 2000 to help meet this need. Now, researchers from the University of Tsukuba have revealed the considerable variation between Japan's municipalities in spending on LTC.

In Japan, everyone older than 65 years of age and everyone with a health-related disability older than 40 years of age is eligible to receive LTC. Home-based, community, and facility-based care services are provided by the municipalities, which also act as insurers. There is a classification system for individuals with disabilities, ranging from "Care Level 1" for light disabilities to "Care Level 5" for severe disabilities.

"We were interested in seeing how much variation there is between municipalities in terms of their spending on LTC," says lead author of the study Professor Nanako Tamiya. "As health inequalities are currently widening in Japan, it is essential for us to understand how spending on LTC varies across the country."

To do this, the researchers examined municipality-level data from across Japan. They calculated per-capita spending on LTC and ran a series of statistical tests to determine which factors were behind the differences in spending they observed.

"At first glance, the differences between municipalities were surprisingly large," notes co-author Dr. Xueying Jin. "We found that the per-capita spending on LTC was four times higher in the highest-spending municipalities than in the lowest. We also noticed a 'west high, east low' trend, with higher spending in western Japan."
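The comparison behind that fourfold figure is straightforward to reproduce in outline. A minimal sketch, with invented placeholder figures rather than the study's data:

```python
# Minimal sketch of the per-capita comparison described above. The spending
# and population figures are invented placeholders, not the study's data.
municipal_ltc = {
    # municipality: (annual LTC spending in yen, insured population)
    "Town A": (1_200_000_000, 40_000),
    "Town B": (300_000_000, 40_000),
}

# Per-capita spending normalises for municipality size before comparing:
per_capita = {name: total / pop for name, (total, pop) in municipal_ltc.items()}
ratio = max(per_capita.values()) / min(per_capita.values())
print(f"highest/lowest per-capita ratio: {ratio:.1f}")  # 4.0 with these placeholders
```

The study's next step, regressing these per-capita figures on supply, demand and structural factors, is what allowed the researchers to explain most of the regional variation.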

However, the researchers also found that the levels of supply and demand (e.g., the numbers of facilities and candidates for LTC) and other structural factors, which were obtained from publicly available municipality-level statistical sources, mostly explained the regional differences.

"We found that demand factors such as the proportion of the population over 85 years of age, the proportion of individuals certified as needing LTC, and the proportion of individuals who had been classified as belonging to a high care level played the greatest roles in driving up per-capita spending," says Associate Professor Masao Iwagami, another of the study's authors.

One of the study's major implications is that preventive efforts may help lower per-capita spending on LTC. Measures aimed at helping healthy individuals avoid developing lifestyle-related health problems should lead ultimately to lower numbers of individuals who require LTC. For those classified as having a disability, preventing their condition from deteriorating should result in lower numbers of individuals requiring high levels of care, and thus lower costs.

###
This study was supported by a grant-in-aid from the Japan Society for the Promotion of Science (JP22K17299). The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Original Paper:

The article, "Regional variation in long-term care spending in Japan," was published in BMC Public Health at DOI: https://doi.org/10.1186/s12889-022-14194-6

Correspondence:

Visiting researcher JIN Xueying
Associate Professor IWAGAMI Masao
Professor TAMIYA Nanako

Research and Development Center for Health Services, Faculty of Medicine, University of Tsukuba

Related Link:

Faculty of Medicine

Research and Development Center for Health Services

 

On the fence: New research taps rancher expertise on living with carnivores

Peer-Reviewed Publication

S.J. & JESSIE E. QUINNEY COLLEGE OF NATURAL RESOURCES, UTAH STATE UNIVERSITY

Fence 

IMAGE: A WELL-DESIGNED FENCE CAN HELP TO PREVENT CONFLICTS WITH CARNIVORES, BUT WITH SO MANY OPTIONS FOR MATERIAL, PLACEMENT AND LOGISTICS, RESEARCHERS CAN STRUGGLE TO IDENTIFY WHAT STRATEGIES HAVE THE BEST CHANCE FOR SUCCESS. THEY TURNED TO RANCHERS FOR HELP.

CREDIT: JAN CANTY

They say that good fences make good neighbors — especially true when you share space with gray wolves and grizzly bears.

In places like Wyoming and Idaho, ranchers have developed practical fencing strategies to help reduce ill-fated encounters between hungry wildlife, vulnerable livestock and valuable produce. USU researchers are learning to take advantage of this hard-won knowledge, according to new research.

“Research about wildlife fencing is often missing on-the-ground knowledge,” said Julie Young of the Department of Wildland Resources and Ecology Center in the Quinney College of Natural Resources. “We wanted to reduce the cost and social burden of living with recovering wildlife populations, but we needed rancher input to do that.”

Given all possible options for fencing material, placement and logistics, the team wanted to zero in on strategies that had the best chance for success. They turned to the ranchers who have worked for decades in the “trenches” of wildlife conflict to help.

Young organized a group that included livestock producers, natural resource managers and university-based researchers. They met for four months — early in the morning to accommodate producers’ crack-of-dawn schedules. Participants were exposed to the reality of fencing designs and considerations across different scales: from hobby farms to orchard and apiary protection, to large cow-calf operations. The researchers learned about regulatory implications and obstacles to fencing on certain rangelands, which informed how they thought about adoption and the practicality of their research.

Once the research project began to take shape, they took the plan back to the ranchers for feedback.

“Our original design looked just at the effectiveness of fencing designs for preventing conflicts with agriculture or livestock. Concern about human safety was something we initially overlooked,” said Rae Nickerson, coauthor on the research and a Ph.D. student in the Department of Wildland Resources and Ecology Center.

But fencing projects are often located near homestead areas, the researchers learned, and human safety was an issue important to the group.

“Some of the new things we learned from the process required flexibility in the process,” Nickerson said. “But it offered a unique way to prioritize our approach. It really took advantage of a diverse set of knowledge and experience.”

Researchers involved in creating preventative strategies for wildlife often view the issue from the perspective of the ecology of carnivores, but that’s not the sole priority of most producers. The researchers learned that they needed to integrate not just how and where fences were effective, but also how to make funding opportunities and paperwork more flexible for the producers. Ranching operations near increasing populations of large carnivores need information quickly, they said, before problems get out of control.

The group also planned strategies to get the word out about what ended up working, once the research was complete.

“Often the most promising and innovative tools aren’t circulated to managers and ranchers because they aren’t recorded or widely shared,” Young said. “The people who discover new and innovative tools to keep wildlife predators separate from livestock, grain storage and beehives often don’t have good ways to communicate their successes to others.”

Word-of-mouth can work, she said, but many of these folks are geographically separated from other producers confronted with the very same challenges. The team’s research will continue to look at the efficacy of using nonlethal tools to reduce wildlife conflicts and ways to disseminate best practices to more livestock owners.

Microbiologists improve taste of beer

Peer-Reviewed Publication

AMERICAN SOCIETY FOR MICROBIOLOGY


Washington, D.C. (October 4, 2022) –  Belgian investigators have improved the flavor of contemporary beer by identifying and engineering a gene that is responsible for much of the flavor of beer and some other alcoholic drinks. The research appears in Applied and Environmental Microbiology, a journal of the American Society for Microbiology.

For centuries, beer was brewed in open, horizontal vats. But in the 1970s, the industry switched to using large, closed vessels, which are much easier to fill, empty, and clean, enabling brewing of larger volumes and reducing costs. However, these modern methods produced inferior quality beer, due to insufficient flavor production.

During fermentation, yeast converts 50 percent of the sugar in the mash to ethanol, and the other 50 percent to carbon dioxide. The problem: the carbon dioxide pressurizes these closed vessels, dampening flavor.
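That roughly even split follows from the standard stoichiometry of ethanol fermentation, in which each glucose molecule yields two molecules of ethanol and two of carbon dioxide. A back-of-envelope check:

```python
# Back-of-envelope check of the roughly 50/50 split described above, using
# the standard fermentation stoichiometry: C6H12O6 -> 2 C2H5OH + 2 CO2.
M_GLUCOSE, M_ETHANOL, M_CO2 = 180.16, 46.07, 44.01  # molar masses, g/mol

ethanol_mass_frac = 2 * M_ETHANOL / M_GLUCOSE   # ~51% of the sugar mass
co2_mass_frac = 2 * M_CO2 / M_GLUCOSE           # ~49% of the sugar mass
print(f"{ethanol_mass_frac:.1%} ethanol, {co2_mass_frac:.1%} CO2 by mass")
```

In an open vat the CO2 escapes; in a sealed vessel that same mass of gas builds the pressure that, as the study found, suppresses flavor production.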

Johan Thevelein, Ph.D., an emeritus professor of Molecular Cell Biology at Katholieke Universiteit Leuven, and his team had pioneered technology for identifying genes responsible for commercially important traits in yeast. They applied this technology to identify the gene(s) responsible for flavor in beer, by screening large numbers of yeast strains to evaluate which did the best job of preserving flavor under pressure. They focused on a gene for a banana-like flavor “because it is one of the most important flavors present in beer, as well as in other alcoholic drinks,” said Thevelein, who is also founder of NovelYeast, which collaborates with other companies in industrial biotechnology.

“To our surprise, we identified a single mutation in the MDS3 gene, which codes for a regulator apparently involved in production of isoamyl acetate, the source of the banana-like flavor, that was responsible for most of the pressure tolerance in this specific yeast strain,” said Thevelein.

Thevelein and coworkers then used CRISPR/Cas9, a revolutionary gene editing technology, to engineer this mutation in other brewing strains, which similarly improved their tolerance of carbon dioxide pressure, enabling full flavor. “That demonstrated the scientific relevance of our findings, and their commercial potential,” said Thevelein.

“The mutation is the first insight into understanding the mechanism by which high carbon dioxide pressure may compromise beer flavor production,” said Thevelein, who noted that the MDS3 protein is likely a component of an important regulatory pathway that may play a role in carbon dioxide inhibition of banana flavor production, adding, “how it does that is not clear.”

The technology has also been successful in identifying genetic elements important for rose flavor production by yeast in alcoholic drinks, as well as other commercially important traits, such as glycerol production and thermotolerance.


###
The American Society for Microbiology is one of the largest single life science societies, composed of more than 30,000 scientists and health professionals. ASM's mission is to promote and advance the microbial sciences. 

ASM advances the microbial sciences through conferences, publications, certifications, educational opportunities and advocacy efforts. It enhances laboratory capacity around the globe through training and resources. It provides a network for scientists in academia, industry and clinical settings. Additionally, ASM promotes a deeper understanding of the microbial sciences among diverse audiences.

Great Salt Lake on path to hyper-salinity, mirroring Iranian lake, new research shows

Peer-Reviewed Publication

S.J. & JESSIE E. QUINNEY COLLEGE OF NATURAL RESOURCES, UTAH STATE UNIVERSITY


Great Salt Lake 

IMAGE: THE GREAT SALT LAKE IS GETTING SALTIER, CREATING A SERIOUS THREAT TO THE ECOSYSTEMS AND THE ECONOMIES THAT DEPEND ON IT. NEW RESEARCH FROM WAYNE WURTSBAUGH EXAMINES THE TRAJECTORY THE TWO HALVES OF THE LAKE MIGHT TAKE ON A PATH TO HYPER-SALINITY. view more 

CREDIT: USGS

Starved for freshwater, the Great Salt Lake is getting saltier. The lake is losing freshwater inflows to agriculture, urban growth and drought, and the drawdown is causing salt concentrations to spike beyond even the tolerance of brine shrimp and brine flies, according to Wayne Wurtsbaugh from Watershed Sciences in the Quinney College of Natural Resources.

Deciphering the ecological and economic consequences of this change is complex and unprecedented, and experts are closely observing another stressed saline lake for clues on what to expect next—Lake Urmia in Iran. This “sister lake” offers obvious, and troubling, parallels to the fate of the Great Salt Lake, according to new research from Wurtsbaugh and Somayeh Sima from Tarbiat Modares University in Tehran.

The history of both lakes has moved along similar trajectories, though at different paces. As less freshwater moves through connected rivers and streams into these lakes, natural salts become more and more concentrated in the water. Native brine flies and brine shrimp tolerate salt, but when saline levels reach certain extreme concentrations—at times reaching saturation—even animals and plants specially adapted to saline environments can struggle. This means as well that millions of migratory birds that depend on these food sources will also struggle, starve, or leave.

Over the decades, expanding urban populations in northern Utah have claimed more freshwater for crops, lawns and faucets, putting gradually intensifying stress on the ecosystem. Now a 20-year drought is pushing salinity levels further towards untenable levels, Wurtsbaugh said.

At the Great Salt Lake, a causeway divides the lake into distinct halves. With no freshwater inputs, the northern arm of the lake (Gunnison Bay) has become the saltier of the two, with levels at saturation. A transfer of salt into the north arm has allowed the south arm (Gilbert Bay) to stay within a concentration range that brine shrimp and brine flies can tolerate. But salinities in the south are now also increasing to levels stressful for even those hardy species.

The Great Salt Lake and Lake Urmia in Iran were once remarkably similar in size, depth, salinity and geographic setting. High rates of urban growth there also fueled demand for irrigated agriculture and human uses, putting extreme stress on the ecosystem. Compared to the Great Salt Lake, the fate of Lake Urmia is on fast-forward.

Over just 20 years, diversions caused Urmia’s salinity to jump from 190 grams of salt per liter of water to over 350 grams, said Sima. (For comparison, ocean water has a salinity of around 35 grams per liter.) The decline in Lake Urmia’s ecosystem has been precipitous and easy to recognize. It has lost nearly all of its brine shrimp. How long brine shrimp can endure in increasingly salty water in the Great Salt Lake is a question researchers are eager to understand, especially for the south arm where salt concentrations are high, but still sustaining some shrimp. 

Gunnison Bay, the north arm of the Great Salt Lake, has reached an astounding 330 grams per liter (27% salt), and brine shrimp there are nearly absent, halting harvest by the $70 million brine shrimp industry, said Wurtsbaugh. Now the shrimp harvest in the south arm is also threatened by rising salinities. Brine shrimp prefer salt levels of a comfortable 75–160 grams per liter. Brine fly larvae can tolerate higher concentrations, but even these uber-hardy insects start to feel the pinch at such extremes.
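The conversion between the two salinity units quoted here (grams of salt per liter of brine versus percent salt by weight) depends on the density of the brine. A minimal sketch in Python, assuming a density of roughly 1.22 kg/L for near-saturated Great Salt Lake brine (the density value is an illustrative assumption, not from the study):

```python
# Convert a brine concentration in g/L to an approximate percent by weight.
def percent_by_weight(grams_per_liter, brine_density_g_per_l=1220.0):
    """Percent salt by mass, given grams of salt per liter of brine
    and the brine's density in g/L (assumed, not measured)."""
    return 100.0 * grams_per_liter / brine_density_g_per_l

print(f"{percent_by_weight(330):.0f}%")  # ~27%, the figure quoted for the north arm
print(f"{percent_by_weight(35, 1025.0):.1f}%")  # ocean water, ~3.4%
```

The same conversion shows why "35 grams per liter" of ocean water is often described as about 3.5% salinity.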

“Brine fly larvae get smaller at these higher salt levels, signaling ecological stress,” said Wurtsbaugh. “A combined collapse of these two organisms could have catastrophic ecological consequences for migratory bird populations and for the economics of the lake.”

Managers still have some capacity to regulate flow of salt from the north to the south arms of the lake using an underwater berm at a break in the causeway. This flow is used to manage competing needs of mineral extraction companies on the lake and the brine shrimp harvesting industry. But if water development and climate change trigger further losses in water levels, even that option will become limited, Wurtsbaugh said.

Lake Urmia has already lost most of its ecological and cultural function, but the Great Salt Lake has not yet crossed that precipice, say the authors. The crises at Great Salt Lake and Lake Urmia are not unique: around the globe, other saline lakes are facing similar pressures and are either entirely desiccated or quickly losing water, said Wurtsbaugh. But communities are noticing, which gives him hope. Sustaining the lakes will require considerable sacrifice from water users, said Wurtsbaugh.