Monday, October 24, 2022

Power supply: Understanding unstable grids

A sustainable energy supply requires the expansion of power grids. However, new transmission lines can also lead to grids becoming more unstable rather than more stable, as would be expected. This phenomenon is referred to as the Braess paradox.

Peer-Reviewed Publication

KARLSRUHER INSTITUT FÜR TECHNOLOGIE (KIT)

Image: A stable power grid is fundamental to a reliable and sustainable energy system. (Photo: Markus Breig, KIT)

For the first time, an international team including researchers from the Karlsruhe Institute of Technology (KIT) has simulated the Braess paradox in detail for power grids, demonstrated it on a larger scale, and developed a prediction tool to support grid operators in decision-making. The researchers report in the journal Nature Communications. (DOI: 10.1038/s41467-022-32917-6)

The sustainable transformation of the energy system requires an expansion of the grids to integrate renewable sources and transport electricity over long distances. Such expansion calls for large investments and aims to make the grids more stable. However, upgrading existing lines or adding new ones can make the grid more unstable rather than more stable, which can result in power outages. "We then speak of the Braess paradox. This phenomenon states that an additional option leads to a worsening of the overall situation instead of an improvement," says Dr. Benjamin Schäfer, head of the Data-driven Analysis of Complex Systems (DRACOS) research group at the KIT Institute for Automation and Applied Informatics.

The phenomenon is named after the German mathematician Dietrich Braess, who first discussed it for road networks: under certain conditions, the construction of a new road can increase the travel time for all road users. This effect has been observed in traffic systems and discussed for biological systems. For power grids, it has so far only been predicted theoretically and illustrated on a very small scale.

Researchers Simulate German Power Grid Including Planned Expansions

Researchers led by Dr. Schäfer have now simulated the phenomenon in detail for power grids for the first time and demonstrated it on a larger scale. They simulated the German power grid, including planned reinforcements and expansions, and built a laboratory setup showing the Braess paradox in an AC grid, observing the phenomenon both in simulation and in experiment. Special emphasis was placed on circular flows, which are crucial to understanding the Braess paradox: a power line is improved, for example by reducing its resistance, and can then carry more current. "Due to conservation laws, this gives rise to a new circular flow, and more current then flows in some lines and less in others," Schäfer explains. "This becomes a problem when the most loaded line has to carry even more current, becomes overloaded, and eventually has to be shut down. This makes the grid more unstable and, in the worst case, it collapses."
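The mechanism can be illustrated with a minimal linearized (DC) power-flow calculation on a toy four-node grid. This sketch is not the study's model, which also treats AC dynamics; the network, susceptances, and injections below are invented purely for illustration.

```python
import numpy as np

def dc_flows(n, edges, P, slack=0):
    """Solve the linearized (DC) power-flow equations B*theta = P
    and return the flow on every line.

    n     -- number of nodes
    edges -- list of (i, j, b) tuples, b = line susceptance
    P     -- net power injection at each node (sums to zero)
    slack -- reference node whose phase angle is fixed at zero
    """
    B = np.zeros((n, n))
    for i, j, b in edges:
        B[i, i] += b
        B[j, j] += b
        B[i, j] -= b
        B[j, i] -= b
    keep = [k for k in range(n) if k != slack]
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], np.asarray(P, float)[keep])
    return {(i, j): b * (theta[i] - theta[j]) for i, j, b in edges}

# Toy grid: node 0 generates one unit of power, node 3 consumes it.
P = [1.0, 0.0, 0.0, -1.0]
base = [(0, 1, 1.0), (0, 2, 2.0), (1, 3, 2.0), (2, 3, 1.0)]
upgraded = base + [(1, 2, 1.0)]  # "expansion": a new line between nodes 1 and 2

for label, grid in (("before", base), ("after", upgraded)):
    flows = dc_flows(4, grid, P)
    worst = max(abs(f) for f in flows.values())
    print(f"{label}: max line loading = {worst:.3f}")
# before: max line loading = 0.500
# after : max line loading = 0.571
```

Here the added line creates a shortcut between two parallel paths; the resulting circular flow relieves two lines but raises the loading of the two busiest ones from 0.5 to about 0.57, so a line with a capacity just above 0.5 would now be overloaded even though the grid was "expanded".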

Intuitive Understanding Enables Fast Decisions

Most power grids have sufficient spare capacity to withstand the Braess paradox. When building new lines and during operation, grid operators examine all possible scenarios. However, when decisions have to be made at short notice, for example to shut down lines or shift power plant output, there is not always enough time to run through all scenarios. "Then you need an intuitive understanding of circular flows to be able to assess when the Braess paradox occurs and thus make the right decisions quickly," says Schäfer. Together with an international and interdisciplinary team, the scientist has therefore developed a prediction tool to help grid operators take the Braess paradox into account in their decisions. “The results of the research have enabled a theoretical understanding of the Braess paradox and provided practical guidelines for planning grid expansions sensibly and supporting grid stability,” Schäfer says.

Original Publication

Benjamin Schäfer, Thiemo Pesch, Debsankha Manik, Julian Gollenstede, Guosong Lin, Hans-Peter Beck, Dirk Witthaut, and Marc Timme: Understanding Braess' Paradox in power grids. Nature Communications, 2022. DOI: 10.1038/s41467-022-32917-6, https://www.nature.com/articles/s41467-022-32917-6

More about the KIT Energy Center

Contact for This Press Release
Sandra Wiebe, Press Officer, phone: +49 721 608-41172, email: sandra.wiebe@kit.edu

Being “The Research University in the Helmholtz Association”, KIT creates and imparts knowledge for society and the environment. Its objective is to make significant contributions to the global challenges in the fields of energy, mobility, and information. To this end, about 9,800 employees cooperate in a broad range of disciplines in the natural sciences, engineering sciences, economics, and the humanities and social sciences. KIT prepares its 22,300 students for responsible tasks in society, industry, and science by offering research-based study programs. Innovation efforts at KIT build a bridge between important scientific findings and their application for the benefit of society, economic prosperity, and the preservation of our natural basis of life. KIT is one of the German universities of excellence.

This press release is available on the Internet at http://www.kit.edu/kit/english/press_releases.php

Why is there a genetic risk for brain disorders? Neandertal DNA may provide some answers

Peer-Reviewed Publication

ESTONIAN RESEARCH COUNCIL

It has been known for a long time that human brain disorders such as neurological or psychiatric diseases run in families, suggesting some heritability. In line with this hypothesis, genetic risk factors for developing these illnesses have been identified. However, fundamental questions about the evolutionary drivers have remained elusive. In other words, why are genetic variants that increase the risk for diseases not eliminated in the course of evolution?

To answer these questions has been notoriously difficult. However, new discoveries about events in the deep human past have handed scientists new tools to start to unravel these mysteries: when modern humans moved out of Africa more than 60,000 years ago, they met and mixed with other archaic humans such as Neandertals. Around 40% of the Neandertal genome can still be found in present-day non-Africans, and each individual still carries about 2% of Neandertal DNA. Some of the archaic genetic variants may have conferred benefits at some point in our evolutionary past. Today, scientists can use this information to learn more about the impact of these genetic variants on human behaviour and the risk of developing diseases.

Using this approach, a new study from an international team led by researchers from the University of Tartu, Charité Berlin and the Amsterdam UMC analysed associations between Neandertal DNA and more than a hundred brain disorders and related traits, such as sleep, smoking and alcohol use, in the UK Biobank, with the aim of narrowing down the specific contribution of Neandertal DNA to variation in behavioural features in people today.
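A biobank-scale analysis of this kind boils down to many single-variant association tests. The sketch below shows the general shape of one such test; the file, column names, and covariates are hypothetical, and this is not the authors' actual pipeline.

```python
# Illustrative single variant-trait association test; all data and column
# names here are hypothetical placeholders, not the study's real inputs.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("biobank_phenotypes_genotypes.csv")  # hypothetical file

# 'archaic_dosage' = 0/1/2 copies of a Neandertal-derived allele;
# 'smoker' = 1 for a positive smoking status, 0 otherwise.
model = smf.logit(
    "smoker ~ archaic_dosage + age + sex + PC1 + PC2 + PC3",  # PCs adjust for ancestry
    data=df,
).fit()
print(model.summary())  # the archaic_dosage coefficient is the association of interest
```

In practice such a test is repeated across many archaic variants and traits, with multiple-testing correction, and hits are then checked for replication in independent cohorts.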

The study found that while Neandertal DNA showed a disproportionately high number of associations with several traits that are linked to central nervous system diseases, the diseases themselves did not show significant numbers of Neandertal DNA associations. Among the traits with the strongest Neandertal DNA contribution were smoking habits, alcohol consumption and sleeping patterns. Several of these results could be replicated using data from other cohorts such as the Estonian Biobank, the Netherlands Study of Depression and Anxiety, FinnGen, Biobank Japan and deCODE. Of specific note were two independent top-risk Neandertal variants for a positive smoking status, found in the UK Biobank and Biobank Japan respectively.

“Our results suggest that Neandertals carried multiple variants that substantially increase the smoking risk in people today. It remains unclear what phenotypic effects these variants had in Neandertals. However, these results provide interesting candidates for further functional testing and will potentially help us in the future to better understand Neandertal-specific biology,” said Michael Dannemann, associate professor of evolutionary genomics at the University of Tartu and the lead author of this study.

“The significant associations of Neandertal DNA with alcohol and smoking habits might help us to unravel the evolutionary origin of addictive and reward-seeking behaviour,” added Stefan M Gold, professor of neuropsychiatry at Charité, Berlin, who co-led this study. “It is important to note that sleep problems, alcohol and nicotine use have consistently been identified as common risk factors for a range of neurological and psychiatric disorders. On the other hand, there are some intriguing findings from anthropology that have suggested some social benefits of higher tolerance to these substances in hunter-gatherers. Thus, our findings support the hypothesis that it is not brain diseases themselves that have evolutionary explanations but that natural selection shapes traits that make us vulnerable to them in the modern context.”

“Neandertals had populated parts of Eurasia more than 100,000 years before modern humans left Africa to populate the rest of the world. The high frequency of some of the variants that are associated with varying sleeping patterns might suggest that these have been advantageous outside of Africa – an environment defined, for example, by different levels of seasonality and UV light exposure than the environment that modern humans evolved in,” added Dannemann.

Liver cancer cases and deaths projected to rise by more than 55% by 2040


To avoid this increase, countries must achieve at least a 3% annual decrease in liver cancer incidence and mortality rates, according to a new report published in the Journal of Hepatology


Peer-Reviewed Publication

ELSEVIER

Image: Main findings on the global burden of liver cancer in 2020 and predictions to 2040. (Credit: Journal of Hepatology)

Amsterdam, October 6, 2022 – A new analysis reveals that primary liver cancer was among the top three causes of cancer death in 46 countries in 2020, and that the number of people diagnosed with or dying from primary liver cancer per year could rise by more than 55% by 2040. In a new study in the Journal of Hepatology, published by Elsevier, investigators call for efforts to control the disease to be prioritized.

“Liver cancer causes a huge burden of disease globally each year,” commented senior author Isabelle Soerjomataram, MD, PhD, International Agency for Research on Cancer (IARC/WHO), Cancer Surveillance Branch, Lyon, France. “It is also largely preventable if control efforts are prioritized — major risk factors include hepatitis B virus, hepatitis C virus, alcohol consumption, excess body weight, and metabolic conditions including type 2 diabetes.”

“In light of the availability of new and improved global cancer incidence and mortality estimates, we wanted to provide the most up-to-date assessment of the burden of liver cancer and develop an essential tool for national liver cancer control planning,” explained lead author Harriet Rumgay, PhD candidate, International Agency for Research on Cancer (IARC/WHO), Cancer Surveillance Branch, Lyon, France. “In this analysis we describe where liver cancer ranks among all cancer types for cancer diagnoses and deaths in nations across the world. We also present predictions of the future liver cancer burden to 2040.”

Investigators extracted data on primary liver cancer cases and deaths from the International Agency for Research on Cancer’s GLOBOCAN 2020 database, which produces cancer incidence and mortality estimates for 36 cancer types in 185 countries worldwide. The predicted change in the number of cancer cases or deaths by the year 2040 was estimated using population projections produced by the United Nations.

Results showed that in 2020, an estimated 905,700 individuals were diagnosed with liver cancer and 830,200 died from liver cancer globally. According to these data, liver cancer is now among the top three causes of cancer death in 46 countries and is among the top five causes of cancer death in nearly 100 countries including several high-income countries.

Liver cancer incidence and mortality rates were highest in Eastern Asia, Northern Africa, and South-Eastern Asia. Investigators predict the annual number of new cases and deaths from liver cancer will rise by more than 55% over the next 20 years, assuming current rates do not change. The predicted rise in cases will increase the need for resources to manage care of liver cancer patients.

The researchers were alarmed to find that the number of cases and deaths from liver cancer will continue to increase year on year. They caution that in order to avoid this rise in cases and deaths, countries across the world must achieve at least a 3% annual decrease in liver cancer incidence and mortality rates through preventive measures.
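A rough back-of-envelope check shows why a figure of around 3% is plausible. The sketch below treats the projected 55% rise as purely demographic (population growth and ageing with rates frozen), which is a simplification of the report's more detailed methodology.

```python
# Rough arithmetic only; assumes the projected 55% rise is driven purely by
# population growth and ageing while incidence rates stay frozen.
demographic_growth = 1.55   # projected 2040 caseload / 2020 caseload at constant rates
annual_rate_change = 0.97   # a 3% annual decrease in incidence and mortality rates
years = 20                  # 2020 -> 2040

relative_2040_caseload = demographic_growth * annual_rate_change ** years
print(f"2040 caseload relative to 2020: {relative_2040_caseload:.2f}")  # ~0.84

# Smallest annual decline that would merely hold the caseload level:
breakeven = 1 - demographic_growth ** (-1 / years)
print(f"break-even annual decline: {breakeven:.1%}")                    # ~2.2%
```

On these assumptions, a sustained 3% annual decline in rates more than offsets the projected demographic pressure, while roughly 2.2% per year would only hold the caseload constant.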

These estimates provide a snapshot of the global burden of liver cancer and demonstrate the importance of improving and reinforcing liver cancer prevention measures.

“We are at a turning point in liver cancer prevention as successes in hepatitis B virus and hepatitis C virus control efforts will be reflected in rates of liver cancer in the next few decades,” noted Dr. Soerjomataram. “These efforts must be sustained and reinforced especially considering the disruption caused by the COVID-19 pandemic on certain hepatitis B and C virus control efforts."

The authors call for public health officials to prepare for the predicted increase in demand for resources to manage the care of liver cancer patients throughout the cancer pathway, including improved access to palliative care due to the predicted growing number of liver cancer patients, and to reinforce current liver cancer prevention measures such as immunization, testing, and treatment for hepatitis B virus; population-wide testing and treatment for hepatitis C virus infection; reduction of population alcohol consumption; and curbing the rise in diabetes and obesity prevalence.

“The number of people diagnosed with or dying from liver cancer per year could increase by nearly 500,000 cases or deaths by 2040 unless we achieve a substantial decrease in liver cancer rates through primary prevention,” concluded Dr. Soerjomataram.

 

New research identifies lack of appropriate control tools for many major infectious diseases of animals

New research published in The Lancet Planetary Health has identified a lack of appropriate control tools for many infectious diseases of animals that can have a significant impact upon the UN Sustainable Development Goals (SDGs).

Peer-Reviewed Publication

CABI

International efforts should focus on developing control tools for a range of priority infectious diseases of animals, including Nipah virus infection, African swine fever, foot and mouth disease and bovine tuberculosis, scientists say, but progress is needed across a wide range of zoonotic, endemic and epidemic (including pandemic) diseases to secure a healthy planet for humans, animals and the environment.

The study, led by Dr Johannes Charlier, project manager of DISCONTOOLS, and including an international team of animal health experts, assessed the current state of available control tools for 53 major infectious diseases of animals.

The researchers found that while easy-to-use and accurate diagnostics are available for many animal diseases, there is an urgent need for stable and durable diagnostics that can differentiate infected animals from vaccinated animals and assess other disease characteristics such as transmissibility and impact on animal productivity and welfare.

They add that there is also a pressing need to exploit rapid technological advances and to make diagnostics widely available and affordable. On the vaccine side, the scientists call for further research to improve convenience of use and duration of immunity, and to establish effective marker vaccines.

The research highlights that the largest gap in animal pharmaceuticals is the threat of pathogens developing resistance to available drugs – particularly for bacterial and parasitic (protozoal, helminth and arthropod) pathogens.

Dr Charlier and his fellow researchers propose five research priorities for animal health that will help deliver a sustainable and healthy planet. They are vaccinology, antimicrobial resistance, climate mitigation and adaptation, digital health and epidemic preparedness.

Dr Charlier said, “Animal health is a prerequisite for global health, economic development, food security, food quality and poverty reduction while mitigating against climate change and biodiversity loss.

“If we are to achieve the SDGs, further research into appropriate control tools is needed to reduce the burden of animal diseases, including zoonoses, and to manage emerging diseases, pandemic threats and antimicrobial and antiparasitic resistance.”

The scientists used DISCONTOOLS – an open-access database and key resource for the STAR-IDAZ International Research Consortium, as well as for other funders of animal health research including trusts and industry bodies – to assess the current state of appropriate control tools for 53 major infectious diseases of animals.

DISCONTOOLS identifies the gaps in knowledge needing to be addressed to speed up the development of new DISease CONtrol TOOLS (diagnostics, vaccines and pharmaceuticals) and reduce the burden of animal diseases. This delivers benefits in terms of animal health and welfare, public health and a safe and secure food supply.

The DISCONTOOLS resource was then used to prioritise the list of infectious animal diseases where appropriate control tools are lacking and where addressing this need would have the greatest impact towards achieving the relevant SDGs.

Dr Charlier added, “To achieve maximal impact, it is important to devote appropriate attention to epidemic, zoonotic and endemic diseases alike. While epidemic diseases attract a lot of attention because of their sudden and devastating impact, the huge impact of more chronic diseases is less visible and hence often forgotten.

“Prevention of these diseases will not only require development of new technologies, but also sustained investment in diagnostic networks and research infrastructures, supply chains, capacity building, and international, trans-sectoral coordination.”

Roxane Feller, secretary general of AnimalhealthEurope (the trade association of the animal medicines industry) and management board member of DISCONTOOLS, supports the study and added, “The potential for transfer of infectious diseases between animals and people is a One Health challenge recognised at the highest level, signalling that it is high time for all of us to move from firefighting to fire prevention.

“The impacts of animal disease stretch even further beyond public health, from devastating socio-economic effects for those who rely on livestock for income, to negative environmental effects through feed used and emissions created with no food output.

“Through public and private investment in innovative early research, the animal health industry as a whole can focus on unlocking the secrets needed to develop new generations of vaccines, diagnostics and other therapies to prevent animal disease and avoid these negative effects.”

Alex Morrow from STAR-IDAZ IRC, said, “Animal diseases are, in most cases, global problems and so need a focused global approach to understand and control them. To speed up the innovation pipeline from basic science to the required products it is important to work together internationally and along the research pipeline focusing resources in a coordinated way on the critical knowledge gaps and identified product needs: we can’t all do everything.”


Mother’s ultra-processed food intake linked to obesity risk in children


Dietary guidelines should be refined and financial and social barriers removed to improve nutrition for women of childbearing age, say researchers

Peer-Reviewed Publication

BMJ

A mother’s consumption of ultra-processed foods appears to be linked to an increased risk of overweight or obesity in her offspring, irrespective of other lifestyle risk factors, suggests a US study published by The BMJ today.

The researchers say further study is needed to confirm these findings and to understand the factors that might be responsible. 

But they suggest that mothers might benefit from limiting their intake of ultra-processed foods, and that dietary guidelines should be refined and financial and social barriers removed to improve nutrition for women of childbearing age and reduce childhood obesity.

According to the World Health Organization, 39 million children were overweight or obese in 2020, leading to increased risks of heart disease, diabetes, cancers, and early death. 

Ultra-processed foods, such as packaged baked goods and snacks, fizzy drinks and sugary cereals, are commonly found in modern Western style diets and are associated with weight gain in adults. But it’s unclear whether there’s a link between a mother’s consumption of ultra-processed foods and her offspring’s body weight.

To explore this further, the researchers drew on data for 19,958 children born to 14,553 mothers (45% boys, aged 7-17 years at study enrollment) from the Nurses’ Health Study II (NHS II) and the Growing Up Today Study (GUTS I and II) in the United States.

The NHS II is an ongoing study tracking the health and lifestyles of 116,429 US female registered nurses aged 25-42 in 1989. From 1991, participants reported what they ate and drank, using validated food frequency questionnaires every four years.

The GUTS I study began in 1996 when 16,882 children (aged 8-15 years) of NHS II participants completed an initial health and lifestyle questionnaire and were monitored every year between 1997 and 2001, and every two years thereafter.

In 2004, 10,918 children (aged 7-17 years) of NHS II participants joined the extended GUTS II study and were followed up in 2006, 2008, and 2011, and every two years thereafter.

A range of other potentially influential factors, known to be strongly correlated with childhood obesity, were also taken into account. These included mother's weight (BMI), physical activity, smoking, living status (with partner or not), and partner’s education, as well as children’s ultra-processed food consumption, physical activity, and sedentary time.

Overall, 2471 (12%) children developed overweight or obesity during an average follow-up period of 4 years.

The results show that a mother’s ultra-processed food consumption was associated with an increased risk of overweight or obesity in her offspring. For example, a 26% higher risk was seen in the group with the highest maternal ultra-processed food consumption (12.1 servings/day) versus the lowest consumption group (3.4 servings/day).

In a separate analysis of 2790 mothers and 2925 children with information on diet from 3 months pre-conception to delivery (peripregnancy), the researchers found that peripregnancy ultra-processed food intake was not significantly associated with an increased risk of offspring overweight or obesity.

This is an observational study, so it can’t establish cause, and the researchers acknowledge that some of the observed risk may be due to other unmeasured factors and that self-reported diet and weight measures might be subject to misreporting.

Other important limitations include the loss of some offspring participants to follow-up, which left a few of the analyses underpowered, particularly those related to peripregnancy intake. In addition, mothers were predominantly white and from similar social and economic backgrounds, so the results may not apply to other groups.

Nevertheless, the study used data from several large ongoing studies with detailed dietary assessments over a relatively long period, and further analysis produced consistent associations, suggesting that the results are robust.

The researchers do not propose a clear mechanism underlying these associations and say the area warrants further investigation.

Nevertheless, these data “support the importance of refining dietary recommendations and the development of programs to improve nutrition for women of reproductive age to promote offspring health,” they conclude.

 

Less than a third of FDA regulatory actions backed by research or public assessments

Regulators must be fully transparent about drug safety to ensure public trust in medicines, say experts

Peer-Reviewed Publication

BMJ

Less than a third of regulatory actions taken by the US Food and Drug Administration (FDA) are corroborated by published research findings or public assessments, finds a study published by The BMJ today.

The researchers say their findings, based on analysis of drug safety signals identified by the FDA from 2008 to 2019, suggest that either the FDA is taking regulatory actions based on evidence not made publicly available, or that more comprehensive safety evaluations might be needed when potential safety signals are identified.

Monitoring the safety of a medicine once it is available to patients (known as post-marketing pharmacovigilance) is essential because rare or long-term harms often emerge only after approval.

The FDA receives more than two million adverse event reports annually through its FDA Adverse Event Reporting System (FAERS) and reviews all potential safety signals to determine whether regulatory action is needed.

In 2007, the FDA Amendments Act required the FDA to publish quarterly reports of safety signals from FAERS, providing an opportunity to examine them to better understand this pharmacovigilance system.

A team of US researchers therefore decided to analyse safety signals identified within the FAERS database. They asked how often these signals resulted in regulatory actions and whether they were corroborated by additional research.

They found that from 2008 to 2019 the FDA reported 603 potential safety signals identified from the FAERS, of which about 70% were resolved; nearly 80% of the resolved signals led to regulatory action, most often changes to drug labeling.

In a separate in-depth analysis of 82 potential safety signals reported in 2014-15, at least one relevant study was found in the literature for about 75% of the signals but most of these studies were case reports or case series. 

However, less than a third (30%) of regulatory actions were corroborated by at least one relevant published research study, and none were corroborated by a public assessment from the FDA's Sentinel Initiative.

These are observational findings, and the researchers acknowledge some important limitations. For example, they did not evaluate regulatory actions taken in other countries in response to these safety signals, which might have informed the FDA’s actions, nor could they consider unpublished studies or other data accessible to the agency but not publicly available.

Nevertheless, they say these findings “highlight the continued need for rigorous post-market safety studies to strengthen the quality of evidence available at the time of regulatory action, as well as the importance of ongoing efforts to leverage real world data sources to evaluate and resolve signals identified from the FAERS and support FDA regulatory decisions.”

In a linked editorial, experts argue that regulators should publish all evidence underlying their responses to drug safety signals to reduce harm and ensure public trust in medicines.

The covid-19 pandemic has exposed the tension between regulatory decision making and the public’s right to know about serious risks associated with medical interventions, they write. This same tension exists more broadly in medicine safety.

“Safety signals are an important step, but radical transparency about available evidence and the basis for regulatory judgments is needed to reduce harm caused by medicines, as is adequate follow-up to ensure safer use,” they conclude.

‘Authentic leaders’ could inspire athletes to curb aggression - study


Peer-Reviewed Publication

UNIVERSITY OF BIRMINGHAM

Sports coaches who display ‘authentic leadership’ qualities could find their athletes are less likely to act aggressively towards competitors, a new study reveals.

Researchers found such leadership could also enhance sport enjoyment and commitment – both vital qualities in sport as they can influence athletes’ continued participation, which tends to decline as sportspeople get older.

Publishing their findings today in Sport, Exercise, and Performance Psychology, experts from the Universities of Birmingham and Suffolk reveal that athletes training with coaches who display the attributes of an ‘authentic leader’ are less likely to act aggressively toward other players by committing intentional fouls and risking injuring their opponents.

Authentic leadership comprises four components:

  • Self-awareness – showing an understanding of one’s strengths and weaknesses and being aware of one’s impact on others;
  • Relational transparency – expressing one’s true thoughts and feelings, while minimising the expression of inappropriate emotions;
  • Balanced processing of information – objectively considering all relevant information, including followers’ perspectives, before making a decision; and
  • Internalised moral perspective – exhibiting behaviours that are in line with one’s high moral standards, rather than being influenced by external pressures, thereby behaving ethically in one’s interactions with others.

Co-author Professor Maria Kavussanu, from the University of Birmingham, commented: “Coaches are vital in influencing athletes’ development and must be encouraged to show high authentic leadership - being open with their athletes and including them in decision making, whilst behaving ethically, admitting to their mistakes, and speaking honestly.

“Our study demonstrates that if a coach displays the attributes of an authentic leader this could have a positive impact on their athletes - increasing athletes’ trust, commitment, and enjoyment, and decreasing aggression.

“Sport enjoyment is particularly important for continued participation in sport, which tends to decline with age. As such, coaches who display authentic behaviours can increase their athletes’ enjoyment, with significant positive implications for athletes’ physical and mental well-being.”

In the first study of its kind, a total of 129 participants (76 of whom were women) took part. All were sport science students at a British university and amateur athletes competing at a regional level. Using an experimental vignette methodology, researchers examined the effects of authentic leadership on athletes’ trust, enjoyment, commitment, and a range of morally relevant variables - aggression, cheating, and guilt for cheating and aggression.

Co-author Ella Malloy, from the University of Suffolk, commented: “When a coach demonstrates the attributes of an authentic leader, athletes are more likely to trust the coach and want to continue competing for them. In contrast, a coach displaying the behaviours of a non-authentic leader could diminish trust, enjoyment, and commitment among the athletes who train under them.”

ENDS

For more information and an embargoed copy of the research paper, please contact Tony Moran, International Communications Manager, University of Birmingham on +44 (0)782 783 2312 or t.moran@bham.ac.uk . For out-of-hours enquiries, please call +44 (0) 7789 921 165.

Note for Editors

  • The University of Birmingham is ranked amongst the world’s top 100 institutions. Its work brings people from across the world to Birmingham, including researchers, teachers, and more than 6,500 international students from over 150 countries.
  • ‘The Effects of Authentic Leadership on Athlete Outcomes: An Experimental Study’ by Ella Malloy, Maria Kavussanu, and Thomas Mackman is published in Sport, Exercise, and Performance Psychology.

The costs of caring for a graying population

Researchers from the University of Tsukuba uncover intriguing differences in spending on long-term care between Japan's municipalities

Peer-Reviewed Publication

UNIVERSITY OF TSUKUBA

Tsukuba, Japan—With the "graying population" phenomenon becoming widespread, many countries are facing the challenge of caring for their elderly population. In Japan, the country with the oldest population, a universal long-term care (LTC) insurance system was established in 2000 to help meet this need. Now, researchers from the University of Tsukuba have revealed the considerable variation between Japan's municipalities in spending on LTC.

In Japan, everyone aged 65 years or older, as well as anyone aged 40 or older with a health-related disability, is eligible to receive LTC. Home-based, community, and facility-based care services are provided by the municipalities, which also act as insurers. Individuals with disabilities are classified on a scale ranging from "Care Level 1" for light disabilities to "Care Level 5" for severe disabilities.

"We were interested in seeing how much variation there is between municipalities in terms of their spending on LTC," says lead author of the study Professor Nanako Tamiya. "As health inequalities are currently widening in Japan, it is essential for us to understand how spending on LTC varies across the country."

To do this, the researchers examined municipality-level data from across Japan. They calculated per-capita spending on LTC and ran a series of statistical tests to determine which factors were behind the differences in spending they observed.
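In outline, such an analysis pairs a per-capita spending measure with municipality-level explanatory factors. The minimal sketch below illustrates the general approach; the file name, column names, and choice of regressors are hypothetical, not the study's actual data or model.

```python
# Illustrative sketch only; the data file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("municipal_ltc.csv")  # one row per municipality

# Per-capita spending: total LTC benefit payments over the insured population.
df["spend_per_capita"] = df["ltc_spending_total"] / df["population_65_plus"]

# Regress spending on demand-side factors like those the authors highlight.
model = smf.ols(
    "spend_per_capita ~ share_over_85 + share_certified_ltc + share_high_care_level",
    data=df,
).fit()
print(model.summary())  # coefficients show how each factor relates to spending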

"At first glance, the differences between municipalities were surprisingly large," notes co-author Dr. Xueying Jin. "We found that the per-capita spending on LTC was four times higher in the highest-spending municipalities than in the lowest. We also noticed a 'west high, east low' trend, with higher spending in western Japan."

However, the researchers also found that the levels of supply and demand (e.g., the numbers of facilities and candidates for LTC) and other structural factors, which were obtained from publicly available municipality-level statistical sources, mostly explained the regional differences.

"We found that demand factors such as the proportion of the population over 85 years of age, the proportion of individuals certified as needing LTC, and the proportion of individuals who had been classified as belonging to a high care level played the greatest roles in driving up per-capita spending," says Associate Professor Masao Iwagami, another of the study's authors.

One of the study's major implications is that preventive efforts may help lower per-capita spending on LTC. Measures aimed at helping healthy individuals avoid developing lifestyle-related health problems should lead ultimately to lower numbers of individuals who require LTC. For those classified as having a disability, preventing their condition from deteriorating should result in lower numbers of individuals requiring high levels of care, and thus lower costs.

###
This study was supported by a grant-in-aid from the Japan Society for the Promotion of Science (JP22K17299). The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Original Paper:

The article, "Regional variation in long-term care spending in Japan," was published in BMC Public Health at DOI: https://doi.org/10.1186/s12889-022-14194-6

Correspondence:

Visiting researcher JIN Xueying
Associate Professor IWAGAMI Masao
Professor TAMIYA Nanako

Research and Development Center for Health Services, Faculty of Medicine, University of Tsukuba

Related Link:

Faculty of Medicine

Research and Development Center for Health Services

 

On the fence: New research taps rancher expertise on living with carnivores

Peer-Reviewed Publication

S.J. & JESSIE E. QUINNEY COLLEGE OF NATURAL RESOURCES, UTAH STATE UNIVERSITY

Image: A well-designed fence can help to prevent conflicts with carnivores, but with so many options for material, placement, and logistics, researchers can struggle to identify which strategies have the best chance of success. They turned to ranchers for help. (Credit: Jan Canty)

They say that good fences make good neighbors — especially true when you share space with gray wolves and grizzly bears.

In places like Wyoming and Idaho, ranchers have learned practical fencing strategies that help to reduce ill-fated encounters between hungry wildlife, vulnerable livestock and valuable produce. According to new research, Utah State University (USU) researchers are learning to take advantage of this hard-won knowledge.

“Research about wildlife fencing is often missing on-the-ground knowledge,” said Julie Young of the Department of Wildland Resources and Ecology Center in the Quinney College of Natural Resources. “We wanted to reduce the cost and social burden of living with recovering wildlife populations, but we needed rancher input to do that.”

Given all possible options for fencing material, placement and logistics, the team wanted to zero in on strategies that had the best chance for success. They turned to the ranchers who have worked for decades in the “trenches” of wildlife conflict to help.

Young organized a group that included livestock producers, natural resource managers and university-based researchers. They met for four months — early in the morning to accommodate producers’ crack-of-dawn schedules. Participants were exposed to the reality of fencing designs and considerations across different scales: from hobby farms to orchard and apiary protection, to large cow-calf operations. The researchers learned about regulatory implications and obstacles to fencing on certain rangelands, which informed how they thought about adoption and the practicality of their research.

Once the research project began to take shape, they took the plan back to the ranchers for feedback.

“Our original design looked just at the effectiveness of fencing designs for preventing conflicts with agriculture or livestock. Concerns about human safety were something we initially overlooked,” said Rae Nickerson, coauthor on the research and a Ph.D. student in the Department of Wildland Resources and Ecology Center.

But fencing projects are often located near homestead areas, the researchers learned, and human safety was an issue important to the group.

“Some of the new things we learned from the process required flexibility in the process,” Nickerson said. “But it offered a unique way to prioritize our approach. It really took advantage of a diverse set of knowledge and experience.”

Researchers involved in creating preventative strategies for wildlife often view the issue from the perspective of the ecology of carnivores, but that’s not the sole priority of most producers. The researchers learned that they needed to integrate not just how and where fences were effective, but also how to make funding opportunities and paperwork more flexible for the producers. Ranching operations near increasing populations of large carnivores need information quickly, they said, before problems get out of control.

The group also planned strategies to get the word out about what ended up working, once the research was complete.

“Often the most promising and innovative tools aren’t circulated to managers and ranchers because they aren’t recorded or widely shared,” Young said. “The people who discover new and innovative tools to keep wildlife predators separate from livestock, grain storage and beehives often don’t have good ways to communicate their successes to others.”

Word of mouth can work, she said, but many of these folks are geographically separated from other producers confronted with the very same challenges. The team’s research will continue to look at the efficacy of using nonlethal tools to reduce wildlife conflicts and ways to disseminate best practices to more livestock owners.