Sunday, June 02, 2024

 

High groundwater depletion risk in South Korea in 2080s




POHANG UNIVERSITY OF SCIENCE & TECHNOLOGY (POSTECH)

Image: Occurrence years of deep-level groundwater depletion (unprecedented groundwater levels) on the Korean Peninsula. (Credit: POSTECH)




Groundwater is the water found beneath the Earth's surface. It forms when precipitation such as rain and snow seeps into the soil, and it replenishes rivers and lakes and supplies much of our drinking water. However, a recent study has alarmed the scientific community by predicting that approximately three million people in currently untapped areas of Korea could face groundwater depletion by 2080.

 

A research team led by Professor Jonghun Kam of the Division of Environmental Science and Engineering and Dr. Chang-Kyun Park of the Institute of Environmental and Energy Technology (now at LG Energy Solution) at Pohang University of Science and Technology (POSTECH) used an advanced statistical method to analyze shallow and deep groundwater level data from 2009 to 2020, revealing critical spatiotemporal patterns in groundwater levels. Their findings were published in the international journal “Science of the Total Environment.”

 

Groundwater is crucial for ecosystems and socioeconomic development, particularly in mountainous regions where water systems are limited. However, recent social and economic activities along with urban development have led to significant groundwater overuse. Additionally, rising land temperatures are altering regional water flows and supplies, necessitating water policies that consider both natural and human impacts to effectively address climate change.

 

In the study, the researchers applied an advanced statistical method, cyclostationary empirical orthogonal function (CSEOF) analysis, to water level data from nearly 200 shallow and deep groundwater monitoring stations in the southern Korean Peninsula from 2009 to 2020. This analysis helped them identify important spatiotemporal patterns in groundwater levels.
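
CSEOF analysis extends conventional empirical orthogonal function (EOF) analysis by letting the spatial patterns themselves evolve over a nested cycle such as the seasons. As a rough illustration of the underlying idea only (not the authors' pipeline), the sketch below decomposes a hypothetical stations-by-months matrix of groundwater-level anomalies into spatial modes and principal-component time series with a plain EOF/SVD decomposition; the array `levels` is a random placeholder for real station data.

```python
# Rough illustration of the idea behind CSEOF/EOF analysis (not the authors'
# pipeline): decompose a stations-by-months matrix of groundwater-level
# anomalies into spatial modes and principal-component time series. CSEOF
# additionally lets the spatial patterns vary over a nested (e.g. annual)
# cycle, a refinement omitted here.
import numpy as np

# Hypothetical input: ~200 stations x 144 monthly observations (2009-2020).
rng = np.random.default_rng(0)
levels = rng.normal(size=(200, 144))                       # placeholder for real data

anomalies = levels - levels.mean(axis=1, keepdims=True)    # remove each station's mean

# SVD: columns of U are spatial patterns (EOFs); rows of Vt are the
# corresponding principal-component time series.
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)

explained = s**2 / np.sum(s**2)                            # variance fraction per mode
print("Variance explained by the first three modes:", explained[:3].round(3))

# The leading time series Vt[0], Vt[1], Vt[2] would then be inspected for
# seasonal cycles, drought signatures and long-term declines, as in the study.
```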

 

The first and second principal components revealed that water level patterns mirrored recurring seasonal changes and droughts. Shallow-level groundwater is more sensitive to the seasonality of precipitation than to drought occurrence, whereas deep-level groundwater is more sensitive to drought occurrence than to precipitation seasonality. This indicates that both shallow-level and deep-level groundwater are crucial for meeting community water needs and mitigating drought effects.

 

The third principal component highlighted a decline in groundwater levels in the western Korean Peninsula since 2009. The researchers projected that if this decline in deep groundwater continues, at least three million people in untapped or newly developed areas, primarily in the southwestern part of the peninsula, could face unprecedentedly low groundwater levels as a new normal (defined in the study as groundwater depletion) by 2080. If the research team's predictions are correct, the impact would be particularly severe in drought-prone, untapped areas where groundwater is heavily relied upon.
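
To make the notion of an unprecedented groundwater level becoming the new normal concrete, the toy sketch below (our illustration, not the study's method; all numbers are hypothetical) extrapolates a declining water-level trend and reports the first year the projection falls below the lowest level in the observed record.

```python
# Toy illustration of "unprecedented groundwater level": extend a declining
# trend and find the first projected year below the lowest observed level.
# All numbers are hypothetical.
import numpy as np

years = np.arange(2009, 2021)
rng = np.random.default_rng(1)
observed = 15.0 - 0.05 * (years - 2009) + rng.normal(0, 0.1, years.size)  # water-table height (m)

slope, intercept = np.polyfit(years, observed, 1)   # fit the observed decline
record_low = observed.min()                         # lowest level ever observed

future = np.arange(2021, 2101)
projected = slope * future + intercept
first_unprecedented = future[projected < record_low][0]
print(f"Projected to fall below the historical minimum in {first_unprecedented}")
```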

 

Professor Jonghun Kam of POSTECH stated, “By leveraging long-term, multi-layer groundwater level data on Korea and advanced statistical techniques, we successfully analyzed the changing patterns of deep- and shallow-level groundwater levels and predicted the risk of groundwater depletion.” He added, “An integrated national development plan is essential, one that considers not only regional development plans but also balanced water resource management plans.”

 

The research was sponsored by the Joint Research and Technology Development Project for Disaster Safety of the Ministry of the Interior and Safety and the Basic Research Program of the National Research Foundation of Korea.

 

Indian Ocean surface temperature could help anticipate dengue outbreaks



INSTITUT PASTEUR





Although dengue outbreaks cannot be prevented, it is possible to anticipate them. An international research team including scientists from the Institut Pasteur and Beijing Normal University in China has recently identified a global climate indicator that may help improve predictions about the magnitude of dengue outbreaks several months in advance. This indicator, which can be used for any world region, is based on temperature fluctuations at the surface of the Indian Ocean. Obtaining reliable long-term predictions could facilitate efforts to tackle this infection, which has been on the rise for several decades and threatens half of the world's population. The research results were published in the journal Science on May 10, 2024.

The ability to anticipate dengue outbreaks is crucial in planning control measures against the mosquitoes that transmit the disease and mobilizing hospital staff and equipment. And this is vitally important, because although vaccines have been developed, there is no specific treatment for the symptoms of this "tropical influenza," and the severity of outbreaks can vary hugely from one year to the next. "The dynamics of dengue are complex because the virus responsible for infection exists in four different forms, or serotypes, which can change from one year to the next and one country to the next," explains Simon Cauchemez, joint last author of the study and Head of the Mathematical Modeling of Infectious Diseases Unit at the Institut Pasteur. "So the magnitude of dengue outbreaks can vary considerably from one season to the next."

We now know that the reproduction and infectivity of Aedes genus mosquitoes that transmit dengue viruses, and by extension the transmission rate and scale of outbreaks, are closely correlated with local temperature and rainfall. But these parameters can only be predicted between two weeks and three months in advance, and the quality of forecasts decreases rapidly for longer-term predictions. Global climate indicators like El Niño–Southern Oscillation (ENSO) can generally be predicted over a longer period, beyond six months. An international team therefore decided to study 30 global climate indicators to determine whether monitoring such indicators could help predict dengue outbreaks further in advance.

The team compiled two large datasets: the total number of annual dengue cases reported in 46 countries in South-East Asia and America over 30 years (1990-2019), and also the monthly number of dengue cases in 24 countries over six years (2014-2019). They found that of all the indicators under consideration, the Indian Ocean basin-wide (IOBW) index, which measures temperature fluctuations at the surface of the Indian Ocean, was the global indicator that was most closely correlated to annual dengue incidence in both the northern and southern hemispheres.
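
As a simplified illustration of this kind of analysis (synthetic data, not the study's 46-country dataset or its model), the sketch below correlates an annual climate index standing in for the IOBW with annual dengue case counts for a single country.

```python
# Simplified illustration (synthetic data, not the study's): correlate an
# annual climate index standing in for the IOBW with annual dengue counts.
import numpy as np

years = np.arange(1990, 2020)
rng = np.random.default_rng(2)

iobw_index = rng.normal(0.0, 0.3, size=years.size)               # hypothetical SST anomalies (deg C)
dengue_cases = np.exp(10 + 1.5 * iobw_index                      # synthetic counts that rise with the index
                      + rng.normal(0, 0.4, size=years.size))

# Correlate the index with log annual incidence, a common choice for counts.
r = np.corrcoef(iobw_index, np.log(dengue_cases))[0, 1]
print(f"Pearson correlation with log annual dengue incidence: r = {r:.2f}")
```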

“Our results revealed that incorporating the IOBW index into our mathematical model resulted in predictions that closely matched real-world data, compared to the model excluding the IOBW index,” notes Huaiyu Tian, co-last author of the study and Director of the Center for Global Change and Public Health, Beijing Normal University. “The extended lead time and improved predictive ability underscore the significance of the IOBW index in dengue forecasting and early warning systems.”

These interesting theoretical findings will need to be validated in real conditions. "Our aim is to develop predictive models for dengue outbreaks in Guadeloupe, French Guiana and Martinique, and we will now explore whether the IOBW index can effectively improve these predictions," explains Simon Cauchemez. "Climate is not the only factor that influences dengue outbreaks. For these predictive models we will also need to take into account other factors like immunity levels in the population, strains previously in circulation, etc."

If the findings are borne out, the new indicator could help improve the prediction of dengue outbreaks, thereby also improving efforts to tackle this infectious disease, which affects tens of millions of people each year. The incidence of dengue has risen dramatically in recent years, including in mainland France, where imported cases have surged: between January 1 and April 19, 2024, 1,679 imported dengue cases were reported to Santé publique France, compared with 131 over the same period in 2023.

 

Risk of death from COVID-19 lessens, but infection still can cause issues 3 years later



Study also shows that patients hospitalized within 30 days after infection face 29% higher death risk in 3rd year compared with those not infected



WASHINGTON UNIVERSITY IN ST. LOUIS





New findings on long COVID — long-term effects on health experienced by many who have had COVID-19 — present a good-news, bad-news situation, according to a study at Washington University School of Medicine in St. Louis and the Veterans Affairs St. Louis Health Care system.

The bad news: COVID-19 patients who were hospitalized within the first 30 days after infection face a 29% higher risk of death in the third year compared with people who have not had the virus. However, the three-year death risk still marks a significant decline compared with such risk at the one- and two-year marks post-infection. The findings also show that even people with mild COVID-19 were still experiencing new health problems related to the infection three years later.

The good news: The increased risk of death diminishes significantly one year after a SARS-CoV-2 infection among people who were not hospitalized for the virus. This demographic accounts for most people who have had COVID-19.

The new research, published May 30 in Nature Medicine, tracked the virus’s health effects in people three years after being infected with the original strain of COVID-19 in 2020. That year, about 20 million people tested positive for the virus in the U.S. The new study assessed the risk of death and 80 adverse health conditions in people three years after being diagnosed with COVID-19.

“We aren’t sure why the virus’s effects linger for so long,” said senior author Ziyad Al-Aly, MD, a Washington University clinical epidemiologist and a global leader in long COVID research. “Possibly it has to do with viral persistence, chronic inflammation, immune dysfunction or all the above. We tend to think of infections as mostly short-term illnesses with health effects that manifest around the time of infection. Our data challenges this notion. I feel COVID-19 continues to teach us — and this is an important new lesson — that a brief, seemingly innocuous or benign encounter with the virus can still lead to health problems years later.”

Up to 10% of people infected with the virus experience long COVID, according to federal data.

Al-Aly’s prior research has documented COVID-19’s damage to nearly every human organ, contributing to diseases and conditions affecting the lungs, heart, brain, and the body’s blood, musculoskeletal and gastrointestinal (GI) systems.

Such studies with longer follow-up are limited, said Al-Aly, a nephrologist who treats patients at the Washington University-affiliated John J. Cochran Veterans Hospital in midtown St. Louis. “Addressing this knowledge gap is critical to enhance our understanding of long COVID and will help inform care for people suffering from long COVID.”

Al-Aly and his team analyzed millions of de-identified medical records in a database maintained by the U.S. Department of Veterans Affairs, the nation’s largest integrated health-care system. The study included more than 114,000 veterans with mild COVID-19 who did not require hospitalization; more than 20,000 hospitalized COVID-19 patients; and 5.2 million veterans with no COVID-19 diagnosis. Patients were enrolled in the study from March 1, 2020, to Dec. 31, 2020, and followed for at least three years, until Dec. 31, 2023. Patients included people of diverse ages, races and sexes; statistical modeling ensured parity in representation.

In the third year after infection, COVID-19 patients who had been hospitalized experienced a 34% elevated health risk across all organ systems compared with people who did not have COVID. That number is down from a 182% increased risk one year after a COVID infection and a 57% risk two years after.

Among nonhospitalized patients, researchers found a 5% increased risk of suffering from long COVID in the third year after infection. This translates into 41 more health problems per 1,000 persons – a small but not trivial burden. The long-term health effects in the third year primarily affected the GI, pulmonary and neurological systems. By comparison, the risk was elevated by 23% one year after infection and by 16% two years after.

In the analysis, researchers also measured and compared the number of healthy life-years lost due to COVID-19. They found that among the nonhospitalized, at three years after infection, COVID-19 had contributed to 10 lost years of healthy life per 1,000 persons. By comparison, three years post-infection, those hospitalized for COVID-19 had experienced 90 lost years of healthy life per 1,000 persons.

For context, in the U.S., heart disease and cancer each cause about 50 lost years of healthy life per 1,000 persons, while stroke contributes to 10 lost years of healthy life per 1,000 persons.

“That a mild SARS-CoV-2 infection can lead to new health problems three years down the road is a sobering finding,” said Al-Aly, who is also director of the Clinical Epidemiology Center at the VA St. Louis Health Care System, and head of the research and development service. “The problem is even worse for people with severe SARS-CoV-2 infection. It is very concerning that the burden of disease among hospitalized individuals is astronomically higher.”

“COVID-19 is a serious threat to the long-term health and well-being of people and it should not be trivialized,” he said.

The extended trajectory for long COVID may change as researchers incorporate data from years beyond 2020. At that time, vaccines and antivirals had not been developed. Similarly, Al-Aly’s analysis does not consider subsequent variants such as omicron or delta.

“Even three years out, you might have forgotten about COVID-19, but COVID hasn’t forgotten about you,” Al-Aly said. “People might think they’re out of the woods, because they had the virus and did not experience health problems. But three years after infection, the virus could still be wreaking havoc and causing disease or illness in the gut, lungs or brain.”

 

A multimodal approach to better predict recovery in patients with disorders of consciousness



INSTITUT DU CERVEAU (PARIS BRAIN INSTITUTE)
Image: Searching for signs of consciousness. (Credit: Nicolas Decat)





After a severe cranial trauma or cardiac arrest, some patients admitted to intensive care show little or no reaction to their environment—and are sometimes unable to communicate. This condition is called a disorder of consciousness (DoC), which includes comas, vegetative states and states of “minimal consciousness.”

This disorder sometimes persists for several days or weeks. In such cases, healthcare teams and relatives need the most accurate possible information on the patient's capacity for cognitive recovery. Usually, a neurological prognosis is established using several indicators—including standard measurements of brain anatomy (CT and MRI scans) and function (electroencephalogram).

“Despite having these data at our disposal, there remains a degree of uncertainty about the prognosis, which can significantly impact medical decision-making. These patients are often in a fragile state and prone to numerous complications, which raises questions about the appropriateness of the care they receive,” explains Benjamin Rohaut, neurologist, researcher and lead author of the study. “Moreover, doctors sometimes observe a discrepancy between the patient's behaviour and their brain activity: some patients in a vegetative state seem to understand what is being said to them but are unable to let their caregivers know.”

To improve the description of the state of consciousness of these patients, the “PICNIC team”, co-led by Lionel Naccache at the Paris Brain Institute, has been working for around fifteen years to define new brain measurements and clinical examination signs. Their approach has gradually evolved towards “multi-modality”, combining PET scans, multivariate EEG analysis, functional MRI, cognitive evoked potentials (electrical responses to sensory stimulation) and other tools.

Consciousness markers under scrutiny

To assess the clinical value of this approach, the team worked with the “Neurologically Oriented Intensive Care Unit” at the Pitié-Salpêtrière Hospital in Paris. Led by Benjamin Rohaut and Charlotte Calligaris, the clinicians and researchers followed and assessed 349 intensive-care patients between 2009 and 2021. At the end of each multimodal evaluation, they formulated a “good”, “uncertain”, or “unfavourable” prognostic opinion.

Their results indicate that patients with a “good prognosis” (22% of cases) showed a much more favourable evolution of their cognitive abilities than patients with a prognosis judged “uncertain” (45.5% of cases) or “unfavourable” (32.5% of cases); none of the patients assessed as “unfavourable” regained consciousness after one year. Above all, this prognostic performance was correlated with the number of modalities: the greater the number of indicators used, the greater the accuracy of the prognosis, and the greater the team's confidence in its assessments.

“This long-term study shows for the first time the benefit of the multimodal approach, which is essential information for intensive care units worldwide. It also provides empirical validation of the recent recommendations of the European and American Neurology Academies,” explains Jacobo Sitt, who co-supervised this study.

Towards a standardised neuroprognostic approach

However, the multimodal approach is not a magic wand. It provides the best possible information to caregivers and families in situations of uncertainty—an ethical advance in patient care—but does not guarantee bias-free decision-making.

Finally, there is the question of access to assessment tools, which are expensive and require specific expertise. “We are aware that multimodal assessment is not accessible to all the intensive care units that receive these patients”, continues Lionel Naccache. “We propose to build a network of collaborations at the national and European levels. Thanks to telemedicine tools and automated EEG or brain imaging analysis, all intensive care units could have a first level of access to multimodal assessment. Should this prove insufficient, recourse to a regional expert centre would provide a more in-depth assessment. Finally, in the most complex situations, it would be possible to call on all available experts, wherever they may be. We aim to ensure that all patients with a disorder of consciousness can benefit from the highest standards of neurological prognosis.”

 

HudsonAlpha researchers create valuable genomic tools for the cotton industry



HUDSONALPHA INSTITUTE FOR BIOTECHNOLOGY





We live in an ever-changing and growing world. Changing climates, emerging pests, and other environmental stressors put pressure on the cash crops that feed and fuel the world. As we race to meet the growing demand for sustainable and high-quality food and fiber crops, genomics is emerging as a powerful tool in the fight. By understanding plants’ genetic codes, researchers and breeders can develop crops with increased yields, improved resistance to pests and diseases, and greater adaptability to environmental challenges. 

Genome-informed breeding primarily benefits crops with existing high-quality genomic resources, like rice and wheat. However, crops with less mature genomic resources must continue to rely on traditional breeding methods, which sometimes suffer due to a lack of genomic diversity within the breeding populations. 

Cotton, a vital cash crop worldwide, lacks robust genomic resources. The cotton industry is big business, with a global economic impact of $600 billion, providing jobs for more than 250 million people. Successful cotton production relies on cotton varieties with desirable characteristics like high yield, good fiber quality, pest and disease resistance, and drought tolerance. 

“Cotton breeders have improved fiber yield and quality over the years using traditional breeding methods,” says Jeremy Schmutz, co-director of the HudsonAlpha Genome Sequencing Center, who has been working on cotton genomics for over a decade. “Achieving additional improvements may be difficult for them due to the lack of genetic variation across modern domesticated cotton. Creating new genomic tools for the industry will help take cotton improvements to the next level.” 

Scientists at the HudsonAlpha Institute for Biotechnology Genome Sequencing Center (GSC) and other collaborators set out to create high-quality genome sequences for three important cotton varieties, providing necessary genome resources for cotton breeders. The results were recently published in Nature Plants.

“Cotton research has relied heavily on one reference genome, ‘TM-1’, a variety of cotton that is no longer widely used in breeding programs,” says Avinash Sreedasyam, PhD, first author of the manuscript. “In order for molecular breeding to benefit the cotton industry, many varied genomes must exist to represent the diversity of cotton varieties. This study generated high-quality reference genomes for three modern upland cotton cultivars and updated the ‘TM-1’ cotton genetic standard reference.” 

Initial analysis of the new reference genomes produced important information about fiber quality. The highly accurate and complete genome assemblies were used to identify genetic material from Pima cotton (known for superior fiber quality) within modern cotton varieties. Small segments of each genome were compared to both Pima and the reference cotton genome. 

Segments that matched Pima more closely than the reference cotton were classified as potential introgressions, suggesting Pima DNA had been incorporated into the modern cotton's genetic makeup. Knowledge of these Pima introgressions will help breeders to efficiently select progeny with these fiber-quality linked genetic markers in their breeding programs.  
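
A toy version of this window-based comparison might look like the sketch below; the window counts, thresholds and helper names are hypothetical and stand in for the study's actual genome-wide scan.

```python
# Toy sketch of a window-based introgression scan: for each genomic window,
# count variant sites where the cultivar allele matches Pima (G. barbadense)
# rather than the upland reference, and flag Pima-like windows as candidate
# introgressions. Inputs and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Window:
    chrom: str
    start: int
    end: int
    matches_pima: int       # variant sites where the cultivar matches the Pima allele
    matches_reference: int  # variant sites where it matches the upland reference allele

def candidate_introgressions(windows, min_sites=20, pima_fraction=0.8):
    """Return windows whose variant sites are predominantly Pima-like."""
    flagged = []
    for w in windows:
        total = w.matches_pima + w.matches_reference
        if total >= min_sites and w.matches_pima / total >= pima_fraction:
            flagged.append(w)
    return flagged

# Example with made-up counts for three 100 kb windows on one chromosome.
windows = [
    Window("A08", 0, 100_000, matches_pima=45, matches_reference=5),
    Window("A08", 100_000, 200_000, matches_pima=3, matches_reference=60),
    Window("A08", 200_000, 300_000, matches_pima=30, matches_reference=10),
]
for w in candidate_introgressions(windows):
    print(f"Possible Pima introgression: {w.chrom}:{w.start}-{w.end}")
```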

“Leveraging relatively inexpensive low-pass sequencing alongside these genomes empowers breeders to select progeny rapidly,” says Sreedasyam. “This will not only save time but also reduce costs associated with traditional fiber phenotyping, a laborious process usually requiring hundreds to thousands of samples per breeding cycle.” 

These findings highlight the significance of using detailed genome assemblies to uncover genetic variations that can improve cotton breeding programs. The more these new, high-quality genomes are used for comparative studies, the more information about economically important cotton traits will emerge. The genomic resources described in this study represent a valuable addition to the cotton breeding toolkit and will reap benefits for years to come. 

Collaborators on this project include Don C. Jones, Cotton Incorporated, NC; Peng W. Chee, University of Georgia, Tifton, GA; Warwick N. Stiller, CSIRO, Cotton Research Unit, Australia; and Fred Bourland, University of Arkansas, Keiser, AR.

This work is supported by Cotton Incorporated (Award 18-753) and the intramural research program of the US Department of Agriculture, National Institute of Food and Agriculture, Foundational and Applied Science Program Award 2022-67013-36899. The findings, conclusions, or recommendations expressed here have not been formally disseminated by the US Department of Agriculture and should not be construed to represent any agency determination or policy.

Byline: Sarah Sharman, PhD

 

Statisticians call for rigour and transparency in the evaluation of diagnostic tests



UNIVERSITY OF BIRMINGHAM





Recommendations designed to reframe the evaluation of in vitro diagnostic (IVD) tests have been published today by the Royal Statistical Society in its Series A journal.

The report, which will be submitted to the UK Covid-19 Inquiry, is intended to help prevent future scenarios in which IVDs are marketed widely, but later attract serious concerns about the standards applied to their evaluation.

The research was prompted by concerns about the standards applied to the evaluation of diagnostic tests during the Covid-19 pandemic – particularly lateral flow tests. However, the recommendations cover all new tests, especially those designed to detect infectious diseases.

The report is also being presented at the Evidence Based Early Diagnosis conference at St Andrews.

A working group of statisticians, co-chaired by Professor Jon Deeks of the University of Birmingham and former RSS President Professor Deborah Ashby of Imperial College London, collaborated on the report with co-authors from the Universities of Oxford, Cambridge, Edinburgh and Birmingham and the London School of Hygiene and Tropical Medicine.

The RSS Working Group on Diagnostic Tests set out 22 recommendations designed to ensure that IVD tests – which typically test samples of fluids such as blood, urine or saliva – are statistically robust and fit for purpose. The working group grouped them into study-design matters (10 recommendations), regulation matters (6) and transparency matters (6).

Jon Deeks, Professor of Biostatistics at the University of Birmingham, said:

“The Covid-19 pandemic provided a microcosmic insight into inadequacies in current processes to evaluate and regulate diagnostic tests. It’s important that we learn from these failures and establish robust processes that can be applied broadly across diagnostic tests.”

The report covers three areas of diagnostic testing: study-design of evaluations; regulation of tests; and transparency of test evaluations.

Key recommendations included:

  • Evaluation needs to take into account each specific intended use of the test, including the person being tested, the target condition and even the facilities where the testing will be done (see the illustrative sketch after this list). Field or clinical evaluation studies should be carried out for each intended use.
  • Direct comparison of alternative IVDs and testing strategies should be available to inform clinical and public health decision-making.
  • The Medicines and Healthcare products Regulatory Agency (MHRA) should collaborate with independent experts to revise the national licensing process for IVDs, to ensure public safety is protected.
  • Protocols and reports for test evaluations should be publicly available to ensure transparency in all planning and decision-making.
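
As an illustration of the statistical reasoning behind the first recommendation above (this example is ours, not the report's): with sensitivity and specificity held fixed, the probability that a positive result is a true positive falls sharply as disease prevalence falls, so a test evaluated in symptomatic patients may perform very differently when used for mass screening.

```python
# Illustrative sketch (not from the report): why a test's usefulness depends
# on its intended use. The same sensitivity and specificity give very
# different positive predictive values at different disease prevalences,
# e.g. symptomatic testing versus community-wide screening.
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive result is a true positive."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical performance figures for a lateral-flow-style test.
sens, spec = 0.80, 0.999
for prevalence in (0.20, 0.01, 0.001):   # symptomatic clinic vs mass screening
    ppv = positive_predictive_value(sens, spec, prevalence)
    print(f"Prevalence {prevalence:>6.1%}: PPV = {ppv:.1%}")
```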

The publication of the report is relevant to the opening of the ‘Test, Trace and Isolate’ module of the UK Covid-19 Inquiry. It also coincides with the MHRA’s recently launched consultation on improved safety for high-risk diagnostic devices.

Professor Sheila Bird, of the MRC Biostatistics Unit at the University of Cambridge, said:

“Past Royal Statistical Society Working Party reports on matters which affect the public health have had enduring impact. Official Statistics – Counting with Confidence led to the UK Statistics Act of 2007; Statistics and Statisticians in Drug Regulation led to the appointment of professional statisticians by the UK and, later, the European drug regulator; Statistical Issues in First-in-Man Studies led to safety-enhanced study designs with open protocols. I hope that this month’s consultation by the MHRA is indicative that Diagnostic Tests is making its mark already.”

Dr Andrew Garrett, President of the Royal Statistical Society, said:

“The report provides a thorough evaluation of both diagnostic tests and diagnostic testing.  It addresses how to develop, regulate, and use diagnostic tests in the future – a subject that is of increasing importance to individual and public health.”

 

EU citizens feel increasingly European


Research reveals that the sense of European identity among inhabitants of most EU countries has increased in the past 15 years


UNIVERSITEIT VAN AMSTERDAM





The sense of a European identity has increased among inhabitants of the European Union in the past 15 years – in spite of crises like Brexit and the Eurozone crisis. That is the conclusion of professor of European Studies Theresa Kuhn in a recent publication. ‘The euro and open borders have made the EU tangible.’

With the European elections just around the corner, one question is more relevant than ever: to what extent do the inhabitants of Europe feel a sense of connection with the European Union? And where does that come from? ‘We wanted to map out how European identity has evolved over the years,’ says Theresa Kuhn, ‘but our research into this was severely limited by the types of opinion polls that we usually consult for our research. Most polls don’t date back very far and usually only one type of question was asked on the subject.'

In order to rectify this problem, Kuhn and her team combined various opinion polls from dozens of countries over a period of 41 years. ‘We subsequently applied a calculation to this, as a result of which we now have information about the development of European identity since the 1980s.’

Crisis has a reinforcing effect

The research reveals that the sense of European identity among inhabitants of most EU countries has increased in the past 15 years. ‘That surprised us to be honest’, says Kuhn. ‘The past two decades were marked by crises, not only externally, but also internally, such as Brexit and the Eurozone crisis. You might expect that this would make people want to distance themselves from the European Union, but that does not appear to be the case. One explanation for this could be that people are more inclined to adhere to a group as a result of a crisis. People feel threatened and are more likely to surround themselves with people whose views align closely with their own.’

The fact that an entire generation has only ever known the European Union to be part of Europe may be another explanation for this according to Kuhn. ‘This group of people have grown up in an era in which there are open borders and many countries have the euro as currency unit. These things have also made Europe something tangible, as a result of which people have been able to experience the EU, rather than it being an abstract institution.'

It is the case, however, that there are major differences in the development of European identity from country to country. ‘In Northern and Western European countries, the sense of a European identity is usually more prevalent, while South and Central Europe exhibit a more diverse development. For example, you see that Italians have become increasingly less pro-European over the years, whereas Spain has actually experienced growth in that regard.’

Euroscepticism

A lot of Eurosceptic parties are poised to make gains according to polls for the upcoming European elections. In Kuhn’s opinion, however, that does not mean that people also feel less European. ‘A clear differentiation needs to be made between the perception of a European identity and support for the European Union. Someone may feel European, but not agree with the current policy. The opposite may also be true.’

She also argues that the popularity of the Eurosceptic parties does not automatically mean that people are less pro-Europe at present. ‘Eurosceptic voters have most likely existed since the 1950s. However, they didn’t have any way yet to express that electorally, because almost all parties were pro-European at national level. Parties have only decided to make this an issue since the 1990s.’

In her research, Kuhn argues that the European Union should do even more to strengthen a sense of European identity. ‘Many important decisions are taken at European level. That’s why it is important for the democratic legitimacy of the EU that a significant proportion of Europeans also feel connected to Europe. In addition, research shows that people who identify as European are less likely to vote for populist parties and more likely to show solidarity with other Europeans.’

Report

'A Community of Fate: Growing European Identity in Times of Polycrisis. Exploring Long-Term Trends of European Identity Across the European Member States', by Theresa Kuhn, Armin Seimel and John M. Michaelis, University of Amsterdam.

 

Researchers discover that a type of childhood leukaemia originates during foetal development


JOSEP CARRERAS LEUKAEMIA RESEARCH INSTITUTE
Image: Pablo Menéndez, Talía Velasco and Oscar Molina, authors of the study and researchers at the Josep Carreras Leukaemia Research Institute. (Credit: Josep Carreras Leukaemia Research Institute)





Acute myeloid leukaemia is the second most common type of acute leukaemia in childhood and can be diagnosed within a few months of life. The early onset of the disease had led to the suspicion that the tumour could have a prenatal origin. However, proving this theory has been challenging due to the lack of prenatal or birth samples.

‘The opportunity to study the origin of this leukaemia arose from the case of a five-month-old baby diagnosed with acute myeloid leukaemia at the Hospital Niño Jesús in Madrid,’ explains Pablo Menéndez, ICREA professor at the University of Barcelona and the Josep Carreras Institute. ‘The parents, who had preserved the umbilical cord blood, opened a line of research that until now had not been possible to address,’ adds the researcher.

Using precision medicine techniques, the researchers analysed the complete genome of the tumour. Unlike tumours in adults, where thousands of mutations are detected, only two chromosomal alterations were identified in this leukaemia. ‘Genome analysis allowed us to design a personalised diagnostic method to monitor the disease,’ says Xose S. Puente, Professor of Biochemistry and Molecular Biology at the University of Oviedo. ‘But these data raise new questions, such as when the tumour arose and in what order these mutations appeared,’ he highlights. These questions are difficult to answer, since such research requires blood samples taken from the baby before diagnosis, something that is impossible in the vast majority of cases. In this particular case, however, the existence of a frozen umbilical cord sample allowed the researchers to separate different populations of blood cells at birth and to study whether any of the chromosomal alterations detected in the tumour were already present during foetal development.

The study revealed that a translocation between chromosomes 7 and 12 was already present in some haematopoietic stem cells in the umbilical cord. In contrast, the other chromosomal alteration, a trisomy of chromosome 19, was not present in the foetus but was found in all tumour cells, suggesting that it contributes to increasing the malignancy of the leukaemic cells. ‘These data are highly relevant for understanding the development of a devastating disease, and the existence of this umbilical cord sample was crucial for conducting a study that had been impossible until now in acute myeloid leukaemia,’ adds Talía Velasco, researcher at the Josep Carreras Institute and the University of Barcelona and co-leader of the study.

In addition to reconstructing the genomic alterations that the cells undergo to generate this leukaemia, the study also identified a molecular mechanism not previously observed in this type of leukaemia, which causes the activation of a gene called MNX1 that is frequently altered in this type of tumour. Knowledge of these alterations is essential for developing cell and animal models that allow researchers to understand the disease’s evolution and develop new therapies for treating these pathologies.

The study has been led by Xose S. Puente, Professor of Biochemistry and Molecular Biology at the University of Oviedo-IUOPA, Talía Velasco and Pablo Menéndez, from the Josep Carreras Institute and the University of Barcelona, with participation from researchers from four other institutions, including the Hospital Infantil Universitario Niño Jesús, the Hospital Universitario Central de Asturias, the Instituto de Biomedicina y Biotecnología de Cantabria and the Instituto de Investigación Sanitaria La Princesa de Madrid.

This research has been made possible thanks to the collaboration of the parents and funding from the Ministry of Science, Innovation and Universities, the European Research Council, the AECC Scientific Foundation, the Unoentrecienmil Foundation, the “La Caixa” Foundation, the Government of Catalonia, CIBERONC and the Carlos III Health Institute.