Monday, August 11, 2025

 

Parents' incomes drop when their child is diagnosed with type 1 diabetes




Uppsala University
Image: Beatrice Kennedy, physician at the Endocrine and Diabetes unit at Uppsala University Hospital and Associate Professor of Medical Epidemiology at Uppsala University. Credit: Danish Saroee

Parents of children diagnosed with type 1 diabetes suffer an income drop in the years following the diagnosis. The impact is more pronounced in mothers, especially mothers of children diagnosed in preschool years. These novel findings from a study led by researchers at Uppsala University, Sweden, have now been published in Diabetologia, the Journal of the European Association for the Study of Diabetes (EASD).

Type 1 diabetes is a chronic autoimmune disease that requires daily insulin treatment and continual blood sugar monitoring. In Sweden, more than a thousand children are diagnosed with type 1 diabetes yearly. The parents of these children shoulder the main responsibility for treatment and monitoring, at home as well as in school settings. Previous research has shown that parents of children with type 1 diabetes are at increased risk of stress-related symptoms and may need to reduce their working hours.

“In our study, we observed reduced parental work-related incomes in the years following the child’s type 1 diabetes diagnosis. The drop was larger in mothers than in fathers. Since mothers earned significantly less than fathers in absolute terms, even before the child fell ill, the relative drop in mothers was 6.6% the year following diagnosis compared to 1.5% in fathers. We further note the greatest impact on work-related incomes in mothers of children diagnosed at preschool age,” says Beatrice Kennedy, physician at the Endocrine and Diabetes unit at Uppsala University Hospital and Associate Professor of Medical Epidemiology at Uppsala University, who led the study. 

Builds on data from 26,000 parents of children with type 1 diabetes

The research project was an interdisciplinary collaboration across departments at Uppsala University, including the Centre for Health Economics Research, together with the University of Gothenburg. It builds on data from national population and health registers and the Swedish Child Diabetes Register (Swediabkids). The study includes the parents of more than 13,000 children diagnosed with type 1 diabetes in Sweden in 1993–2014, as well as more than half a million parents in the general population whose children have not been diagnosed with diabetes.

The researchers observed that the maternal pension-qualifying incomes (a composite outcome including work-related income and societal benefits) initially increased after the child’s diagnosis. This was attributable to mothers applying for the parental care allowance from the Swedish Social Insurance Agency. The parental care allowance was intended to compensate for disease-related loss of work-related income and contribute toward disease-specific costs.

Need for increased support to parents of children with chronic disease

When the research team investigated long-term effects in mothers, they found that the pension-qualifying incomes gradually decreased after eight years, and had not recovered by the end of follow-up, 17 years after the children were diagnosed.

“In this study, we have focused on the income effects in parents of children diagnosed with type 1 diabetes. However, parenting children with other chronic childhood-onset conditions may entail similar income consequences. Our findings indicate that there is room for improvement in the targeted societal support for mothers of children with chronic conditions, to ensure that the financial impact of caring for a child with health concerns is alleviated,” says Tove Fall, Professor of Molecular Epidemiology at Uppsala University, who initiated the project. 

Proving presence in a distrustful world: How an Estonian researcher’s work could transform digital truth




Estonian Research Council
Image: Eduardo Brito. Credit: University of Tartu

The work of University of Tartu intersectoral doctoral researcher Eduardo Brito could reshape how we verify truth in the digital world. His research on a decentralised Proof-of-Location system, recently published in Scientific Reports, offers a new way to cryptographically prove where and when something happened, without compromising privacy or relying on centralised authorities.

In an era of synthetic media, GPS spoofing, and rising digital distrust, how can we prove that something truly happened in a specific place at a specific time? In 2023, millions of people and even media outlets like Time, Fox Business and CBS News shared AI-generated photos of Pope Francis wearing a white Balenciaga puffer jacket. This forced the Pope himself and the Vatican to publicly comment on the dangers of AI-generated deepfakes, referencing how such tools can distort reality and truth.

In the near future, it might be much harder to fool people with manipulated media. In his recent article published in Scientific Reports, Eduardo Brito, an intersectoral doctoral researcher at the University of Tartu Institute of Computer Science, in collaboration with Cybernetica AS, introduces the world’s first unified, system-level architecture for decentralised Proof-of-Location systems, capable of cryptographically verifying a person’s or device’s physical presence at a specific time and place.

What is Proof-of-Location?

Proof-of-Location (PoL) is a system that generates a secure, verifiable digital proof of physical presence. "Imagine someone says, 'I was at this place at this time.' PoL makes it possible to verify that claim without simply trusting their word or revealing unnecessary private data. It’s a secure, verifiable stamp of presence that others can check without relying on a single company or government," Brito explains.

While GPS and Wi-Fi triangulation can estimate location, they fall short when it comes to tamper-resistance, privacy, and decentralisation. Brito's model uses a network of nearby devices, called witnesses, to co-sign location claims. Each witness cryptographically attests to the presence of a device (the prover) at a certain time and place. These attestations can then be verified by third parties without involving a centralised authority. 
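
As a rough illustration of the witness mechanism described above (not Brito's actual protocol, which adds further consensus and privacy machinery), the following Python sketch shows a prover gathering co-signatures from nearby witnesses and a third party checking them against a quorum. All class and field names are hypothetical, and the example assumes the third-party `cryptography` package for Ed25519 signatures.

```python
# Minimal, illustrative sketch of witness co-signed location claims.
# Class and field names are hypothetical; the system described in the paper
# adds consensus and privacy mechanisms that are omitted here.
import json
import time
from dataclasses import dataclass, field

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


@dataclass
class LocationClaim:
    prover_id: str
    place: str        # e.g. a coarse region or geohash rather than raw coordinates
    timestamp: float

    def encode(self) -> bytes:
        return json.dumps(self.__dict__, sort_keys=True).encode()


@dataclass
class Witness:
    key: Ed25519PrivateKey = field(default_factory=Ed25519PrivateKey.generate)

    def attest(self, claim: LocationClaim) -> bytes:
        # A real witness would first check physical proximity to the prover
        # (e.g. via short-range radio) before signing; that step is omitted.
        return self.key.sign(claim.encode())


def verify(claim: LocationClaim,
           attestations: list[tuple[Ed25519PublicKey, bytes]],
           quorum: int = 3) -> bool:
    """A third party accepts the claim if enough independent witnesses signed it."""
    valid = 0
    for public_key, signature in attestations:
        try:
            public_key.verify(signature, claim.encode())
            valid += 1
        except InvalidSignature:
            pass
    return valid >= quorum


# Usage: a prover gathers co-signatures from nearby witnesses.
claim = LocationClaim("device-42", "geohash:u4pruyd", time.time())
witnesses = [Witness() for _ in range(3)]
attestations = [(w.key.public_key(), w.attest(claim)) for w in witnesses]
print(verify(claim, attestations))  # True if the quorum of signatures checks out
```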

When photos lie

Brito's inspiration came from the growing sophistication of AI-generated media. "It made me wonder: if we can no longer trust our eyes, how do we protect the truth in the digital world?" That concern soon extended to physical-world events, like presence and location. The result is a PoL system with wide-reaching applications:

  • Content authentication: Verifying the time and place where photos or videos were captured
  • Supply chain tracking: Confirming the integrity and location of goods during transit
  • Digital alibis: Supporting legal disputes or compliance audits
  • Civic participation: Enabling secure, anonymous presence verification for voting

 

Resisting misuse

Unlike centralised systems, Brito’s decentralised PoL architecture includes several safeguards to prevent misuse – such as involuntary witnessing or mass surveillance. “We can’t make misuse impossible, but we sure can make it much harder,” Brito notes.

First, cryptographic consent through consensus means that devices must opt in before participating as witnesses; without explicit consent, they cannot sign or contribute to proofs. Second, proofs reveal only what is necessary for verification – no raw data that could expose bystanders is shared. Third, the decentralised trust model ensures that there is no single authority collecting or storing all proofs, and no central database that could be exploited for surveillance. The system is not built to rely purely on policy or goodwill; it has been designed from the ground up to make misuse technically more difficult.
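
Two of these safeguards (explicit opt-in before witnessing, and sharing only a commitment to the precise location rather than the raw coordinates) can be sketched in a few lines of Python. This is an illustrative simplification with invented names, not the system's actual consent or disclosure mechanism.

```python
# Illustrative sketch of two safeguards: explicit opt-in before witnessing,
# and sharing only a salted commitment to the precise location instead of
# raw coordinates. Names and structure are invented for illustration.
import hashlib
import os


class ConsentingWitness:
    def __init__(self, opted_in: bool = False):
        self.opted_in = opted_in

    def attest(self, commitment: bytes) -> bytes:
        if not self.opted_in:
            # Devices that never opted in cannot sign or contribute to proofs.
            raise PermissionError("witness has not opted in")
        # Stand-in for a real cryptographic signature over the commitment.
        return hashlib.sha256(b"attested:" + commitment).digest()


def commit_location(lat: float, lon: float) -> tuple[bytes, bytes]:
    """Commit to precise coordinates; only the hash is shared with verifiers."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + f"{lat:.6f},{lon:.6f}".encode()).digest()
    return digest, salt  # the prover keeps the salt for any later selective disclosure


commitment, _salt = commit_location(58.3776, 26.7290)
print(ConsentingWitness(opted_in=True).attest(commitment).hex()[:16])
```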

 

Challenges in adoption

Despite their potential, decentralised PoL systems may face resistance on multiple fronts. "Legacy systems are deeply embedded and expensive to replace," Brito says. Entities that profit from controlling location data, such as large tech firms or surveillance-heavy governments, may be reluctant to adopt decentralised alternatives.

He anticipates early adoption in content authentication and logistics, where unreliable location data already causes real-world problems. Civic uses and infrastructure auditing may follow once regulatory frameworks mature.

Looking ahead

The research team is now scaling prototypes and testing real-world deployments. Initial conversations with governments, NGOs, and private companies are underway, exploring use cases in journalism, digital asset verification, and critical infrastructure.

"Five years from now, I see PoL quietly integrated into many of the systems we already use," Brito says. "It might support trust in digital media without most users even noticing. In law, it could become an accepted form of evidence. And in crises, from natural disasters to conflicts, it could help verify on-the-ground information quickly and reliably."

Open-sourcing the protocol is part of the roadmap, ensuring that transparency and community scrutiny remain central to the project.

In a world increasingly shaped by digital uncertainty, Brito's decentralised Proof-of-Location offers a promising way forward – one built not on blind trust, but on cryptographic truth.

 

How agile is your crypto? New interview study explores the opportunities and challenges of cryptographic update processes




CISPA Helmholtz Center for Information Security
Image: A qualitative interview study by CISPA researchers explores the possibilities and challenges of cryptographic update processes. Credit: CISPA

If you think of software as a building, you might say it’s made up of code blocks. Many of these building blocks are custom-built for a specific application; others arrive on site pre-cast, because they’re standard components and used in many buildings. Cryptographic algorithms and functions are a prime example of this. They’re built into software to encrypt data and communication flows. But as these building blocks age and become brittle, the security of the entire application degrades. In a qualitative interview study with 21 international participants, CISPA researcher Alexander Krause explored the challenges faced by experienced software developers when they want to renew existing crypto implementations—or even create better cryptographic building blocks from scratch. The CISPA study will be presented on August 14, 2025, at the USENIX Security Symposium in Seattle, USA.

Crypto Agility—or: Why does crypto become outdated?

Cryptographic algorithms are fundamental building blocks in the development of new applications. They ensure that data and information can be communicated in encrypted form, reliably protected from the prying eyes of unauthorized third parties. Unlike most other code sequences, certain cryptographic implementations lose their effectiveness over time: As other technological fields advance, for example if computers significantly gain processing power, asymmetric encryption can potentially become vulnerable. Quantum computing is a textbook example of this, as CISPA researcher Alexander Krause explains: “If connections are encrypted with TLS, those data streams can’t be decrypted yet—but it’s very likely that this will be possible in the future. Quantum computers will be able to compute far more efficiently, because they’re not restricted to the binary states 0 and 1, but can work with superpositions of both states simultaneously.” Computing with superpositions enables quantum machines to solve certain mathematical problems much faster, and to use new, more efficient algorithms that aren’t available on “conventional” computers. 

Updating cryptographic implementations is thus a recurring task—and one with far-reaching implications for software users. If crypto updates go awry, the consequences for overall software security can be severe. In this context, Krause refers to the concept of “crypto agility”: “This recurring update process for cryptographic implementations ideally begins with something called ‘crypto agility.’ It means that when developers are designing a piece of software, they keep in mind from the outset that they may need to replace or update the cryptographic implementation at some point in the future.” Thinking ahead in this way is meant to make it easier to update the software later on with state-of-the-art cryptographic methods. However, executing crypto updates requires highly specialized knowledge that many software developers do not possess. 
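
The study does not prescribe a particular design, but crypto agility is commonly realised by hiding the concrete algorithm behind a stable interface so that it can be swapped later without touching calling code. A minimal Python sketch, with invented names and assuming the third-party `cryptography` package is installed:

```python
# Minimal sketch of a crypto-agile design: application code depends on an
# abstract interface, and the concrete algorithm is resolved via a registry,
# so it can be swapped later without touching callers. Names are illustrative.
from abc import ABC, abstractmethod


class Signer(ABC):
    @abstractmethod
    def sign(self, data: bytes) -> bytes: ...


class Ed25519Signer(Signer):
    def __init__(self):
        # Assumes the third-party "cryptography" package is installed.
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        self._key = Ed25519PrivateKey.generate()

    def sign(self, data: bytes) -> bytes:
        return self._key.sign(data)


# When Ed25519 eventually needs replacing (for example by a post-quantum
# scheme), only the registry entry changes; callers keep using get_signer().
SIGNER_REGISTRY = {"default": Ed25519Signer}


def get_signer(name: str = "default") -> Signer:
    return SIGNER_REGISTRY[name]()


signature = get_signer().sign(b"hello")
```

The point of the pattern is that application code never names the algorithm directly, which is what makes a later swap tractable.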

Crypto libraries require maintenance

Cryptographic implementations tend to come from publicly accessible, free crypto libraries that are maintained by specialized developer communities. These open-source projects, which benefit developers around the world, are usually supported by just a handful of individuals who contribute their time on a volunteer basis. “It’s a fundamental principle of software development that developers reuse existing components unless they need a customized solution”, says Krause. “This also means that I don’t write a new implementation for a cryptographic standard from scratch each time—I import a library in the programming language I’m using that already provides the required function.” While reusing existing algorithms and functions makes for efficient programming, it also introduces unique security risks where cryptography is concerned. If crypto libraries are not properly maintained and bugs go unfixed, those vulnerabilities can proliferate across a wide range of applications. In the context of the “supply chain” (the web of dependencies a software project has on external resources), this creates what’s known as a “single point of failure.” If a crypto library is not reliably maintained, it can jeopardize the functionality of all products that rely on it within the supply chain.
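
For instance, rather than implementing an authenticated-encryption scheme by hand, a developer would typically import one from a maintained library. A minimal sketch using the widely used third-party `cryptography` package (assumed to be installed):

```python
# Reusing a maintained crypto library instead of writing the primitive yourself.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()                 # symmetric key handled by the library
f = Fernet(key)
token = f.encrypt(b"confidential payload")  # authenticated encryption
assert f.decrypt(token) == b"confidential payload"
```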

How do you recruit expert populations for a study? With hard work

Conducting a qualitative interview study with 21 participants, Alexander Krause and his CISPA colleagues have explored the challenges that software developers, who usually aren’t crypto experts themselves, face when updating cryptographic implementations. Their goal was to find answers to four narrowly defined research questions: How do developers learn about a recommended crypto update? What goals do they pursue with the update? What processes do they follow when planning and executing a crypto update? And finally, what experiences did they gain when carrying out those updates? “There’s already a lot of research on updating software projects in general”, says Krause. “But here, we wanted to explore whether expert populations with highly specialized knowledge have unique requirements, too.”

Recruiting participants for the study was a major challenge. “It was tough to gather those 21 developers—it took a lot of effort”, Krause explains. “We only included experienced developers, and we assessed their experience based on the contributions they had already made to software projects.” In addition to reaching out through their professional network, the researchers posted their call for participants on Upwork and contacted many other potential candidates via email. The email recruitment was especially time-consuming, as it required extensive online research to find publicly available contact information for suitable participants. Krause estimates the response rate for the email campaign was only about 1 percent. “People took part in the study for different reasons”, he summarizes. “Some were intrinsically motivated because they saw the research as important and wanted to support it. Others felt personally acknowledged—they said: ‘Oh, you looked at my GitHub code and my project. It’s great that you took notice of my work.’”

Heterogeneous results: Crypto updates are context-dependent

One of the key findings of the interview study is that the information flow around recommended crypto updates is inconsistent and sometimes incomplete. Updates were primarily triggered by information that developers received through sources like blogs, social media, and GitHub. However, depending on their institutional affiliation, some developer groups are more likely to receive information about updates than their colleagues. “If you work for a large company, there are often agreements. They often receive advance notice of vulnerabilities and can be the first to patch them—for example as part of a disclosure process. This information is passed on through private mailing lists that only a few people have access to”, Krause summarizes. “A big takeaway for us was how hard it is to get into these communities. Someone who wants to get started now, how do they get connected? How do they get onto one of these lists?” 

The interview study also revealed that there are rarely established, structured processes for managing crypto updates in companies or projects. Prioritization of such updates sometimes depended on available resources such as team size. Decision-making processes and responsibilities around crypto updates were also at times unclear. “That was a negative surprise for us”, Krause says. “Who decides who’s responsible for a crypto update? This varied a lot. Sometimes there actually were leaders assigned to it. In other cases, it was, ‘You just discovered yourself that there is this vulnerability, so it’s your job to fix it.’” As one of their key research contributions, the researchers have outlined such an update process, consolidating the heterogeneous statements made by the participants. Their multi-step process assigns the three stakeholder groups (internal, external, and end users) to six phases: trigger, goals, planning, execution, quality assurance, and rollout.
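
The consolidated process itself is defined in the paper; purely as an illustration of its structure, the sketch below encodes the six phases and three stakeholder groups named above. Everything beyond those names (the types and the ordering helper) is invented for the example.

```python
# Rough encoding of the consolidated update process described above.
# Only the phase and stakeholder names come from the article; the types
# and the ordering helper are invented for illustration.
from enum import Enum
from typing import Optional


class Phase(Enum):
    TRIGGER = 1
    GOALS = 2
    PLANNING = 3
    EXECUTION = 4
    QUALITY_ASSURANCE = 5
    ROLLOUT = 6


STAKEHOLDER_GROUPS = ("internal", "external", "end users")


def next_phase(current: Phase) -> Optional[Phase]:
    """Return the following phase, or None once the rollout is complete."""
    members = list(Phase)
    idx = members.index(current)
    return members[idx + 1] if idx + 1 < len(members) else None


print(next_phase(Phase.TRIGGER))  # Phase.GOALS
```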

Other study results turned out to be more positive and more predictable for the research team, such as the motivations behind implementing cryptographic updates. “We were positively surprised overall that many developers are intrinsically motivated to ensure their software is future-proof”, Krause explains. In addition, preventive updates were performed to gain a security edge over future threats. Feedback was also fairly consistent regarding the perception that crypto updates are onerous and complex. Krause summarizes: “All our participants had very individual backgrounds and very individual projects, but overall, what makes updating crypto difficult is that you need the knowledge to do it—and at the end of the day many don’t have that. We expected this, since it’s the case in many areas of IT security, not just in the area of cryptographic implementations.”

Networking is key: A gap between research and practice

The question of how this knowledge gap could be closed in the interest of IT security continues to occupy Alexander Krause. “Crypto updates will remain a challenge going forward. But we see that people often lack the necessary education to manage them. The biggest challenge that we see—and this extends beyond our paper to crypto research more broadly—is translating new research findings into a format that actually reaches developers.” While gaining access to the relevant mailing lists is often difficult, the responses from the interview study have shown that software developers rarely use academic publication databases to stay informed about new developments. “In our study, those with a higher academic degree—a master’s or PhD—had an advantage here, because they bring the necessary skillset”, Krause explains. Ultimately, obtaining relevant information still largely depends on the personal initiative of individual developers. In this respect, there is a clear gap between research and practice that needs to be bridged—as there is very little overlap between the conferences important for scientific discourse and the trade fairs relevant to developer communities. The CISPA researchers have already made their findings available to all developers who participated in the interview study. They will also present their study to the scientific community on August 14, 2025, at the USENIX Security Symposium in Seattle, USA.

 

Oxford study outlines new blueprint to help tackle the biodiversity impacts of farming




University of Oxford





A study led by researchers at the University of Oxford, working closely with colleagues from Duurzame Zuivelketen (DZK), among others, has developed a framework to help agricultural sectors better contribute to global biodiversity targets without causing unintended harms.

Published today (11 August) in npj Biodiversity, the study is based upon data from the Dutch dairy sector in 2020, covering nearly 8,950 farms (approximately 1.6 million cows). The researchers first established a single combined score to track biodiversity impacts against possible sectoral targets, but found that while such a score can be helpful for tracking overall progress, it can mask important local impacts (such as nutrient pollution or habitat loss) that are poorly captured by existing biodiversity impact indicators.

To address this, the team developed a set of safeguards—clear, quantitative thresholds for major environmental pressures—to ensure progress in one environmental impact type does not come at the expense of other types elsewhere. If each of these safeguards were fully implemented, the researchers estimated this could take the sector most of the way towards biodiversity targets such as those outlined in the Global Biodiversity Framework.

Lead author Associate Professor Joseph Bull, from the University of Oxford’s Department of Biology, said: “Our study shows that – though they are extremely useful – relying on simplified, combined indicators to track agricultural impacts on biodiversity can mislead if used alone. By introducing scientifically grounded safeguards, we can ensure that improvements in one area don’t cause damage somewhere else.”

Measuring the biodiversity footprint of dairy farming

To understand where biodiversity losses were occurring, the team worked with a single composite index that translated environmental pressures - such as greenhouse gas emissions, land use change and ammonia emissions - into a common unit. This value reflects the proportion of species at risk of disappearing from a particular area.
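
The study's actual characterisation factors are not given here, but the general idea of translating different pressures into one common species-loss unit can be sketched as a weighted sum. All factor values and pressure totals in the Python sketch below are invented placeholders, not numbers from the paper.

```python
# Illustrative-only sketch of a composite biodiversity impact score: each
# pressure is converted into a common unit (here a notional "potentially
# disappeared fraction of species") via a characterisation factor and summed.
# All factor values and totals are invented placeholders, NOT study values.
PLACEHOLDER_FACTORS = {
    "ghg_emissions_t_co2e": 1.0e-7,    # per tonne CO2-equivalent (made up)
    "land_transformation_ha": 5.0e-5,  # per hectare converted (made up)
    "ammonia_emissions_kg": 2.0e-8,    # per kg NH3 (made up)
}


def combined_score(pressures: dict[str, float]) -> float:
    """Translate per-pressure totals into one combined impact value."""
    return sum(PLACEHOLDER_FACTORS[name] * amount for name, amount in pressures.items())


farm_totals = {
    "ghg_emissions_t_co2e": 1200.0,
    "land_transformation_ha": 3.5,
    "ammonia_emissions_kg": 8000.0,
}
print(f"combined impact score: {combined_score(farm_totals):.6f}")
```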

The analysis revealed that, for some measures, the majority of biodiversity harm was not happening on Dutch farms directly, but overseas, particularly where land is cleared to grow livestock feed ingredients. These imported feeds were responsible for the largest share of land transformation impacts, and consequently biodiversity loss.

By contrast, nutrient pollution, while a major political and environmental issue within the Netherlands, had a relatively modest effect on global biodiversity loss. This shows the value of using multiple criteria to target the most effective interventions on biodiversity.

Developing and applying safeguards

To avoid unintended outcomes when using single metrics to track biodiversity progress, the researchers proposed a set of practical safeguards. These act as checks and balances, ensuring that improvements in one area do not mask problems in another.

The safeguards fall into two main categories:

  • Impact prevention: These include thresholds for the use of imported animal feed, limits on nitrogen and ammonia emissions, and targets for maintaining permanent grassland and biodiversity-rich habitats. Together, these measures aim to reduce environmental pressures before damage occurs.
  • Impact compensation: For biodiversity losses that cannot be avoided, the study outlines to what extent the sector could offset impacts by restoring similar habitats in the same regions where damage has occurred. Safeguards here include ensuring compensation happens promptly and is maintained long term.
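
As a minimal illustration of how such quantitative safeguards could sit alongside a headline indicator, the sketch below checks a set of invented per-farm values against equally invented thresholds; the real safeguard definitions and values are those set out in the paper.

```python
# Hypothetical illustration of safeguard checks alongside a headline score:
# progress on the combined indicator only "counts" if no safeguard is breached.
# Threshold values and metric names are invented, not the study's safeguards.
MAX_IMPORTED_FEED_SHARE = 0.20        # ceiling on imported feed share (made up)
MAX_NITROGEN_SURPLUS_KG_HA = 100.0    # ceiling on nitrogen surplus (made up)
MIN_PERMANENT_GRASSLAND_SHARE = 0.70  # floor on permanent grassland (made up)


def safeguards_met(metrics: dict[str, float]) -> bool:
    """Return True only if every impact-prevention safeguard is respected."""
    return (
        metrics["imported_feed_share"] <= MAX_IMPORTED_FEED_SHARE
        and metrics["nitrogen_surplus_kg_ha"] <= MAX_NITROGEN_SURPLUS_KG_HA
        and metrics["permanent_grassland_share"] >= MIN_PERMANENT_GRASSLAND_SHARE
    )


print(safeguards_met({
    "imported_feed_share": 0.15,
    "nitrogen_surplus_kg_ha": 90.0,
    "permanent_grassland_share": 0.75,
}))  # True
```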

The approach was developed in collaboration with Dutch dairy stakeholders, including industry bodies and conservation organisations, and is tailored to align with national policies and international biodiversity goals.

Pathways to nature-positive outcomes

The study outlines three feasible transition pathways for the dairy sector, from offset-heavy ‘adaptive compensation’ to a ‘deep net positive’ approach driven primarily by prevention. Each pathway presents trade-offs, particularly between production levels, land use, and restoration potential.

Co-author Dr Joseph Poore (Department of Biology, University of Oxford) added: “This study was an interesting test of whether single biodiversity impact scores, that capture many pathways causing biodiversity loss, are practical to implement and reliable. Our results generally found that these indicators serve a very useful role and the science is advancing fast. Soon we will probably know the biodiversity impacts of every product we buy in the shops and the biodiversity impacts of businesses and their choices. While not the solution to these problems in itself, this is certainly a step towards solving them.”

Notes to editors:

For media enquiries and interview requests, contact Associate Professor Joseph Bull: joseph.bull@biology.ox.ac.uk

The study ‘Towards positive net outcomes for biodiversity, and developing safeguards to accompany headline biodiversity indicators’ will be published at 10 AM BST / 5 AM ET on Monday 11 August in npj Biodiversity at https://www.nature.com/articles/s44185-025-00095-5. To view a copy of the study under embargo before this time, contact Joseph Bull: joseph.bull@biology.ox.ac.uk

About the University of Oxford

Oxford University has been placed number 1 in the Times Higher Education World University Rankings for the ninth year running, and number 3 in the QS World Rankings 2024. At the heart of this success are the twin pillars of our ground-breaking research and innovation and our distinctive educational offer.

Oxford is world-famous for research and teaching excellence and home to some of the most talented people from across the globe. Our work helps the lives of millions, solving real-world problems through a huge network of partnerships and collaborations. The breadth and interdisciplinary nature of our research alongside our personalised approach to teaching sparks imaginative and inventive insights and solutions.

Through its research commercialisation arm, Oxford University Innovation, Oxford is the highest university patent filer in the UK and is ranked first in the UK for university spinouts, having created more than 300 new companies since 1988. Over a third of these companies have been created in the past five years. The university is a catalyst for prosperity in Oxfordshire and the United Kingdom, contributing around £16.9 billion to the UK economy in 2021/22, and supports more than 90,400 full time jobs.

The Department of Biology is a University of Oxford department within the Maths, Physical, and Life Sciences Division. It utilises academic strength in a broad range of bioscience disciplines to tackle global challenges such as food security, biodiversity loss, climate change, and global pandemics. It also helps to train and equip the biologists of the future through holistic undergraduate and graduate courses. For more information visit www.biology.ox.ac.uk.

Kunming-Montreal Global Biodiversity Framework advances protection of marine biodiversity




A new study finds that the global biodiversity targets adopted in 2022 are aligned with science and represent a major advance



University of Oldenburg

Image: In order to protect biodiversity in the oceans, it is necessary to observe and monitor species numbers and genetic diversity in ecosystems such as coral reefs. Credit: Jan-Claas Dajka

In 2022, numerous countries signed the Kunming-Montreal Global Biodiversity Framework (GBF) with the goal of halting and reversing biodiversity loss. Among other things, the framework sets out various targets to be implemented by the signatory states in their national legislation. A core component is the commitment to put at least 30 percent of the world's land and ocean areas under protection by 2030. In the area of marine biodiversity, an international team of scientists led by Dr Jan-Claas Dajka from the Helmholtz Institute for Functional Marine Biodiversity (HIFMB) at the University of Oldenburg, Germany, and Anne Eilrich from Kiel University, Germany, concludes that these new targets represent a considerable improvement on the Aichi Targets adopted in 2010. In a study published in the scientific journal Frontiers in Ecology and the Environment, the team reports that the updated targets address the inherent complexity of biodiversity well, thus reducing the risk of missing critical changes in marine biodiversity.

A major challenge in implementing conservation measures is that biodiversity has no single headline target comparable to the 1.5-degree limit in climate protection. "Biodiversity occurs at different levels, from genes to species to ecosystems," explains Dajka, the lead author of the study. According to experts, one of the reasons why the Aichi Targets adopted by the parties to the UN Convention on Biological Diversity in 2010 failed was that in many cases progress in achieving them was not measurable. With the adoption of the GBF in 2022, the international community agreed on a new strategy.

In the view of the international team of researchers led by Dajka and Eilrich, the Kunming-Montreal framework establishes robust, science-based targets that address all the different levels of marine biodiversity: "Our analysis shows that global policy is catching up with science in recognising the complexity of marine biodiversity," Dajka explains. The scientists write that the framework provides national governments with a solid roadmap for implementation of the targets.

These findings are the result of a systematic review of the relevant literature, including marine biodiversity research from the period between 2010 and 2020 and various political agreements. The goal was to determine which indicators scientists and policymakers use to track changes in marine biodiversity.

According to the study, both scientific research and policy generally take six classes of variables, known among scientists as Essential Biodiversity Variables (EBVs), as their reference. These include spatial species distribution, genetic diversity, physical species traits and ecosystem structure. Together, the six classes provide a comprehensive overview of all facets of biodiversity. "If policymakers are guided by this framework, they can ensure that none of the biodiversity levels are overlooked," says Dajka.

The authors of the study also found that secondary classes such as ecosystem structure or ecosystem function have gained importance, both in the Kunming-Montreal framework and in other global biodiversity frameworks. However, they point out that certain variables in these classes, such as the percentage of protected area, are of limited use as indicators of a healthy ecosystem. "If, for example, a large area of a coral reef is protected, this is a good thing. But if only one coral species lives there, the ecosystem is still very fragile," Dajka explains. The scientists caution against overemphasising secondary variables at the expense of foundational ones such as the number of species or genetic diversity within a species. The latter form the basis of ecosystems and are still crucial for tracking real progress and initiating conservation measures, they stress.

The researchers also see their study as a tool to support the arguments of the political institutions tasked with implementing the framework at the national level. The scepticism expressed by various voices regarding the framework's goals is partly unjustified, Dajka explains: "The targets are good. Our recommendation to governments is to now focus on translating them into appropriate national-level indicators and on implementing science-based measures."