Tuesday, April 25, 2023

Exploring a new frontier in healthcare technology: Non-fungible tokens for secure health data management in a post-pandemic world?

Managing health data as non-fungible tokens (NFTs) could give patients full ownership of personal health data, resulting in better patient care and research outcomes.

Peer-Reviewed Publication

SINGHEALTH

Singapore, 24 April 2023 – Digital transformation in healthcare has been greatly catalysed by the COVID-19 pandemic, which resulted in the accelerated adoption of digital health solutions such as telemedicine, remote monitoring and the Internet of Medical Things (IoMT), robotics and artificial intelligence (AI). This has caused a surge in the generation and flow of health data, which is expected to continue as healthcare providers and patients grow more accustomed to digital solutions, and as healthcare systems gear up with emerging technologies to tackle the challenges of the future.

With this increase in digital health data, a team of clinician innovators from SingHealth anticipate a growing need for privacy-preserving solutions to empower patients to take greater ownership of their health and to enhance the applications of data for medical research and clinical care purposes. In a recently published editorial piece in the prestigious journal Nature Medicine[1], the team explored the use of non-fungible tokens (NFTs) as a potential data management solution to bridge this gap and revolutionise data exchange in healthcare.

An NFT is a unique, irreplaceable digital data unit stored on a blockchain under a single owner, and it can be traded. Like the digital assets traded as commercial NFTs today, health data can be minted, exchanged and stored using blockchain technology, bearing the same features of uniqueness, transparency and interoperability. This means that patients will be able to own their personal health data and exchange it as digital assets with multiple stakeholders, such as healthcare providers, using the same blockchain technology. Similar to how cryptocurrencies are traded with mobile wallets, each patient can own, store and share their health data in the form of NFTs using a health wallet hosted on a secure web-based or smartphone application, making this mode of data management easily accessible, yet secure and private.

The key difference between existing commercial NFT marketplaces and a blockchain ledger dedicated to the exchange of health data is that the health data ledger can be programmed to disallow public viewing of its data. When patients need to share their health data with a healthcare provider, they can grant that provider access to view the required information. This preserves patient privacy, allowing only data owners – the patients themselves – to permit the access and sharing of their personal health data.
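To make this permission model concrete, here is a minimal sketch in Python of how an NFT-like health record with owner-controlled access might look. It is an illustration only: the class and method names (HealthRecordNFT, grant_access and so on) are assumptions for this example, not part of any system described in the Nature Medicine paper.

```python
# Minimal sketch (assumed names, not the authors' implementation) of an NFT-like
# health record: only a fingerprint (hash) of the data would sit on the ledger,
# and the patient controls who may read the underlying content.
import hashlib
from dataclasses import dataclass, field

@dataclass
class HealthRecordNFT:
    token_id: str
    owner: str                                                  # the patient
    data_hash: str                                              # fingerprint kept on the ledger
    _payload: dict = field(default_factory=dict, repr=False)    # sensitive data, kept off-chain
    _authorized: set = field(default_factory=set, repr=False)   # providers granted access

    @classmethod
    def mint(cls, token_id: str, owner: str, payload: dict) -> "HealthRecordNFT":
        digest = hashlib.sha256(repr(sorted(payload.items())).encode()).hexdigest()
        return cls(token_id, owner, digest, payload)

    def grant_access(self, requester: str) -> None:
        # In a real system only the owner's wallet could call this.
        self._authorized.add(requester)

    def read(self, requester: str) -> dict:
        if requester == self.owner or requester in self._authorized:
            return self._payload
        raise PermissionError("access not granted by the data owner")

# Example: a patient mints a record and shares it with one clinic only.
record = HealthRecordNFT.mint("nft-001", "patient_A", {"HbA1c": 6.1, "unit": "%"})
record.grant_access("clinic_B")
print(record.read("clinic_B"))     # permitted by the owner
```

In this sketch the sensitive payload stays off-chain and is released solely at the owner's discretion, which is the behaviour the editorial describes for a health-data ledger.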

Empowering Patients and Improving Care

Presently, patient data is kept safe and shared when necessary by institutions such as healthcare providers, research institutes, insurance companies and government bodies. Transitioning to the use of NFTs will require a paradigm shift in mindset for patients and caregivers. Managing health data as NFTs will give full ownership of personal health data to patients, entrusting them with the responsibility of storing the data and sharing it when necessary. This ensures accurate and complete health information from each individual, and empowers them to engage in their health journey more proactively, which has been shown to produce better healthcare outcomes in the long run.

Patient ownership of health data may also allow for greater fluidity of healthcare information. Currently, patient data is protected under strict data privacy rules. The use of NFTs will shift the onus of sharing individual patient data to each patient, thus fostering a closer relationship between the healthcare provider and the patient.  

Ensuring Data Authenticity for Better Research Outcomes

When personal health data is owned by patients, any unauthorised access and use of data stored in personal health applications and institutional databases can be mitigated, as the owner of every piece of data has to give permission before it is shared. This gives patients full autonomy over their personal health data and who they wish to share it with, for research or any other purposes.

In addition, sharing health data as NFTs ensures complete transparency and accuracy of healthcare research data, due to the traceable and unalterable nature of the blockchain. This means that researchers can be certain of the authenticity of data being used in their research, leading to greater data integrity and better research outcomes.

The same technology can also be applied to other areas of healthcare, such as pharmaceuticals, where every drug produced can be encoded and stored on a blockchain ledger. From the point of production to delivery to the end user, the drug can be tracked throughout the entire supply chain. This enables drug verification to prevent counterfeiting, and it also helps prevent the misuse of drugs by healthcare providers and patients.
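As a rough, hypothetical illustration of such supply-chain tracking (not a design taken from the paper), the Python sketch below appends each handling step of a drug unit as a hash-linked entry, so that tampering with any earlier step becomes detectable.

```python
# Rough illustration (assumed structure, not the paper's design) of a hash-linked
# provenance trail for a single drug unit: each supply-chain event references the
# hash of the previous one, so altering a past entry breaks the chain.
import hashlib, json, time

def add_event(chain, actor, action):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    event = {"actor": actor, "action": action,
             "time": time.time(), "prev_hash": prev_hash}
    event["hash"] = hashlib.sha256(
        json.dumps(event, sort_keys=True).encode()).hexdigest()
    chain.append(event)

def verify(chain):
    for i, event in enumerate(chain):
        body = {k: v for k, v in event.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        prev_ok = event["prev_hash"] == (chain[i - 1]["hash"] if i else "0" * 64)
        if event["hash"] != expected or not prev_ok:
            return False
    return True

trail = []
add_event(trail, "manufacturer", "batch released")
add_event(trail, "distributor", "received and shipped")
add_event(trail, "pharmacy", "dispensed to patient")
print(verify(trail))   # True; editing any earlier entry would make this False
```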

Dr Teo Zhen Ling, the lead author of the paper and Ophthalmology Resident, Singapore National Eye Centre said: “Using NFTs and blockchain technology to build a secure healthcare data exchange platform will greatly impact the way data is handled in both healthcare research and clinical pathways. At present, we see great potential for its application in areas such as clinical and pharmaceutical trials, where the ability to verify the authenticity of patient data is extremely vital to the accuracy of research findings. It will also enable us to ensure patient compliance in research trials where IoMT is being used to monitor and collect data on health activity and vital signs. Importantly, beyond research settings, the ability for patients to access and own their data supports patient autonomy and increases patients’ engagement in their own care.”

Associate Professor Daniel Ting, Director, Artificial Intelligence Office, SingHealth and Head, AI and Digital Innovation, Singapore Eye Research Institute, who is also the senior and corresponding author of the paper, said: “In this age of healthcare digitalisation and Industry 4.0, the generation and exchange of health data is expected to grow exponentially. From securely obtaining patient data for diagnosis and treatment, to the verification of the origin of massive data sets, strategic applications of blockchain technology, or other alternative privacy-preserving or enhancing technologies, in healthcare can bring about a stronger and safer infrastructure for health data management. Over time, it will also herald a paradigm shift in patient care and healthcare research as digital technologies and their applications continue to gain sophistication and become more broadly utilised in different industries.”

As with the introduction of any new technology, there are many important considerations to make and obstacles to overcome. Exploring the potential adoption of NFTs as an alternative privacy-preserving technology in healthcare is no different. These include assessing the ability to establish the proper technological infrastructure, such as a blockchain-enabled ‘biodata’ platform, as well as putting in place safeguards to ensure data security and mitigate risks such as theft of NFTs – which is not unheard of in the commercial NFT market. Nonetheless, NFTs in healthcare have many exciting potential benefits and could revolutionise the management of health data in time to come.

About Singapore Health Services (SingHealth)

SingHealth, Singapore’s largest public healthcare cluster, is committed to providing affordable, accessible and quality healthcare to patients. With a network of acute hospitals, national specialty centres, polyclinics and community hospitals offering over 40 clinical specialties, it delivers comprehensive, multi-disciplinary and integrated care. Beyond hospital walls, SingHealth partners community care providers to enable the population to keep well, get well and live well. As part of the SingHealth Duke-NUS Academic Medical Centre, SingHealth also focuses on advancing education and research to continuously improve care outcomes for patients. For more information, please visit: www.singhealth.com.sg

Members of the SingHealth group

Hospitals (Tertiary Specialty Care):

Singapore General Hospital, Changi General Hospital, Sengkang General Hospital and KK Women's and Children's Hospital

National Specialty Centres (Tertiary Specialty Care):

National Cancer Centre Singapore, National Dental Centre Singapore, National Heart Centre Singapore, National Neuroscience Institute, and Singapore National Eye Centre

SingHealth Polyclinics (Primary Care):

Bedok, Bukit Merah, Marine Parade, Outram, Pasir Ris, Punggol, Sengkang, Tampines, Eunos, Tampines North (expected completion: 2023) and Kaki Bukit (expected completion: 2025)

SingHealth Community Hospitals (Intermediate and Long-term Care):

Bright Vision Hospital, Sengkang Community Hospital, and Outram Community Hospital

[1] Teo, Z.L., Ting, D.S.W. Non-fungible tokens for the management of health data. Nat Med (2023). https://doi.org/10.1038/s41591-022-02125-2

International symposium on nature-based solutions in urban water systems

Meeting Announcement

PENSOFT PUBLISHERS

NICHES International Symposium on Nature-based Solutions in Urban Water Systems 


Barcelona, Spain, 6 March 2023 – The Biodiversa+ and WaterJPI-funded project NICHES held a two-part International Symposium on Nature-based Solutions in Urban Water Systems. The first part was held in English and welcomed both in-person and online participants, focusing on introducing the project and its five case-study cities. The second part, held in Catalan and Spanish, targeted local stakeholders and fostered a discussion around the desired future for the urban water system in the NICHES case-study city of Barcelona and the Barcelona Metropolitan Area (AMB). Around 20 stakeholders attended in person and 30 joined online. Key messages of the day are outlined below.

The Symposium began by detailing the NICHES co-design processes and how the project builds on traditional approaches to urban water management and uses nature-based solutions (NBS) for mitigating and preventing combined sewer overflow (CSO) events. “NICHES utilises a social-ecological-technical systems (SETS) approach for integrating innovative ideas for water management in cities, with the aim of increasing the resilience and sustainability of urban waterscapes and aquatic biodiversity,” said project coordinator McKenna Davis from the Ecologic Institute in Berlin in her opening words.

Part 1: NICHES cities and co-design arenas  

Part 1 focused on sharing a perspective on hazards and social vulnerabilities from New York City in the USA, presented by PhD candidate Pablo Herreros. He outlined how the city's current water infrastructure is unable to absorb the extreme precipitation caused by climate change.

The American perspective was reinforced by an exploration of another coastal city and NICHES case-study - Boston - presented by project partner Matthew Eckelman from Northeastern University. “As a coastal city, Boston is experiencing the severe consequences of climate change and to address this, the city has embraced NBS for managing stormwater run-off, preventing CSO events, and dissipating energy from waves to protect the coast,” shares Dr Eckelman. Various green infrastructures have been set up in Boston such as infiltration chambers, drywells, and permeable pavements.  

The NICHES Symposium continued by illustrating the unique challenges and state of NBS in Europe - namely in Berlin and Rotterdam - presented by NICHES partners Gregory Fuchs from the Ecologic Institute in Berlin and Dr Lisette de Senerpont Domis from the Netherlands Institute of Ecology (NIOO-KNAW). Berlin’s infrastructure is predominantly grey, which impedes the natural flow of water bodies and causes an accumulation of wastewater. This, combined with the impact of climate change, has led to a severe volume of overflow events. Berlin has seen a wave of transformative policies aimed at integrating NBS into solutions to these problems.

In the case of Rotterdam, due to being below sea level, the city experiences flooding and challenges with saltwater infiltration. As a major European port, heavy industrialisation is endemic to the city, which creates additional difficulties with water management. Furthermore, Rotterdam has a traditionally built sewage system prone to CSO events. Dr. De Senerpont Domis outlined that “the city’s ambition is to transform urban spaces to include more green water retention areas alongside grey solutions such as rainwater squares.” 

Part 2: Barcelona stakeholder workshop 

Part 2 of the NICHES International Symposium focused on discussing the case-study city of Barcelona and the AMB with local stakeholders. Around 20 practitioners and academics from the field, as well as representatives from civil society participated in the discussion. Dr. Sara Maestre Andrés led the workshop, which aimed to gauge stakeholders’ views on the current state and desired future for the urban water system in the AMB.  

Divided into smaller groups, the participants discussed the resilience of the current water management system and agreed it functions well in dry conditions but during heavier precipitation events, combined sewer overflows happen regularly. Another challenge stakeholders identified is that the region is facing extreme droughts. “We lack water, and yet we drain the rainwater and allow it to get contaminated with residual waters,” shared a participant.  

In sum, the stakeholder workshop participants see an urgent need for changes in the water management of AMB, incorporating NBS for reuse of rainwater as well as increasing public awareness and inclusive decision-making. NICHES aims to co-create possible transition pathways with the stakeholders so as to enable a systemwide shift towards more resilient urban waterscapes in the AMB.  

For more information about NICHES visit www.niches-project.eu.

**** 

This project was funded through the 2020-2021 Biodiversa and Water JPI joint call for research proposals, under the BiodivRestore ERA-Net COFUND programme, and with the funding organisations: German Federal Ministry of Education and Research, Agencia Estatal de Investigación, Ministry of Agriculture, Nature and Food Quality of the Netherlands. 

Uncovering the real paleo diet: Scientific team wins HFSP research grant

An international research team including Tina Lüdecke from MPI for Chemistry won a Research Grant from the Human Frontier Science Program (HFSP) – Can they find a new way to look deeper into the isotopic composition of amino acids in tooth enamel?

Grant and Award Announcement

MAX PLANCK INSTITUTE FOR CHEMISTRY

Dentition of a modern baboon 

IMAGE: Dentition of a modern baboon (Papio ursinus). These savanna-dwelling omnivores provide a prime analogous model for early hominin evolution: they evolved and radiated in parallel with hominins within a similar landscape and time frame. Tina Lüdecke and her colleagues use tooth enamel from dentitions like this to develop the new method.

CREDIT: Tina Lüdecke

Emmy Noether Group leader Tina Lüdecke from the Max Planck Institute for Chemistry (MPIC) in Mainz, has been awarded a prestigious and highly competitive Human Frontier Science Program (HFSP) Research Grant along with Cajetan Neubauer from the University of Colorado Boulder (Institute of Arctic and Alpine Research) and Rani Bakkour from the Technical University of Munich (TUM).

The three-year funding, around U.S. $1 million in total, will support the international scientific team led by principal investigator Cajetan Neubauer to work on the project “Uncovering the real paleo diet: Novel isotope analytics of amino acids from fossil hominin teeth”. The team aims to develop a new method to measure the isotopic composition of amino acids in tooth enamel and thus get more details on the hominin diet.

“Much of our understanding of the relationship between hominin diet and evolution is based on anatomical and archaeological information derived from hominin fossils,” explains Tina Lüdecke.

Direct chemical evidence of paleodiets has also been measured, in fossil bone collagen and tooth enamel, in the form of stable carbon isotope patterns that are indicative of food intake. Proteins and amino acids are likely preserved even in enamel that is millions of years old, and their isotopic compositions could provide specific insights into how ecosystem use and dietary changes in human prehistory shaped human biology, societies, and cultures. “Unfortunately, currently no technique exists that can reveal paleodiet signatures from fossil amino acids. This is what we want to change with our funded project”, she said.
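For context, stable carbon isotope results of this kind are conventionally reported in delta notation, which compares the 13C/12C ratio of a sample with that of a reference standard. The small Python helper below shows the standard textbook calculation; it is a generic illustration, not code from the funded project, and the reference ratio used is only the commonly quoted VPDB value.

```python
# Standard delta notation for stable carbon isotopes (textbook formula, not
# project code): delta13C = (R_sample / R_standard - 1) * 1000, in permil,
# where R is the 13C/12C ratio and the standard is commonly VPDB (R ~ 0.01118).
def delta13C(r_sample: float, r_standard: float = 0.011180) -> float:
    """Return the delta-13C value in permil relative to the chosen standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Example: a slightly 13C-depleted sample, typical of C3-plant-based diets.
print(round(delta13C(0.010880), 1))   # about -26.8 permil
```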

“The winners in this year’s HFSP Research Grant Program are remarkable scientists pioneering life science research that needs international collaboration and basic science in frontier subjects – that is, investigations for which there are no prior studies,” said Pavel Kabat, HFSP Secretary-General. “I was thrilled with the proposals we received and look forward to the ground-breaking discoveries that will be revealed.”

New Method explored

“Our team hypothesizes that recent advances open a path to achieve the ultimate dream for the isotope approach to learning about human evolution: highly sensitive detection of intact fossilized metabolites and full description of the paleodietary information they record,” says Cajetan Neubauer. This international grant brings together a team that has key complementary skills in analytical chemistry, isotope analytics, and paleoanthropology, says Neubauer.

Tina Lüdecke’s group has recently developed a method to analyze nitrogen isotopes in bulk tooth enamel to evaluate early hominin meat consumption for the first time. However, amino acid analyses are highly desirable to clarify which animal resources were consumed (carnivores or herbivores), whether fish or mushrooms were utilized, the role of breastfeeding, and whether our ancestors hunted or instead scavenged. Most importantly, enamel amino acids could provide information about the use of fire, believed to be crucial for the evolution of large brains, as cooked foods provide much more energy than raw ones.

“My team collects samples, i.e. teeth from hominins, but first of all from recent and fossil large mammals to develop and test the method and then to evaluate the results paleoarcheologically,” Lüdecke explains.

Molecular imprinting at TUM

Rani Bakkour and his team will then isolate very small amounts of amino acids from this enamel. The researchers from TUM have extensive expertise in environmental analytical chemistry, where they synthesize and evaluate highly selective materials for the extraction of aquatic contaminants. "We synthesize macromolecules that can recognize only one molecule at a time, a technique known as molecular imprinting," explains Bakkour. "We employ this technique to isolate minute amounts of contaminants such as glyphosate from complex mixtures." This method is particularly exciting in paleoanthropology. "Selectivity is key, given the very small amounts of amino acids in tooth enamel and the very small size of the precious fossil samples."

Novel isotope analysis at CU Boulder

Cajetan Neubauer will then measure these amino acids in Colorado with a newly developed isotopic technique (Iso-Orbi). What Neubauer has developed at CU Boulder is an innovative and powerful isotopic technique that brings isotope analysis into the realm of structural chemistry. It allows measurement of isotopic “fingerprints” in polar chemical compounds by electrospray-Orbitrap mass spectrometry. Iso-Orbi reveals the isotopic anatomy of amino acids and can thereby provide a wealth of new multi-elemental and structural isotopic information that the team anticipates will reflect paleodiet and environmental factors.

“Our goal is to develop a new way to look deeper into fossil molecules that will transform anthropology by exploring this emerging frontier,” summarizes Tina Lüdecke.

Toward ubiquitous and intelligent 6G networks: From architecture to technology

Peer-Reviewed Publication

SCIENCE CHINA PRESS

Special Topic: Spectrum, Coverage, and Enabling Technologies for Intelligent 6G 

IMAGE: Science China Information Sciences organized a Special Topic on Spectrum, Coverage, and Enabling Technologies for Intelligent 6G (Vol. 66, Issue 3, 2023).

CREDIT: ©Science China Press

The sixth-generation (6G) network is envisioned to support growing numbers of Internet connections with requirements for higher transmission rates, higher reliability, lower latency, and more. Conventional methods struggle to meet such demands, and the newly emerging technology of artificial intelligence (AI) is a promising way to empower the 6G communication system. In recent years, there has been a growing body of research on intelligent 6G, including wireless technologies that use the full spectrum and enlarge coverage. Besides, edge learning, reconfigurable intelligent surfaces, and cell-free technology have been proposed as significant enabling technologies for intelligent 6G. To promote research in this area, SCIENCE CHINA Information Sciences has organized a special topic on spectrum, coverage, and enabling technologies for intelligent 6G.

Edge learning is a typical emerging research area for the future 6G era, and it places stringent demands on latency, reliability, and capacity. One way to accommodate these requirements is to apply integrated sensing, computing, and communication (ISCC) technology. In the contribution entitled “Pushing AI to wireless network edge: an overview on integrated sensing, communication, and computation towards 6G,” Zhu et al. provide a comprehensive overview of ISCC for AI applications by introducing representative works on three application scenarios, i.e., centralized edge learning, federated edge learning, and edge inference.

Full spectrum is a key technology for supporting ubiquitous connectivity and realizing Tbps-scale data rates in 6G wireless networks. In the contribution entitled “SpectrumChain: a disruptive dynamic spectrum-sharing framework for 6G,” Wu et al. introduce a blockchain-based dynamic spectrum sharing (DSS) framework. By utilizing the advantages of blockchain decentralization, transparency, and traceability, the spectrum provider and requestor can carry out spectrum sharing without any third-party proxy. Furthermore, the authors propose a hierarchical blockchain DSS framework to exploit wider spectrum and achieve lower latency. Research on blockchain-based DSS is currently at a preliminary stage, and more research efforts are required.

6G satellite-terrestrial integrated network (STIN) is promising for the improvement of wireless coverage. In the contribution entitled “Coverage enhancement for 6G satellite-terrestrial integrated networks: performance metrics, constellation configuration and resource allocation,” Sheng et al. focus on improving the wireless coverage capability in 6G STINs, and summarize the performance metrics and critical technologies of service coverage structure. They investigate the impact of satellite constellation configuration and present a suitable network structure. Besides, intelligent resource scheduling and satellite-terrestrial collaborative computing are studied, followed by the research challenges and future directions.

Reconfigurable intelligent surface (RIS) has been recognized as an essential enabling technology for 6G networks. In the contribution entitled “Reconfiguring wireless environments via intelligent surfaces for 6G: reflection, modulation, and security,” Xu et al. elaborate on two functions of RIS, i.e., reflection and modulation, as well as their benefits to wireless communication systems. In addition, the authors also propose a typical case study to exemplify the benefits of RIS for secure communications.

Cell-free massive MIMO (CF-mMIMO) is considered a key technique for realizing extremely high spectral efficiency and ultra-reliable low-latency transmission in 6G. In the contribution “Full-spectrum cell-free RAN for 6G systems: system design and experimental results,” Wang et al. propose a full-spectrum cell-free radio access network (CF-RAN) architecture to balance performance and complexity. Key transmission techniques, including channel information acquisition, transceiver design, and dynamic resource allocation, are introduced to support the full-spectrum CF-RAN.

In addition, experimental results of a prototype system are presented to demonstrate the superior performance of the proposed architecture. Beyond research on enabling technologies, there is also a growing body of theoretical research related to intelligent 6G. In the contribution entitled “6G extreme connectivity via exploring spatiotemporal exchangeability,” You reveals that the requirement for extremely low-latency communication in 6G results in a phenomenon called the channel capacity collapse effect. Based on the spatiotemporal exchangeability theory of MIMO channels, spatiotemporal 2-D channel coding is discussed under rich and sparsely scattering channels. It is expected to be a promising approach to enabling 6G extreme connectivity.

Please find below details of this Special Topic: Spectrum, Coverage, and Enabling Technologies for Intelligent 6G.

  1. Xu W, Huang Y M, Wang W, et al. Toward ubiquitous and intelligent 6G networks: from architecture to technology. Sci China Inf Sci, 2023, 66(3): 130300

https://link.springer.com/article/10.1007/s11432-023-3704-8

  2. Zhu G X, Lyu Z H, Jiao X, et al. Pushing AI to wireless network edge: an overview on integrated sensing, communication, and computation towards 6G. Sci China Inf Sci, 2023, 66(3): 130301

https://link.springer.com/article/10.1007/s11432-022-3652-2

  3. Wu Q H, Wang W, Li Z G, et al. SpectrumChain: a disruptive dynamic spectrum-sharing framework for 6G. Sci China Inf Sci, 2023, 66(3): 130302

https://link.springer.com/article/10.1007/s11432-022-3692-5

  4. Sheng M, Zhou D, Bai W G, et al. Coverage enhancement for 6G satellite-terrestrial integrated networks: performance metrics, constellation configuration and resource allocation. Sci China Inf Sci, 2023, 66(3): 130303

https://link.springer.com/article/10.1007/s11432-022-3636-1

  5. Xu J D, Yuen C, Huang C W, et al. Reconfiguring wireless environment via intelligent surfaces for 6G: reflection, modulation, and security. Sci China Inf Sci, 2023, 66(3): 130304

https://link.springer.com/article/10.1007/s11432-022-3626-5

  6. Wang D M, You X H, Huang Y M, et al. Full-spectrum cell-free RAN for 6G systems: system design and experimental results. Sci China Inf Sci, 2023, 66(3): 130305

https://link.springer.com/article/10.1007/s11432-022-3664-x

  7. You X H. 6G extreme connectivity via exploring spatiotemporal exchangeability. Sci China Inf Sci, 2023, 66(3): 130306

https://link.springer.com/article/10.1007/s11432-022-3598-4

It’s not as difficult as you think to shout upwind

Researchers unveil and explain a common-sense misunderstanding


AALTO UNIVERSITY

Car model of shouting upwind 

IMAGE: To make the measurements, a car was used to move a model of a shouter, generating wind past it.

CREDIT: Ville Pulkki / Aalto University

For years, Ville Pulkki has been wondering why it feels so difficult to shout upwind. The sensation is common enough to have found its way into an idiom about not being understood. But Pulkki, a professor of acoustics at Aalto University, wanted a scientific explanation for the phenomenon – and there wasn't one.

In a new study published in Nature’s Scientific Reports, Pulkki’s research team showed that our common sense understanding of this situation is wrong. It isn’t harder to shout into the wind; it’s just harder to hear yourself.

In fact, acousticians have long known that sound carries better within the first 100 metres upwind. Many people have noticed that a siren sounds louder as it approaches and then quieter as it moves away. The mechanics behind this are similar to the Doppler effect, in which the frequency of a sound changes as its source moves.
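For reference, the Doppler relation mentioned here describes how the perceived frequency shifts for a source moving relative to a stationary listener. The short calculation below is the standard textbook formula, not part of the Aalto study, and the siren frequency and speeds are illustrative values.

```python
# Standard textbook Doppler relation (not from the study): the frequency heard by
# a stationary listener rises as a sound source approaches and falls as it recedes.
# c is the speed of sound in air; v_source is the source speed toward the listener.
def doppler_frequency(f_source: float, v_source: float, c: float = 343.0) -> float:
    """Perceived frequency for a source moving at v_source m/s toward (+) or
    away from (-) a stationary observer."""
    return f_source * c / (c - v_source)

print(round(doppler_frequency(700.0, +20.0)))   # approaching siren: about 743 Hz
print(round(doppler_frequency(700.0, -20.0)))   # receding siren:   about 661 Hz
```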

Pulkki’s earlier research had confirmed that wind doesn’t affect the emanation pattern of speech, so there was no reason why shouting into the wind would be difficult. He therefore asked one of his master's students, Rapolas Daugintis, to study whether the phenomenon was due to how we hear. Daugintis carried out measurements and simulations to test the idea, and Senior Researcher Timo Lähivaara from the University of Eastern Finland contributed acoustic and flow field simulations.  

Their results were surprising but simple: it’s harder for people to hear themselves when shouting upwind. 

‘When someone shouts upwind, their ears are situated downwind from their mouth, which means that their ears receive less sound – it’s harder for them to hear their shout than when there’s no wind,’ says Pulkki.

The same thing happens when someone is moving quickly even if there’s no wind blowing – if you’re cycling, for example. As a person bikes, their motion generates a wind around their head even in stationary air, and they end up shouting because they can’t hear their own voice well.

So be careful what you shout upwind, for others might hear you just fine, even if you don’t. This information is particularly useful for people who work with sound, such as musicians.

‘My musician friend told me that when they have to sing on a sailboat, they always sit with their back against the wind in order to not strain their voice. The same phenomenon is at play here: because it’s harder for my friend to hear themself when singing upwind, it makes them unknowingly sing louder than usual,’ says Pulkki.

A more precise model of the Earth's ionosphere

With the help of neural networks, the complexity of the layer around the Earth can be reconstructed much better than before. This is important for satellite navigation, among other things.

Peer-Reviewed Publication

GFZ GEOFORSCHUNGSZENTRUM POTSDAM, HELMHOLTZ CENTRE

Model of the Ionosphere 

IMAGE: Electron density of the ionosphere around the Earth at a certain point in time: high values in red, low values in blue. The white line marks the geomagnetic equator.

CREDIT: CC BY 4.0 Smirnov et al. (2023), Scientific Reports (https://doi.org/10.1038/s41598-023-28034-z)

Summary

The ionosphere – the region of geospace spanning from 60 to 1000 kilometres above the Earth – impairs the propagation of radio signals from global navigation satellite systems (GNSS) with its electrically charged particles. This is a problem for the ever higher precision required by these systems – both in research and for applications such as autonomous driving or precise orbit determination of satellites. Models of the ionosphere and its uneven, dynamic charge distribution can help correct the signals for ionospheric delays, which are one of the main error sources in GNSS applications. Researchers led by Artem Smirnov and Yuri Shprits of the GFZ German Research Centre for Geosciences have presented a new model of the ionosphere in the journal Scientific Reports, developed on the basis of neural networks and 19 years of satellite measurement data. In particular, it can reconstruct the topside ionosphere, the upper, electron-rich part of the ionosphere, much more precisely than before. It is thus also an important basis for progress in ionospheric research, with applications in studies on the propagation of electromagnetic waves or for the analysis of certain space weather events, for example.

Background: Importance and complexity of the ionosphere

The Earth's ionosphere is the region of the upper atmosphere that extends from about 60 to 1000 kilometres in altitude. Here, charged particles such as electrons and positive ions dominate, produced by the radiation activity of the Sun – hence the name. The ionosphere is important for many scientific and industrial applications because the charged particles influence the propagation of electromagnetic waves such as radio signals. The so-called ionospheric propagation delay of radio signals is one of the most important sources of interference for satellite navigation. It is proportional to the electron density integrated along the traversed signal path. Therefore, a good knowledge of the electron density can help in correcting the signals. The upper region of the ionosphere, above 600 kilometres, is of particular interest, since 80 per cent of the electrons are gathered in this so-called topside ionosphere.
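As a rough worked example, the first-order ionospheric group delay for a single-frequency GNSS signal is commonly approximated as 40.3 times the total electron content divided by the square of the carrier frequency. This is the standard textbook relation, not a formula taken from the paper, and the numbers below are only illustrative.

```python
# Standard first-order approximation of the ionospheric group delay for GNSS
# signals (textbook relation, not taken from the paper): delay ~ 40.3 * TEC / f^2,
# with TEC in electrons per square metre and the carrier frequency f in hertz.
def ionospheric_delay_m(tec_el_per_m2: float, freq_hz: float) -> float:
    """Extra signal path length in metres caused by the ionosphere."""
    return 40.3 * tec_el_per_m2 / freq_hz**2

# Example: 30 TEC units (1 TECU = 1e16 electrons/m^2) on the GPS L1 frequency.
tecu = 30 * 1e16
print(round(ionospheric_delay_m(tecu, 1.57542e9), 2))   # roughly 4.9 m of delay
```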

The problem is that the electron density varies greatly – depending on longitude and latitude, the time of day and year, and solar activity. This makes the electron density difficult to reconstruct and predict, and such reconstructions are the basis for correcting radio signals, for example.

Previous models

There are various approaches to modelling the electron density in the ionosphere, among them the International Reference Ionosphere model (IRI), which has been the recognised standard since 2014. It is an empirical model that establishes a relationship between input and output variables based on the statistical analysis of observations. However, it still has weaknesses in the important region of the topside ionosphere because of the limited coverage of previously collected observations there.

Recently, however, large amounts of data have become available for this region. Machine learning (ML) approaches therefore lend themselves to deriving regularities from these data, especially for complex non-linear relationships.

New approach using machine learning and neural networks

A team from the GFZ German Research Centre for Geosciences led by Artem Smirnov, PhD student and first author of the study, and Yuri Shprits, head of the “Space Physics and Space Weather” section and Professor at the University of Potsdam, took a new ML-based empirical approach. For this, they used 19 years of data from satellite missions – in particular CHAMP, GRACE and GRACE-FO, in which the GFZ was and is significantly involved, as well as COSMIC. The satellites measured – among other things – the electron density at different altitudes in the ionosphere and cover different times of year and day as well as solar cycles.

With the help of neural networks, the researchers then developed a model for the electron density of the topside ionosphere, which they call the NET model. They used the so-called MLP method (multi-layer perceptrons), which iteratively learns the network weights to reproduce the data distributions with very high accuracy.
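As a loose illustration of this model family (not the authors' NET code), the short Python sketch below fits a small multi-layer perceptron regressor to synthetic electron-density samples; the input features, network size and toy data are placeholders chosen only to keep the example self-contained.

```python
# Loose illustration of multi-layer perceptron regression, not the authors' NET
# model: feature names, network size and the synthetic data are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
# Toy inputs: altitude [km], local time [h], day of year, and an F10.7-like solar index.
X = np.column_stack([
    rng.uniform(500, 1000, n),    # altitude
    rng.uniform(0, 24, n),        # local time
    rng.uniform(1, 365, n),       # day of year
    rng.uniform(70, 200, n),      # solar activity proxy
])
# Synthetic log electron density with a day/night cycle and a solar-activity trend.
y = (11.5 - 0.002 * X[:, 0]
     + 0.3 * np.sin(2 * np.pi * X[:, 1] / 24)
     + 0.004 * X[:, 3]
     + rng.normal(0, 0.05, n))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0))
model.fit(X_train, y_train)
print(f"R^2 on held-out samples: {model.score(X_test, y_test):.3f}")
```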

The researchers tested the model with independent measurements from three other satellite missions.

Evaluation of the new model

“Our model is in remarkable agreement with the measurements: It can reconstruct the electron density very well in all height ranges of the topside ionosphere, all around the Globe, at all times of the year and day, and at different levels of solar activity, and it significantly exceeds the International Reference Ionosphere Model IRI in accuracy. Moreover, it covers space continuously,” first author Artem Smirnov sums up.

Yuri Shprits adds: “This study represents a paradigm shift in ionospheric research because it shows that ionospheric densities can be reconstructed with very high accuracy. The NET model reproduces the effects of numerous physical processes that govern the dynamics of the topside ionosphere and can have broad applications in ionospheric research.”

Possible applications in ionosphere research

The researchers see possible applications, for instance, in wave propagation studies, for calibrating new electron density data sets with often unknown baseline offsets, for tomographic reconstructions in the form of a background model, as well as to analyse specific space weather events and perform long-term ionospheric reconstructions. Furthermore, the developed model can be connected to plasmaspheric altitudes and thus can become a novel topside option for the IRI.

The developed framework allows the seamless incorporation of new data and new data sources. The retraining of the model can be done on a standard PC and can be performed on a regular basis. Overall, the NET model represents a significant improvement over traditional methods and highlights the potential of neural network-based models to provide a more accurate representation of the ionosphere for communication and navigation systems that rely on GNSS.

Better superconductors with palladium

A Goldilocks material that might be just right: the precious metal palladium could be used to make superconductors that remain superconducting even at relatively high temperatures, show calculations by TU Wien.

Peer-Reviewed Publication

VIENNA UNIVERSITY OF TECHNOLOGY

It is one of the most exciting races in modern physics: How can we produce the best superconductors that remain superconducting even at the highest possible temperatures and ambient pressure? In recent years, a new era of superconductivity has begun with the discovery of nickelates. These superconductors are based on nickel, which is why many scientists speak of the “nickel age of superconductivity research”. In many respects, nickelates are similar to cuprates, which are based on copper and were discovered in the 1980s.

But now a new class of materials is coming into play: In a cooperation between TU Wien and universities in Japan, it was possible to simulate the behaviour of various materials more precisely on the computer than before. There is a “Goldilocks zone” in which superconductivity works particularly well. And this zone is reached neither with nickel nor with copper, but with palladium. This could usher in a new “age of palladates” in superconductivity research. The results have now been published in the scientific journal Physical Review Letters.

The search for higher transition temperatures

At high temperatures, superconductors behave very similarly to other conducting materials. But when they are cooled below a certain threshold, they change dramatically: their electrical resistance disappears completely and they can suddenly conduct electricity without any loss. This limit, at which a material changes between a superconducting and a normally conducting state, is called the “critical temperature”.

"We have now been able to calculate this  "critical temperature" for a whole range of materials. With our  modelling on high-performance computers, we were able to predict the phase diagram of nickelate superconductivity with a high degree of accuracy, as the experiments then showed later," says Prof. Karsten Held from the Institute of Solid State Physics at TU Wien.

Many materials become superconducting only just above absolute zero (-273.15°C), while others retain their superconducting properties even at much higher temperatures. A superconductor that still remains superconducting at normal room temperature and normal atmospheric pressure would fundamentally revolutionise the way we generate, transport and use electricity. However, such a material has not yet been discovered. Nevertheless, high-temperature superconductors, including those from the cuprate class, play an important role in technology - for example, in the transmission of large currents or in the production of extremely strong magnetic fields.

Copper? Nickel? Or Palladium?

The search for the best possible superconducting materials is difficult: there are many different chemical elements that come into question. You can put them together in different structures, you can add tiny traces of other elements to optimise superconductivity. "To find suitable candidates, you have to understand on a quantum-physical level how the electrons interact with each other in the material," says Prof. Karsten Held.

This showed that there is an optimum for the interaction strength of the electrons. The interaction must be strong, but also not too strong. There is a “golden zone” in between that makes it possible to achieve the highest transition temperatures.

Palladates as the optimal solution

This golden zone of medium interaction can be reached neither with cuprates nor with nickelates - but one can hit the bull's eye with a new type of material: so-called palladates. "Palladium is directly one line below nickel in the periodic table. The properties are similar, but the electrons there are on average somewhat further away from the atomic nucleus and each other, so the electronic interaction is weaker," says Karsten Held.

The model calculations show how to achieve optimal transition temperatures for palladates. "The computational results are very promising," says Karsten Held. "We hope that we can now use them to initiate experimental research. If we have a whole new, additional class of materials available with palladates to better understand superconductivity and to create even better superconductors, this could bring the entire research field forward."

Algae in Swedish lakes provide insights to how complex life on Earth developed

Peer-Reviewed Publication

LUND UNIVERSITY

By studying green algae in Swedish lakes, a research team, led by Lund University in Sweden, has succeeded in identifying which environmental conditions promote multicellularity. The results give us new clues to the amazing paths of evolution.

The evolution of multicellular life has played a pivotal role in shaping biological diversity. However, we have up until now known surprisingly little about the natural environmental conditions that favour the formation of multicellular groups.

The cooperation between cells within multicellular organisms has enabled eyes, wings and leaves to evolve. The predominant explanation for why multicellularity evolves is that being in a group enables species to better cope with environmental challenges – where being in a large group can, for instance, protect cells against being eaten.

"Our results challenge this idea, showing that multicellular groups form, not because they are inherently beneficial, but rather as a by-product of single-celled strategies to reduce environmental stress. In particular, cells produce a range of substances to protect themselves from the environment and these substances appear to prevent daughter cells from dispersing away from their mother cell", says Charlie Cornwallis, biology researcher at Lund University.

To understand how and why single-celled organisms evolve to be multicellular, the scientists experimented on green algae in which some species are always single-celled, some are single-celled but become multicellular under certain conditions, while others are always multicellular, containing thousands of cells. They could then identify the environmental conditions that promote multicellularity and find out the benefits and costs for organisms. The researchers then combined these data with information on the environments that single-celled and multicellular green algae are adapted to across the whole of Sweden.

"I was surprised that there were no benefits or costs to living in multicellular groups. The conditions that individual cells experience can be extremely different when swimming around on their own, to being stuck to other cells and having to coordinate activities. Imagine you were physically tied to your family members, I think it would have quite an effect on you", says Charlie Cornwallis.

The study was conducted in Swedish lakes, and it not only provides information on which green algae occur where, and why – it also helps us understand the origins of biological diversity that shape the world around us.

"The results of this study contribute to our understanding of how complex life on Earth has evolved. They also provide information on how a key group of species – green algae that generate fuel for ecosystems – are able to reproduce and survive under different environmental conditions. The next time you walk along the shores of a lake rich in nitrogen just imagine that this fosters the evolution of multicellular life", says Charlie Cornwallis.

120-year-old storm’s secrets key to understanding weather risks

Peer-Reviewed Publication

UNIVERSITY OF READING

A severe windstorm that battered the UK more than a century ago produced some of the strongest winds that Britain has ever seen, a team of scientists has found after recovering old weather records.

Old weather measurements, first recorded on paper after Storm Ulysses hit the UK in February 1903, have shed new light on what was one of the most severe storms to have hit the British Isles.

By turning hand-written weather data into digital records, the research team has paved the way to a better understanding of other historical storms, floods and heatwaves. These observations from the past can help experts to understand the risks of extreme weather now and in the future.

Professor Ed Hawkins, a climate scientist at the University of Reading and the National Centre for Atmospheric Science, led the research. He said: “We knew the storm we analysed was a big one, but we didn’t know our rescued data would show that it is among the top four storms for strongest winds across England and Wales. 

“This study is a great example of how rescuing old paper records can help us to better understand storms from decades gone by. Unlocking these secrets from the past could transform our understanding of extreme weather and the risks they pose to us today.”

Into the archives

Published today (Monday, 24 April) in Natural Hazards and Earth System Sciences, the research indicates that many storms that occurred before 1950 remain unstudied because billions of pieces of data exist only on paper, stored in archives around the world.

But a team of scientists led by Professor Hawkins delved into the archives to convert hand-written observations relating to Storm Ulysses from paper to digital. The cyclone caused multiple deaths and heavily damaged infrastructure and ships when it passed across Ireland and the UK between 26 and 27 February 1903. 

Using the new digital data, the research team was able to use techniques similar to modern weather forecasting to simulate the storm and accurately assess the strength of Storm Ulysses' winds. Comparisons with independent weather observations, such as rainfall data, as well as photographs and written accounts from 1903 that outlined the devastation caused by the cyclone, helped to provide credibility for the reconstruction.

The reanalysis is beneficial for understanding the risks of extreme weather events, as it showed that the winds experienced in some locations during Storm Ulysses would be rarer than once in 100 years. Having information about such a rare event provides valuable insight into the potential damage a similar storm could cause now and in the future.
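To illustrate what "rarer than once in 100 years" means in practice, the snippet below estimates an empirical return period from a series of annual maximum gusts using a common plotting-position rule. The numbers are invented for illustration and are not Storm Ulysses data or the study's method.

```python
# Illustration of how a return period is read off a record of annual maxima
# (invented numbers, not Storm Ulysses data): with the Weibull plotting position,
# the return period of a value exceeded m times in n years is roughly (n + 1) / m.
import numpy as np

rng = np.random.default_rng(1)
# Invented series of annual maximum gust speeds (m/s) over 120 years.
annual_max_gust = rng.gumbel(loc=28.0, scale=4.0, size=120)

def return_period_years(series, value):
    """Approximate return period of reaching `value`, using (n + 1) / m,
    where m is the number of years at or above that value in an n-year record."""
    series = np.asarray(series)
    n = len(series)
    m = np.sum(series >= value)
    return np.inf if m == 0 else (n + 1) / m

# Estimated return period, in years, of a 45 m/s annual maximum gust.
print(round(return_period_years(annual_max_gust, 45.0), 1))
```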

The 1903 storm is named Storm Ulysses because the damage to thousands of trees in Dublin is mentioned in the novel Ulysses by James Joyce, the events of which are set the year after the storm.

Rescuing the weather

The rescuing of atmospheric observations related to Storm Ulysses is not the first time Professor Ed Hawkins has led weather record recovery. National rainfall data from as far back as 1836 became available in 2022 after the University’s Department of Meteorology and 16,000 volunteers helped to restore 5.2 million observations. 

The Rainfall Rescue project provided more context around recent changes in rainfall due to human-caused climate change.