Thursday, September 21, 2023

 

Plant and forest researchers: do not “anthropomorphize” plants


Many attributions in popular publications lack scientific evidence


Peer-Reviewed Publication

HEIDELBERG UNIVERSITY



Plants are often attributed with abilities similar to those known in the animal or human world. Trees are said to have feelings and therefore to be able to care for their offspring like mothers. In an article in the review journal Trends in Plant Science, 32 international plant and forest researchers followed up on such assertions. Led by Prof. David G. Robinson, professor emeritus for cell biology at the Centre for Organismal Studies (COS) of Heidelberg University, the researchers analysed the claims in two popular publications on forests and concluded that conjecture is being equated with fact. They warn against “anthropomorphising” plants.

The article scrutinised the assertions in two widely read books about the hidden life of trees and the search for the so-called “mother tree”. The researchers report that in those works, trees are attributed with human characteristics and behaviours, including the ability to feel pain and pleasure, to communicate with one another, and to act altruistically. Based on the existing research literature, Prof. Robinson and his co-authors provide detailed evidence that the main assertions are scientifically untenable. The Heidelberg researcher points out that numerous research papers on the significance of intraspecies competition clearly contradict the contention that trees of a single species support one another and keep each other alive.

According to Prof. Robinson and his colleagues, newer studies also render the “mother tree concept” untenable. Many publications based on this concept that claim to substantiate a targeted transfer of carbon from older to younger trees via networked fungi – the mycorrhizae – are flawed due to a lack of controls. “And where the data does actually suggest such a transfer, the quantity of carbon transferred is so small that it is physiologically completely irrelevant for the recipient tree”, states Prof. Robinson. The researchers also criticise both books for citing sources that were not peer-reviewed.

Finally, the authors point out the potentially disastrous consequences such claims could have for the adaptation of forests to climate change if political decisions are “based on pleasant-sounding but false messages” rather than scientific fact, Robinson adds. The article's authors include researchers from the University of Göttingen as well as from Austria, Canada, Chile, Great Britain, Ireland, Israel, Spain, Sweden, Switzerland, and the USA. They represent the fields of biology, forestry, and plant science.

 

From 'Money Heist' to 'Squid Game': the glocal strategy that keeps Netflix afloat


Non-English-language series accounted for 38% of the series with the most accumulated viewing hours in more than fifty countries between June 2021 and December 2022

Peer-Reviewed Publication

UNIVERSITAT OBERTA DE CATALUNYA (UOC)




Money Heist is the most watched Spanish series of all time thanks to Netflix, which launched it internationally after purchasing it from Atresmedia. The last episode premiered in autumn 2021, a few weeks after the North American company began broadcasting Squid Game, the South Korean series that became the most viewed content on the platform. These successes are no coincidence. "It's not that a series suddenly greatly appeals to people. It's the result of a meticulous strategy involving recommendations, dubbing, highly engaging marketing, and serving up the content in a thousand ways so that you end up watching it."

That is the explanation given by the researcher and member of the GAME group, Elena Neira, a course instructor in the Faculty of Information and Communication Sciences at the Universitat Oberta de Catalunya (UOC), and the co-author of a research article published this summer that focuses on Netflix's "glocal" strategy. To put it another way, this is the company's commitment to producing and offering non-English-language content created outside the United States. According to the article, series in languages other than English accounted for 38% of the series with the most accumulated viewing hours on the platform in more than fifty countries between June 2021 and December 2022.

The research, co-authored with the UOC faculty members Judith Clares Gavilán and Jordi Sánchez Navarro, focuses on the key factors in the delocalization of the company's productions at a time when, in the wake of the pandemic bubble, Netflix is the only streaming video platform that is not losing money.

According to Neira, who is a specialist in new audiovisual distribution models, Netflix's glocal strategy is a response to the need to be more profitable, since production is much cheaper in Spain or South Korea, for example. The strategy also reinforces the company's brand image and, at the same time, enables it to comply with the European directive stipulating that 30% of a catalogue's content must be created in Europe.

The three UOC experts used the data published by the company on the Netflix Top 10 and Netflix TechBlog websites in order to understand how all this non-English content is produced and broadcast. This information is not comprehensive, but according to Neira, Netflix is the platform that offers the most information, in an environment in which transparency is conspicuous by its absence: "The platforms are under increasing pressure to be more transparent, because a lack of information leads to an abuse of a dominant position." The researchers also interviewed workers at the company, and used other sources of specialized information.
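As a rough illustration of how such a share can be derived from the public Top 10 data, the sketch below aggregates weekly rankings and counts the proportion of non-English titles among the series with the most accumulated viewing hours. The file name and column names are assumptions for the example, not the actual schema of Netflix's published data, and this is not the authors' code.

```python
# Illustrative sketch: estimate the share of non-English-language series
# among the most-viewed titles from a hypothetical CSV export of weekly
# Top 10 data. File name and column names are assumed.
import csv
from collections import defaultdict

def non_english_share(path, top_n=100):
    hours_by_title = defaultdict(float)
    language_by_title = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["category"] != "TV":          # keep series only
                continue
            title = row["show_title"]
            hours_by_title[title] += float(row["weekly_hours_viewed"])
            language_by_title[title] = row["language"]
    # Rank series by accumulated viewing hours and keep the top_n
    top = sorted(hours_by_title, key=hours_by_title.get, reverse=True)[:top_n]
    non_english = sum(1 for t in top if language_by_title[t] != "English")
    return non_english / len(top) if top else 0.0

# Example: print(non_english_share("netflix_top10_weekly.csv"))
```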

 

Big data and meticulous initiatives

According to the authors of the article, Netflix has been able to boost the success of series such as Money Heist and Squid Game thanks, above all, to content indexing and monitoring users' preferences. "Netflix is not a media company. It's a tech company. Its management already started storing information and learning things back when it was a DVD rental company. When they launched their streaming business in 2007, they had a great deal of information, and they integrated all these technological aspects into their work," Neira said.

In specific terms, Netflix uses its ability to handle big data to determine and successfully classify all the content it offers, as well as its users' tastes. As a result, it offers recommendations aimed at convincing its viewers based on no less than eighty thousand microgenres "that expand people's points of contact with the content," she explained.

"If Netflix knows that you don't like political drama, but you do like empowered women, it can recommend you watch House of Cards for the second reason," she said by way of an example. Similarly, the platform can recommend non-English-language content to someone for many reasons, based on their viewing history and algorithmic calculations, regardless of where that content has been produced. "It's like creating revolving doors, in which each person connects with different aspects of the content. And that applies to both US and local content," she concluded.

Furthermore, the success of Netflix's glocal strategy is also due to a series of very meticulous initiatives, on both a global and local scale, to carry out projects, classify audiences and encourage people to talk about its content.

 

Strengths and weaknesses

"The strength of this glocal strategy is that it has removed the barriers that used to exist in the market and which made it very difficult to buy foreign products," said Neira, who provided the example of the company's production centre in Madrid, which has become the largest Netflix production hub in Europe, and has created many jobs.

However, she pointed out that employment by the platforms is creating job insecurity, and its sustainability is uncertain: "The filming takes place very quickly. We are producing hours and hours of content that has a very short shelf life. Content is increasingly short-lived, and that leaves an economic, personal and environmental footprint."

This type of audiovisual production and consumption also tends to homogenize content, and thereby diminishes the character of what we used to call European cinema. "We are producing a lot here in Spain, but ultimately, we are producing content here with distinctive features that aren't European in the strict sense of the word," she said.

Neira also highlighted what has been termed cultural conformism: "The active search threshold is lowered considerably. I always tell people that when they decide to watch something on Netflix, they shouldn't decide what they're going to watch in Netflix itself, but instead find out about things first, away from the platform."

In any event, these are not placid times for streaming companies. Netflix is the only one that is not losing money, but it has announced lower levels of investment for this year. Neira's prediction is that "the key to the future lies in looking at the past, trying to make the content more sustainable and longer-lasting." Meanwhile, December sees the premiere of Berlin, the long-awaited prequel to Money Heist.

This research by the UOC supports Sustainable Development Goal (SDG) 9: Industry, Innovation and Infrastructure.

 

Article reference:

Neira, Elena; Clares-Gavilán, Judith; Sánchez-Navarro, Jordi (2023). “Standing up to Hollywood: the Netflix glocal strategy for popularising non-English-language series worldwide”. Profesional de la información, v. 32, n. 4, e320409. https://doi.org/10.3145/epi.2023.jul.09

 

Cloud services without servers: what's behind it


Peer-Reviewed Publication

UNIVERSITY OF WÜRZBURG




In cloud computing, commercial providers make computing resources available on demand to their customers over the Internet. This service is partly offered "serverless", that is, without servers. How can that work? Computing resources without a server, isn't that like a restaurant without a kitchen?

"The term is misleading," says computer science Professor Samuel Kounev from Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany. Because even serverless cloud services don't get by without servers.

In classical cloud computing, for example, a web shop rents computing resources from a cloud provider in the form of virtual machines (VMs). However, the shop itself remains responsible for the management of "its" servers, that is, the VMs. It has to take care of security aspects as well as the avoidance of overload situations or the recovery from system failures.

The situation is different with serverless computing. Here, the cloud provider takes over complete responsibility for server management. Cloud users can no longer even access the servers; they remain hidden from them – hence the term "serverless".
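To make the contrast concrete: in a serverless ("function-as-a-service") offering, the customer typically deploys nothing more than a small handler function like the one sketched below, and the provider decides on which servers, and how many of them, it actually runs. This is a generic, illustrative Python handler in the style used by common FaaS platforms, not code from the article.

```python
# Minimal function-as-a-service handler (illustrative sketch, not from the
# article). The provider invokes this function on demand; the customer never
# sees or manages the server it runs on.
import json

def handler(event, context):
    # 'event' carries the request data and 'context' carries runtime metadata;
    # both are supplied by the platform at invocation time.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```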

Research article in the ACM magazine Communications of the ACM

"The basic idea of serverless computing has been around since the beginning of cloud computing. However, it has not become widely accepted," explains Samuel Kounev, who heads the JMU Chair of Computer Science II (Software Engineering). But a shift can currently be observed in the industry and in science, the focus is increasingly moving towards serverless computing.

A recent article in the Communications of the ACM magazine of the Association for Computing Machinery (ACM) deals with the history, status and potential of serverless computing. Among the authors are Samuel Kounev and Dr. Nikolas Herbst, who heads the JMU research group "Data Analytics Clouds". ACM has also produced a video with Professor Samuel Kounev to accompany the publication: https://vimeo.com/849237573

Experts define serverless computing inconsistently

The origins of the research article go back two years. In 2021, around 50 international experts in serverless computing met at a seminar at Schloss Dagstuhl – Leibniz Centre for Informatics.

"We discussed the most important developments and research questions there and found out that many of us define serverless computing differently and that contradictions sometimes arise," Kounev recounts. This is not unusual in science, he says, when a paradigm shift emerges.

To create clarity, a small group of researchers came together after the seminar to define serverless computing precisely. This same group has now published the article in the renowned CACM magazine; its authors also include Ian Foster, a prominent pioneer of cloud computing from the USA.

Two key principles defined

The team has defined two principles that characterise serverless computing. The first is "NoOps", which stands for "no operations". It means, as described above, that the technical server management, including the hardware and software layers, is entirely the responsibility of the cloud provider.

The second principle is "utilisation-based billing", which means that only the time during which the customer actively uses the allocated computing resources is billed. With classic cloud services, on the other hand, a fee is paid for the entire time during which the respective cloud resources were rented.
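A back-of-the-envelope comparison makes the difference tangible; the prices and workload figures below are invented purely for illustration and are not real tariffs.

```python
# Illustrative comparison of rental-based vs. utilisation-based billing.
# All prices and workload numbers are made up for the example.

rented_hours          = 24 * 30     # classic model: VM rented for a month
price_per_rented_hour = 0.10        # EUR per hour, whether used or idle

invocations           = 100_000     # serverless model: pay only per use
seconds_per_call      = 0.2         # actual execution time per invocation
price_per_cpu_second  = 0.00002     # EUR per second of execution

classic_cost    = rented_hours * price_per_rented_hour
serverless_cost = invocations * seconds_per_call * price_per_cpu_second

print(f"classic rental:    {classic_cost:.2f} EUR")     # 72.00 EUR
print(f"utilisation-based: {serverless_cost:.2f} EUR")  # 0.40 EUR
```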

Projects at the Chair of Samuel Kounev

At JMU, the computer scientists in Samuel Kounev's team are working, for example, on the elasticity of cloud services, especially the autoscaling of computing resources. The goal here is that the amount of computing resources allocated to a cloud application automatically adapts to increasing or decreasing demand over time.
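In its simplest form, such autoscaling can be thought of as a control loop that compares observed demand with a target utilisation and resizes the pool of instances accordingly. The sketch below is a generic illustration of that idea, not code from Kounev's group; the capacity and utilisation figures are assumptions.

```python
# Generic target-utilisation autoscaler (illustrative sketch only).
# Each instance is assumed to serve 'capacity_per_instance' requests per
# second; the controller keeps utilisation near the target by resizing.
import math

def desired_instances(request_rate, capacity_per_instance=100,
                      target_utilisation=0.7, min_instances=1):
    needed = request_rate / (capacity_per_instance * target_utilisation)
    return max(min_instances, math.ceil(needed))

# Demand rises and falls over time; the allocation follows it automatically.
for rate in [50, 400, 1200, 300, 0]:
    print(rate, "req/s ->", desired_instances(rate), "instance(s)")
```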

A long-term project that Kounev's team wants to realise in the coming years is a serverless cloud platform for large workflows in Earth observation. The focus is on climate research with satellite data. Scientists should be able to use the platform to study various global effects of climate change easily, quickly and efficiently. These include changes in forests, snow cover in mountains, or biodiversity.

In this field, Kounev's team is cooperating with the JMU Chair of Remote Sensing, the German Aerospace Center (DLR), the Leibniz Computing Centre of the Bavarian Academy of Sciences and Humanities and the Max Planck Institute for Behavioural Biology (Radolfzell / Konstanz), among others.


Alarming results from world-first study of two decades of global smoke pollution


2 billion people exposed to at least one day of potentially health-impacting wildfire smoke, study finds


Peer-Reviewed Publication

MONASH UNIVERSITY

Image: Professor Yuming Guo (Credit: Monash University)




The world’s first study of the increase in pollution from landscape fires across the globe over the past two decades reveals that over 2 billion people are exposed to at least one day of potentially health-impacting environmental hazard annually – a figure that has increased by 6.8 per cent in the last ten years.

The study highlights the severity and scale of the landscape fire-sourced air pollution, its increased impact on the world’s population and associated rise in public health risk. Exposure to fire-sourced air pollution has many adverse health impacts, including increased mortality and morbidity and a global worsening of cardiorespiratory conditions and mental health.

The study, published today (20 September) in Nature and led by Australian scientists, estimated global daily air pollution from all fires from 2000 to 2019. It found that 2.18 billion people were exposed to at least one day of substantial landscape fire air pollution each year, with each person in the world having, on average, 9.9 days of exposure per year – an increase of 2.1 per cent in the last decade. It also found that exposure levels in low-income countries were about four-fold higher than in high-income countries.

Led by Professors Yuming Guo and Shanshan Li, from Monash University’s School of Population Health and Preventive Medicine, the study also found that exposure levels of PM2.5 were particularly high in Central Africa, Southeast Asia, South America and Siberia. The study also looked at global landscape fire-sourced ozone, an important fire-related pollutant that had previously only been estimated for the United States.

In the study, landscape fires refer to any fires burning in natural and cultural landscapes, e.g. natural and planted forest, shrub, grass, pastures, agricultural lands and peri-urban areas, including both planned or controlled fires (e.g., prescribed burns, agricultural fires) and wildfires (defined as uncontrolled or unplanned fires burning in wildland vegetation).

The comprehensive assessment of the global population exposures to fire-sourced PM2.5 and ozone during 2000-2019 was calculated using a machine learning approach with inputs from chemical transport models, ground-based monitoring stations, and gridded weather data.
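The press release does not detail the model itself, but in broad strokes such an estimation can be framed as a regression from gridded predictors to station measurements, along the lines of the hedged sketch below. The feature set, the synthetic data and the choice of a random forest are illustrative assumptions, not the study's actual setup.

```python
# Hedged sketch of a machine-learning setup for estimating fire-sourced PM2.5
# (illustrative only; the study's actual features and model may differ).
# Predictors per grid cell and day: chemical-transport-model output plus
# weather fields; target: PM2.5 measured at ground-based monitoring stations.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.gamma(2.0, 5.0, n),   # modelled fire PM2.5 from a transport model
    rng.normal(288, 10, n),   # air temperature (K)
    rng.uniform(0, 15, n),    # wind speed (m/s)
    rng.uniform(0, 1, n),     # humidity / boundary-layer proxy
])
y = 0.8 * X[:, 0] + rng.normal(0, 2, n)   # synthetic "observed" station PM2.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 2))

# Once trained, such a model is applied to every grid cell and day to produce
# the global daily exposure surfaces described in the study.
```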

The recent pollution from the Canadian wildfires that spread smoke across North America highlighted the increase in severity and frequency of landscape fires due to climate change. According to Professor Guo, no study to date has looked at the long-range effects of this increase in landscape fires globally, and wildfires often impact remote areas where there are few or no air quality monitoring stations. In addition, in many low-income countries there are no air quality monitoring stations even in urban areas.

“The exposure to air pollution caused by landscape fire smoke travelling hundreds and sometimes even thousands of kilometres can affect much larger populations, and cause much larger public health risks,” he said.

“Mapping and tracking the population exposure to landscape fire-sourced air pollution are essential for monitoring and managing its health impacts, implementing targeted prevention and interventions, and strengthening arguments for mitigation of climate change.” 

  

Image: Average source fine particulate matter, 2010-2019 (Credit: Monash University)

 

Clinical trial of HIV vaccine begins in United States and South Africa


Novel vaccine includes NIH-funded technology in development since 2004


Business Announcement

NIH/NATIONAL INSTITUTE OF ALLERGY AND INFECTIOUS DISEASES

Image: Transmission electron micrograph of HIV-1 virus particles from infected H9 cells, produced in cell culture. The particles exhibit two stages of replication: the two “arcs” are immature particles budding from the plasma membrane of the cell, and the center spherical particle is a mature form in extracellular space. Image captured at the NIAID Integrated Research Facility (IRF) in Fort Detrick, Maryland. (Credit: NIAID)




WHAT:
A trial of a preventive HIV vaccine candidate has begun enrollment in the United States and South Africa. The Phase 1 trial will evaluate a novel vaccine known as VIR-1388 for its safety and ability to induce an HIV-specific immune response in people. The National Institute of Allergy and Infectious Diseases (NIAID), part of the National Institutes of Health, has provided scientific and financial support throughout the lifecycle of this HIV vaccine concept and is contributing funding for this study. 

VIR-1388 is designed to instruct the immune system to produce T cells that can recognize HIV and signal an immune response to prevent the virus from establishing chronic infection. VIR-1388 uses a cytomegalovirus (CMV) vector, meaning a weakened version of CMV delivers the HIV vaccine material to the immune system without causing disease in the study participants. CMV has been present in much of the global population for centuries. Most people living with CMV experience no symptoms and are unaware that they are living with the virus. CMV remains detectable in the body for life, which suggests it has the potential to deliver and then safely help the body retain HIV vaccine material for a long period, potentially overcoming the waning immunity observed with more short-lived vaccine vectors. 

NIAID has funded the discovery and development of the CMV vaccine vector since 2004 and is funding this trial with the Bill & Melinda Gates Foundation and Vir Biotechnology, based in San Francisco. The trial is sponsored by Vir and conducted through the NIAID-funded HIV Vaccine Trials Network (HVTN) as study HVTN 142. 

HVTN 142 is taking place at six sites in the United States and four in South Africa and will enroll 95 HIV-negative participants. Participants will be randomly assigned to one of four study arms: three arms will each receive a different dose of the vaccine, and one will receive a placebo. To optimize participant safety, this study will only enroll people already living with asymptomatic CMV. Initial results are expected in late 2024, and an optional long-term sub-study will continue to follow volunteers for up to three years after their first vaccine dose.

Additional information about the trial is available on ClinicalTrials.gov under study identifier NCT05854381.

WHO:
Carl Dieffenbach, Ph.D., Director of NIAID’s Division of AIDS, is available to discuss this research.

CONTACT:
To schedule interviews, please contact the NIAID News & Science Writing Branch, (301) 402-1663, NIAIDNews@niaid.nih.gov.


NIAID conducts and supports research—at NIH, throughout the United States, and worldwide—to study the causes of infectious and immune-mediated diseases, and to develop better means of preventing, diagnosing and treating these illnesses. News releases, fact sheets and other NIAID-related materials are available on the NIAID website.

About the National Institutes of Health (NIH): NIH, the nation's medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov/.

NIH...Turning Discovery Into Health®



Black holes eat faster than previously expected


New finding might explain why quasars flare and fade so quickly

Peer-Reviewed Publication

NORTHWESTERN UNIVERSITY

Image: A new study shows that, by dragging space-time, supermassive black holes can rip apart the violent whirlpools of debris (or accretion disks) that encircle them, resulting in an inner and outer subdisk. (Credit: Nick Kaaz/Northwestern University)




A new Northwestern University-led study is changing the way astrophysicists understand the eating habits of supermassive black holes.

While previous researchers have hypothesized that black holes eat slowly, new simulations indicate that black holes scarf food much faster than conventional understanding suggests.

The study will be published on Wednesday (Sept. 20) in The Astrophysical Journal.

According to new high-resolution 3D simulations, spinning black holes twist up the surrounding space-time, ultimately ripping apart the violent whirlpool of gas (or accretion disk) that encircles and feeds them. This results in the disk tearing into inner and outer subdisks. Black holes first devour the inner ring. Then, debris from the outer subdisk spills inward to refill the gap left behind by the wholly consumed inner ring, and the eating process repeats.

One cycle of the endlessly repeating eat-refill-eat process takes mere months — a shockingly fast timescale compared to the hundreds of years that researchers previously proposed.

This new finding could help explain the dramatic behavior of some of the brightest objects in the night sky, including quasars, which abruptly flare up and then vanish without explanation.

“Classical accretion disk theory predicts that the disk evolves slowly,” said Northwestern’s Nick Kaaz, who led the study. “But some quasars — which result from black holes eating gas from their accretion disks — appear to drastically change over time scales of months to years. This variation is so drastic. It looks like the inner part of the disk — where most of the light comes from — gets destroyed and then replenished. Classical accretion disk theory cannot explain this drastic variation. But the phenomena we see in our simulations potentially could explain this. The quick brightening and dimming are consistent with the inner regions of the disk being destroyed.”

Kaaz is a graduate student in astronomy at Northwestern’s Weinberg College of Arts and Sciences and a member of the Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA). Kaaz is advised by paper co-author Alexander Tchekhovskoy, an associate professor of physics and astronomy at Weinberg and a CIERA member.

Mistaken assumptions

Accretion disks surrounding black holes are physically complicated objects, making them incredibly difficult to model. Conventional theory has struggled to explain why these disks shine so brightly and then abruptly dim — sometimes to the point of disappearing completely.

Previous researchers have mistakenly assumed that accretion disks are relatively orderly. In these models, gas and particles swirl around the black hole — in the same plane as the black hole and in the same direction as the black hole’s spin. Then, over a time scale of hundreds to hundreds of thousands of years, gas particles gradually spiral into the black hole to feed it.

“For decades, people made a very big assumption that accretion disks were aligned with the black hole’s rotation,” Kaaz said. “But the gas that feeds these black holes doesn’t necessarily know which way the black hole is rotating, so why would they automatically be aligned? Changing the alignment drastically changes the picture.”

The researchers’ simulation, which is one of the highest-resolution simulations of accretion disks to date, indicates that the regions surrounding the black hole are much messier and more turbulent places than previously thought.

More like a gyroscope, less like a plate

Using Summit, one of the world’s largest supercomputers, located at Oak Ridge National Laboratory, the researchers carried out a 3D general relativistic magnetohydrodynamics (GRMHD) simulation of a thin, tilted accretion disk. While previous simulations were not powerful enough to include all the physics needed to construct a realistic black hole, the Northwestern-led model includes gas dynamics, magnetic fields and general relativity to assemble a more complete picture.

“Black holes are extreme general relativistic objects that affect space-time around them,” Kaaz said. “So, when they rotate, they drag the space around them like a giant carousel and force it to rotate as well — a phenomenon called ‘frame-dragging.’ This creates a really strong effect close to the black hole that becomes increasingly weaker farther away.”

Frame-dragging makes the entire disk wobble in circles, similar to how a gyroscope precesses. But the inner disk wants to wobble much more rapidly than the outer parts. This mismatch of forces warps the entire disk, causing gas from different parts of the disk to collide. The collisions create bright shocks that violently drive material closer and closer to the black hole.
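The strength of this radial dependence can be made explicit with the standard weak-field expression for Lense-Thirring (frame-dragging) precession, quoted here as general background rather than from the paper itself. For a black hole of mass $M$, dimensionless spin $a$ and angular momentum $J = aGM^{2}/c$, a ring of gas orbiting at radius $r$ precesses at roughly

\[
\Omega_{\mathrm{LT}} \approx \frac{2GJ}{c^{2}r^{3}},
\]

so a ring at half the radius of another precesses about $2^{3} = 8$ times faster, which is why the inner disk wants to wobble much more rapidly than the outer parts.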

As the warping becomes more severe, the innermost region of the accretion disk continues to wobble faster and faster until it breaks apart from the rest of the disk. Then, according to the new simulations, the subdisks start evolving independently from one another. Instead of smoothly moving together like a flat plate surrounding the black hole, the subdisks independently wobble at different speeds and angles like the wheels in a gyroscope.

“When the inner disk tears off, it will precess independently,” Kaaz said. “It precesses faster because it’s closer to the black hole and because it’s small, so it’s easier to move.”

‘Where the black hole wins’

According to the new simulation, the tearing region — where the inner and outer subdisks disconnect — is where the feeding frenzy truly begins. While friction tries to keep the disk together, the twisting of space-time by the spinning black hole wants to rip it apart.

“There is competition between the rotation of the black hole and the friction and pressure inside the disk,” Kaaz said. “The tearing region is where the black hole wins. The inner and outer disks collide into each other. The outer disk shaves off layers of the inner disk, pushing it inwards.”

Now the subdisks intersect at different angles. The outer disk pours material on top of the inner disk. This extra mass also pushes the inner disk toward the black hole, where it is devoured. Then, the black hole’s own gravity pulls gas from the outer region toward the now-empty inner region to refill it.

The quasar connection

Kaaz said these fast cycles of eat-refill-eat potentially explain so-called “changing-look” quasars. Quasars are extremely luminous objects that emit 1,000 times more energy than the entire Milky Way’s 200 billion to 400 billion stars. Changing-look quasars are even more extreme. They appear to turn on and off over the duration of months — a tiny amount of time for a typical quasar.

Although classical theory has posed assumptions for how quickly accretion disks evolve and change brightness, observations of changing-look quasars indicate that they actually evolve much, much faster.

“The inner region of an accretion disk, where most of the brightness comes from, can totally disappear — really quickly over months,” Kaaz said. “We basically see it go away entirely. The system stops being bright. Then, it brightens again and the process repeats. Conventional theory doesn’t have any way to explain why it disappears in the first place, and it doesn’t explain how it refills so quickly.”

Not only do the new simulations potentially explain quasars, they also could answer ongoing questions about the mysterious nature of black holes.

“How gas gets to a black hole to feed it is the central question in accretion-disk physics,” Kaaz said. “If you know how that happens, it will tell you how long the disk lasts, how bright it is and what the light should look like when we observe it with telescopes.”

The study, “Nozzle shocks, disk tearing and streamers drive rapid accretion in 3D GRMHD simulations of warped thin disks,” was supported by the U.S. Department of Energy and the National Science Foundation.

Image: This still from a simulation shows how a supermassive black hole's accretion disk can rip into two subdisks, which are misaligned in this image. (Credit: Nick Kaaz/Northwestern University)

 

Impact of Child Trust Funds in the UK ‘a missed opportunity’ – Aston University research for UK Savings Week


The research highlights the need to revisit how children can be helped to develop an effective savings habit

Peer-Reviewed Publication

ASTON UNIVERSITY




Researchers from Aston and Lincoln Universities have conducted a comprehensive study to evaluate the effectiveness of Child Trust Funds (CTFs) in the United Kingdom. 

Connected with UK Savings Week (18–24 September), the study highlights that developing children’s savings remains an area in need of further improvement if it is to play any part in improving the UK’s long-term financial resilience – a key part of the current UK Personal Financial Wellbeing strategy.

The findings from the Centre for Personal Financial Wellbeing at Aston University provide insights into the impact of this large scale, national savings scheme seeking to develop children's savings and the potential for asset-based welfare policies. 

Researchers say this was one of the biggest experiments of its type, exploring how a country could make a sea change in how the next generation is helped to develop its saving behaviour – a critical issue of financial resilience for all nations, and one that the Covid pandemic has exposed.

The Child Trust Funds programme, introduced in 2002 under the Labour government, aimed to generate a capital sum for UK children by the time they turned 18 from a pot of at least £250 given to every child at birth. This sum, together with any further savings contributed by parents and others, was to be invested so that it would grow into a fund the child could draw on at 18. The programme reached a significant milestone in late 2020 when the first cohort reached adulthood.
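As a purely illustrative calculation of how such a fund compounds (the assumed 5% annual return is not a figure from the study), the £250 endowment alone would roughly double by age 18, while modest regular top-ups would add far more:

\[
250 \times 1.05^{18} \approx \pounds 602,
\qquad
120 \times \frac{1.05^{18} - 1}{0.05} \approx \pounds 3{,}376,
\]

where the second term is the value at 18 of contributing an extra £120 a year (£10 a month) over the same period.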

The research shows that by 2022, £10bn had been saved in CTFs across more than six million accounts, £2bn of which was the original money put in by the UK Government. The programme was closed to children born after 2011, and no new Government money has been applied to these funds since then, significantly curtailing the programme’s impact.

The research explored whether there was evidence that the programme met its two primary goals: higher levels of savings for eligible children than would otherwise have been likely to occur and helping to instil an improved savings habit in both young people as fund recipients and their parents. 

Professor Andy Lymer, director of the Centre for Personal Financial Wellbeing at Aston University, said:

“While the Child Trust Fund has shown modest positive impacts on savings for eligible children, it would be challenging to claim that the initial exciting design objectives and potential have been achieved. This is therefore arguably a real missed opportunity to have a material effect on savings and savings-related behaviour in the next generation.

“We found a small increase in saved sums on average, potentially amounting to around £200 more than the average savings for CTF-ineligible siblings – although some got lots more than this from this programme, with average pot sizes by April 2022 being just over £2,000.

“The study, however, highlighted that a considerable portion of accounts remained untouched, with 27% of eligible accounts remaining unclaimed by August 2022 (the latest data available), despite a targeted national advertising campaign to remind eligible 18-year-olds that they have accounts that may have been forgotten about. This means that continued annual effort is going to be needed from the financial services sector managing these funds, and from the Government more generally, to make sure the funds are connected with their owners as they turn 18, so that they can have at least some positive impact on their recipients’ lives.”

Professor Steve McKay, distinguished professor in social research in the College of Social Science at the University of Lincoln and a co-author of the report, said:

“The results indicated there was a small positive effect on savings on average, but that the benefits were primarily associated with children from more affluent families, who were able to continue to add to the children’s funds, creating extra compounding growth.

“Furthermore, while net extra savings were created for most account holders compared with their ineligible siblings, the study shows that parents aren’t compensating those siblings through other forms of saving.

“There was very limited evidence of a meaningful impact on creating a savings habit overall, for children or parents. In fact, many parents had forgotten or lost track of the accounts over the years, and so their children had little idea these even existed.

“Recognising the importance of even modest savings for many families, particularly considering that a majority of children in the UK have no savings at all, the research calls for a revisit of the principles on which this programme was founded. It suggests we urgently need to revisit how we can effectively help children develop savings, and an effective savings habit, with and without government support - particularly those least likely to develop these via other means.”

You can read the full report here.