Saturday, December 21, 2024

 

One in five young Spaniards is an at-risk consumer of TikTok, spending more than two hours a day on the app, according to a study by UPF and the UOC



More girls (24.37%) than boys (15.45%) spend more than two hours a day on TikTok, above the recommended threshold to avoid risks to cognitive and emotional well-being.




Universitat Pompeu Fabra - Barcelona





Up to one in five young Spaniards spends more than two hours a day on TikTok, exceeding the recommended threshold for social network use beyond which the risk of suffering mental health problems may increase. The proportion of girls above this risk threshold (24.37%) is clearly higher than that of boys (15.45%). This is revealed by a study led by Pompeu Fabra University (UPF), in collaboration with the Open University of Catalonia (UOC), recently published open access in Humanities and Social Sciences Communications, a Nature Portfolio journal.

While previous studies had already analysed the impact of social media in general on young people’s mental health, this research pioneers the analysis of TikTok’s specific effects on adolescents’ digital well-being. TikTok differs from other networks (such as Instagram, X or Facebook) in that it promotes more passive consumption of videos and less interaction between users. It is also the most popular social network among young people worldwide, and especially in Spain, the country with the highest percentage of teenage users.

For this study, 1,043 young people from all over Spain, aged 12 to 18, were surveyed to examine the amount of time they spend on TikTok and the types of content they view. Their perception of their own digital well-being was also assessed. Digital well-being is understood as the state of optimal balance between connection time and cognitive and emotional well-being. The study measured three variables related to digital well-being: young people’s ability to set their own connection time limits; their ability to generate social and community connections; and their capacity for emotional resilience.

The principal investigator of the research is Mònika Jiménez, of the Communication, Advertising and Society (CAS) research group of the UPF Department of Communication, who led the research together with Mireia Montaña, from the Learning, Media and Entertainment research group (GAME) of Information and Communication Sciences Studies at the UOC. Clara Virós, lead author of the published article, also participated on behalf of UPF.

More than half of young people spend more than an hour a day on TikTok

Regarding the time spent on TikTok, the research finds that more than half of the young people surveyed (53.19%) spend more than an hour a day on the app, 35.28% more than an hour and a half, and 20.22% more than two hours, with a significantly higher percentage of girls in the latter case (24.37%). Several previous studies have found that using social networks for more than two hours a day is associated with low self-esteem regarding body image, a negative perception of one’s own mental health, and an increased risk of psychological stress or suicidal thoughts.

The time spent by young people on TikTok contrasts with the self-perception of their level of digital well-being, which is generally positive. On a scale of 1 to 5, they rate their ability to set limits on consumption time with a score of 3.22; their emotional resilience in view of the content watched, 3.31; and their ability to generate social and community connections, 3.64.

The longer young people spend on TikTok, the less able they are to set consumption limits on the app

However, Mònika Jiménez (UPF) warns: “the longer young people spend on TikTok, the less able they are to set connection time limits on the app”. Young people who spend more than two hours a day on TikTok rate their ability to set limits at 2.93 out of 5, while those who spend between half an hour and an hour on it give 3.33. Those who spend between 11 minutes and half an hour (3.47) or less than 10 minutes (3.53) are the most able to limit their consumption time.
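The dose-response pattern in these self-ratings can be laid out in a short sketch (the bracket labels are informal paraphrases; the scores are the mean values reported above):

```python
# Mean self-rated ability to set TikTok time limits (scale 1-5),
# grouped by daily time spent on the app, as reported in the study.
limit_scores = {
    "under 10 min": 3.53,
    "11-30 min": 3.47,
    "30-60 min": 3.33,
    "over 2 h": 2.93,
}

# The reported means decline monotonically as time on the app grows.
values = list(limit_scores.values())
assert all(earlier > later for earlier, later in zip(values, values[1:]))
print("self-rated limit-setting ability falls as usage rises")
```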

Content viewed on TikTok by boys and girls reveals the persistence of traditional gender roles

The research also reveals significant differences in the types of content consumed on TikTok by teenage boys and girls, as Mireia Montaña (UOC) explains: “boys are predominantly interested in video game and professional sports content, while girls tend to consume more beauty and fashion content, thus showing a persistence of traditional gender roles in digital consumption preferences”.

Regarding the type of content consumed, the adolescents surveyed rated from 1 (never) to 5 (always) the frequency with which they watch videos of different types. Beyond comedy and music videos, which are among the three most watched by both sexes, the content most consumed by boys and girls differs. For girls, the five most watched types are: comedy (3.24), music (3.22), fashion (3.02), beauty (3.00) and dance or lip-sync videos (2.88). For boys, they are comedy (3.50), video games (3.19), music (3.06), professional sports (3.01) and news from influencers or streamers (2.92).

Young people’s digital well-being is not just a question of parental control: educational programmes and periodic audits of TikTok algorithms are needed

In light of this situation, the study argues that measures to improve young people’s digital well-being cannot be limited to parental control of the app or digital disconnection. Educational programmes are also needed, with a gender perspective, to promote healthy digital habits among young people and to give their families better support strategies. The study warns that indiscriminate restrictive measures do not work; instead, each young person should be encouraged to maintain a moderate level of social network consumption, in keeping with their interests and needs. It also insists that regular audits of the algorithms of networks like TikTok should be considered to prevent their potentially addictive effects.

This study is part of the project “Spanish adolescents as recipients and creators of content on mental health in social networks. Discourse, incidence and the proposal of tools for digital literacy on psychological disorders and their stigma”, funded by the Spanish Ministry of Science, Innovation and Universities.

Reference article:

Virós-Martín, C., Montaña-Blasco, M. & Jiménez-Morales, M. Can’t stop scrolling! Adolescents’ patterns of TikTok use and digital well-being self-perception. Humanit Soc Sci Commun 11, 1444 (2024). https://doi.org/10.1057/s41599-024-03984-5

 

Dripstones offer insights into climate dynamics in Europe



Geoscientists study stalagmites in Romanian cave to reconstruct regional precipitation patterns



Heidelberg University




Investigations into precipitation patterns in eastern Central Europe since the end of the last ice age, conducted by an international research team led by Dr Sophie Warken of Heidelberg University, have shown that dynamic processes in atmospheric circulation, such as the North Atlantic jet stream, influence regional changes in precipitation. The researchers analyzed dripstones from the Cloşani Cave in Romania, which act as a natural climate archive that allows conclusions to be drawn about precipitation variability over a period of approximately 20,000 years. According to Dr Warken, the new findings on the dynamics of the climate in Europe could contribute to improving current climate models and the ability to more accurately predict the likelihood of extreme weather events.

Of particular importance for the regional weather and precipitation patterns in Europe’s mid-latitudes is the North Atlantic jet stream, an atmospheric air flow that crosses the North Atlantic in a southwest-to-northeast direction carrying precipitation to Europe. In the past, climatic changes did affect the strength and trajectory of the jet stream – that much is known. But as Dr Warken explains, our understanding of how climate-induced fluctuations in the jet stream influenced local and regional precipitation patterns in Europe is limited.

Natural climate archives, like the dripstones in the Cloşani Cave in Romania, can provide information on the climate dynamics of bygone ages. The lime deposits, also known as speleothems, form from precipitation that seeps into the cave’s interior over several thousands of years. Geochemical investigations allow conclusions to be drawn about the chronology of the deposits and thus past environmental conditions and precipitation amounts. The current study focused on three stalagmites that contain information on the hydroclimatic conditions in eastern Central Europe over the past 20,000 years.

The results show how the trajectory of the North Atlantic jet stream changed due to the warming and melting of the Northern Hemisphere ice sheets – a process that lasted until about 5,000 years ago. As a result, precipitation in the late stages of the last Ice Age, about 20,000 years ago, and in the early to mid-Holocene – the current interglacial epoch that followed the last Ice Age – was 20 to 30 percent higher than it is today. It also turns out that precipitation variability in the region over comparatively short periods of centuries or even just decades fluctuated irrespective of long-term temperature developments in the North Atlantic region.

“Our research shows that dynamic processes in particular, such as changes in wind patterns and atmospheric currents like the jet stream, influence the precipitation and weather patterns in Central Europe,” stresses Dr Warken. This helps fill in a research gap, she explains, because former reconstructions mainly refer to thermodynamic processes, i.e., the warming of the atmosphere, and thereby directly connect a rise in temperature to an increase in precipitation. These reconstructions are often based on climate models still fraught with uncertainties in simulating local and regional precipitation patterns.

“Climate change is already leading to more frequent and intense precipitation events; based on current prognoses, the number of extreme weather events and heavy rainfall in several regions in Europe will continue to rise,” states Dr Warken. A better understanding of the underlying dynamic processes is key to more accurately predict future precipitation patterns and the likelihood of extreme weather events. Against this backdrop, the current results from the Cloşani Cave can help improve the accuracy of climate models and prognoses, adds the geoscientist, who together with her research group at the Institutes of Earth Sciences and Environmental Physics of Heidelberg University is reconstructing the climate of past millennia.

The research was carried out in collaboration with scientists from the Universities of Mainz and Innsbruck (Austria). Also involved were other institutions in Germany and Romania. The results were published in the journal “Communications Earth & Environment”.

 

ESA and NASA satellites deliver first joint picture of Greenland Ice Sheet melting



Academics from Northumbria University are part of an international research team which has used data from satellites to track changes in the thickness of the Greenland Ice Sheet.



Peer-Reviewed Publication

Northumbria University

Video: Greenland Ice Sheet Elevation Change from CryoSat and ICESat-2 – animation showing where the Greenland Ice Sheet is thinning, using data from two satellites. Credit: Centre for Polar Observation and Modelling, Northumbria University.






Global warming is causing the Ice Sheet to melt and flow more rapidly, raising sea levels and disturbing weather patterns across our planet.

Because of this, precise measurements of its changing shape are of critical importance for tracking and adapting to the effects of climate warming.

Scientists have now delivered the first measurements of Greenland Ice Sheet thickness change using CryoSat-2 and ICESat-2 – the ESA and NASA ice satellite missions.

Both satellites carry altimeters as their primary sensor, but they make use of different technologies to collect their measurements.

CryoSat-2 carries a radar system to determine the Earth’s surface height, while ICESat-2 has a laser system for the same task.

Although radar signals can pass through clouds, they also penetrate into the ice sheet surface and have to be adjusted for this effect.

Laser signals, on the other hand, reflect from the actual surface, but they cannot operate when clouds are present.

The missions are therefore highly complementary, and combining their measurements has been a holy grail for polar science.

A new study from scientists at the UK Centre for Polar Observation and Modelling (CPOM), based at Northumbria University, and published in Geophysical Research Letters shows that CryoSat-2 and ICESat-2 measurements of Greenland Ice Sheet elevation change agree to within 3%.

This confirms that the satellites can be combined to produce a more reliable estimate of ice loss than either could achieve alone. It also means that if one mission were to fail, the other could be relied upon to maintain our record of polar ice change.

Between 2010 and 2023, the Greenland Ice Sheet thinned by 1.2 metres on average. However, thinning across the ice sheet’s margin (the ablation zone) was over five times larger, amounting to 6.4 metres on average.

The most extreme thinning occurred at the ice sheet’s outlet glaciers, many of which are speeding up.

At Sermeq Kujalleq in west central Greenland (also known as Jakobshavn Isbræ), peak thinning was 67 metres, and at Zachariae Isstrøm in the northeast peak thinning was 75 metres.

Altogether, the ice sheet shrank by 2,347 cubic kilometres across the 13-year survey period – enough to fill Africa’s Lake Victoria.
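As a rough cross-check, the reported volume loss can be converted into a global sea-level contribution. The constants below (ice density, fresh-water density, global ocean area) are standard textbook values rather than figures from the article, so treat this as an order-of-magnitude sketch:

```python
# Back-of-envelope sea-level equivalent of the reported 2010-2023 ice loss.
# Assumed constants (not from the article): ice density ~917 kg/m^3,
# fresh-water density 1000 kg/m^3, global ocean area ~3.62e8 km^2.
ICE_DENSITY = 917.0        # kg/m^3
WATER_DENSITY = 1000.0     # kg/m^3
OCEAN_AREA_KM2 = 3.62e8    # km^2

ice_volume_km3 = 2347.0    # volume lost over the survey period, from the study

# Convert the ice volume to an equivalent water volume, then spread it
# evenly over the ocean surface; 1 km = 1e6 mm.
water_volume_km3 = ice_volume_km3 * ICE_DENSITY / WATER_DENSITY
sea_level_mm = water_volume_km3 / OCEAN_AREA_KM2 * 1e6
print(f"~{sea_level_mm:.1f} mm of global sea-level rise")  # ~5.9 mm
```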

The biggest changes occurred in 2012 and 2019 when summer temperatures were extremely hot and the ice sheet lost more than 400 cubic kilometres of its volume each year.

Greenland’s ice melting also affects global ocean circulation and disturbs weather patterns. These changes have far-reaching impacts on ecosystems and communities worldwide.

The availability of accurate, up-to-date data on ice sheet changes will be critical in helping us to prepare for and adapt to the impacts of climate change.

Lead author and CPOM researcher Nitin Ravinder said: “We are very excited to have discovered that CryoSat-2 and ICESat-2 are in such close agreement.

“Their complementary nature provides a strong motivation to combine the data sets to produce improved estimates of ice sheet volume and mass changes.

“As ice sheet mass loss is a key contributor to global sea level rise, this is incredibly useful for the scientific community and policymakers.”

The study made use of four years of measurements from both missions, including those collected during the Cryo2ice campaign, a pioneering ESA-NASA partnership initiated in 2020.

By adjusting CryoSat-2’s orbit to synchronise with ICESat-2, ESA enabled the near-simultaneous collection of radar and laser data over the same regions.

This alignment allows scientists to measure snow depth from space, offering unprecedented accuracy in tracking sea and land ice thickness.

Tommaso Parrinello, CryoSat Mission Manager at ESA, expressed optimism about the campaign’s impact:

“CryoSat has provided an invaluable platform for understanding our planet’s ice coverage over the past 14 years, but by aligning our data with ICESat-2, we’ve opened new avenues for precision and insight.

“This collaboration represents an exciting step forward, not just in terms of technology but in how we can better serve scientists and policymakers who rely on our data to understand and mitigate climate impacts.”

Thorsten Markus, project scientist for the ICESat-2 mission at NASA, said: “It is great to see that the data from ‘sister missions’ are providing a consistent picture of the changes going on in Greenland.

“Understanding the similarities and differences between radar and lidar ice sheet height measurements allows us to fully exploit the complementary nature of those satellite missions.

“Studies like this are critical to put a comprehensive time series of the ICESat, CryoSat-2, ICESat-2, and, in the future, CRISTAL missions together.”

ESA’s CryoSat-2 continues to be instrumental in our understanding of climate related changes in polar ice, working alongside NASA’s ICESat-2 to provide robust, accurate data on ice sheet changes.

Together, these missions represent a significant step forward in monitoring polar ice loss and preparing for its global consequences.

CPOM is a partnership of six universities and the British Antarctic Survey (BAS), based at Northumbria University, primarily funded by the Natural Environment Research Council (NERC) to provide national capability in observing and modelling the processes that occur in the polar regions of the Earth.

CPOM uses satellite observations to monitor change in the Polar regions and numerical models to better predict how their ice and oceans might evolve in the future.

By providing long-term capabilities to the scientific community and leading international assessments, CPOM helps global policymakers plan for the effects of climate change and sea level rise.

  

Image: Greenland Ice Sheet. Credit: Prof Andrew Shepherd.

 

AI-driven approach reveals hidden hazards of chemical mixtures in rivers




University of Birmingham





Artificial intelligence can provide critical insights into how complex mixtures of chemicals in rivers affect aquatic life – paving the way for better environmental protection. 

A new approach, developed by researchers at the University of Birmingham, demonstrates how advanced artificial intelligence (AI) methods can help identify potentially harmful chemical substances in rivers by monitoring their effects on tiny water fleas (Daphnia).  

The team worked with scientists at the Research Centre for Eco-Environmental Sciences (RCEES) in China and the Helmholtz Centre for Environmental Research (UFZ) in Germany to analyse water samples from the Chaobai River system near Beijing. This river system receives chemical pollutants from a number of different sources, including agricultural, domestic and industrial.

Professor John Colbourne is the director of the University of Birmingham’s Centre for Environmental Research and Justice and one of the senior authors on the paper. He expressed optimism that, by building upon these early findings, such technology can one day be deployed to routinely monitor water for toxic substances that would otherwise be undetected.  

He said: “There is a vast array of chemicals in the environment. Water safety cannot be assessed one substance at a time. Now we have the means to monitor the totality of chemicals in sampled water from the environment to uncover what unknown substances act together to produce toxicity to animals, including humans.”   

The results, published in Environmental Science and Technology, reveal that certain mixtures of chemicals can work together to affect important biological processes in aquatic organisms, measured through the organisms’ gene activity. In combination, these chemicals create environmental hazards that are potentially greater than those posed by each chemical individually.

The research team used water fleas (Daphnia) as test organisms in the study because these tiny crustaceans are highly sensitive to water quality changes and share many genes with other species, making them excellent indicators of potential environmental hazards. 

"Our innovative approach leverages Daphnia as the sentinel species to uncover potential toxic substances in the environment," explains Dr Xiaojing Li, of the University of Birmingham (UoB) and the lead author of this study. "By using AI methods, we can identify which subsets of chemicals might be particularly harmful to aquatic life, even at low concentrations that wouldn't normally raise concerns." 

Dr Jiarui Zhou, also at the University of Birmingham and co-first author of the paper, who led the development of the AI algorithms, said: “Our approach demonstrates how advanced computational methods can help solve pressing environmental challenges. By analysing vast amounts of biological and chemical data simultaneously, we can better understand and predict environmental risks." 

Professor Luisa Orsini, another senior author of the study, added: “The study's key innovation lies in our data-driven, unbiased approach to uncovering how environmentally relevant concentrations of chemical mixtures can cause harm. This challenges conventional ecotoxicology and paves the way to regulatory adoption of the sentinel species Daphnia, alongside new approach methodologies.” 

Dr Timothy Williams of the University of Birmingham and co-author of the paper also noted that: “Typically, aquatic toxicology studies either use a high concentration of an individual chemical to determine detailed biological responses or only determine apical effects like mortality and altered reproduction after exposure to an environmental sample. However, this study breaks new ground by allowing us to identify key classes of chemicals that affect living organisms within a genuine environmental mixture at relatively low concentration while simultaneously characterising the biomolecular changes elicited.” 

The findings could help improve environmental protection by: 

    •    Identifying previously unknown chemical combinations that pose risks to aquatic life 

    •    Enabling more comprehensive environmental monitoring 

    •    Supporting better-informed regulations for chemical discharge into waterways 

This research was funded by the Royal Society International Collaboration Award, the European Union's Horizon 2020 research and innovation programme, and the Natural Environment Research Council Innovation People programme.

ENDS

 

Modern AI systems have achieved Turing's vision, but not exactly how he hoped



Intelligent Computing
Image: Comparison of Turing’s original test with modern Turing-like AI evaluation. On the left, Turing’s original test involves a human interrogator (C) trying to identify a machine (A) that imitates a human assistant (B). On the right, the modern Turing-like test replaces the human interrogator with a machine (C) that rigorously evaluates the abilities of another AI system (A), supported by a knowledge graph (B). In both scenarios, the gray-colored players challenge the white-colored machine. Credit: Bernardo Gonçalves.




A recent perspective published Nov. 13 in Intelligent Computing, a Science Partner Journal, asserts that today's artificial intelligence systems have finally realized Alan Turing's vision from over 70 years ago: machines that can genuinely learn from experience and engage in human-like conversation. Authored by Bernardo Gonçalves from the University of São Paulo and University of Cambridge, the paper also sheds light on how current energy-hungry transformer-based systems contrast with Turing's prophecy of machines that would develop intelligence naturally, like human children.

Gonçalves' paper points out that transformers, the foundation of modern generative AI systems, have provided what Turing considered "adequate proof" of machine intelligence. These systems, based on "attention mechanisms" and vast-scale learning, can now perform tasks once exclusive to human intellect, such as generating coherent text, solving complex problems, and even discussing abstract ideas.

"Without resorting to preprogramming or special tricks, their intelligence grows as they learn from experience, and to ordinary people, they can appear human-like in conversation," writes Gonçalves. "This means that they can pass the Turing test and that we are now living in one of many possible Turing futures where machines can pass for what they are not."

This achievement traces back to Turing's 1950 concept of the "imitation game," in which a machine would attempt to mimic a human in a remote conversation, deceiving a non-expert judge. The test became a cornerstone of artificial intelligence research, with early AI pioneers John McCarthy and Claude Shannon considering it the "Turing definition of thinking" and Turing’s "strong criterion." Popular culture, too, undeniably reflects Turing’s influence: the HAL-9000 computer in the Stanley Kubrick film 2001: A Space Odyssey famously passed the Turing test with ease.

However, the paper underscores that Turing’s ultimate goal was not simply to create machines that could trick humans into thinking they were intelligent. Instead, he envisioned "child machines" modeled on the natural development of the human brain—systems that would grow and learn over time, ultimately becoming powerful enough to have a meaningful impact on society and the natural world.

The paper highlights concerns about current AI development. While Turing advocated for energy-efficient systems inspired by the natural development of the human brain, today's AI systems consume massive amounts of computing power, raising sustainability concerns. Additionally, the paper draws attention to Turing's ahead-of-his-time societal warnings. He cautioned that automation should affect all levels of society equally, not just displace lower-wage workers while benefiting only a small group of technology owners—an issue that resonates strongly with current debates about AI's impact on employment and social inequality.

Looking ahead, the paper calls for Turing-like AI testing that would introduce machine adversaries and statistical protocols to address emerging challenges such as data contamination and poisoning. These more rigorous evaluation methods will ensure AI systems are tested in ways that reflect real-world complexities, aligning with Turing’s vision of sustainable and ethically guided machine intelligence.

 

Construction materials and household items are a part of a long-term carbon sink called the “technosphere”




Cell Press




We know a lot about how much fossil-derived carbon is emitted to the atmosphere but less about how much is stored in human-made products. In a study publishing December 20 in the Cell Press journal Cell Reports Sustainability, ecological economists estimate that each year, humans add around 400 million tons of fossil carbon to long-lasting products such as plastics, buildings, and human infrastructure. Although these products could be considered a “carbon sink,” proper waste management is essential to prevent them from becoming environmental hazards.

“We have accumulated more carbon in human-made stuff on the planet than there is carbon in the natural world, but we completely overlook it, and those stocks get bigger and bigger,” says ecological economist and senior author Klaus Hubacek (@KlausHubacek) of the University of Groningen.  “The message is to look at stocks rather than just flows.”

Little is known about the stocks of fossil carbon in the “technosphere”—the sum of all human-made artifacts, both in use and discarded. To estimate these stocks, and how they change from year to year, the researchers used publicly available data on the material inputs and outputs from different economic sectors globally for 2011 (the only year for which such material data exist at the global level).

Then, they calculated the amount of carbon flowing in and out of different sectors by using the average carbon content of different products—for example, plastics are estimated to contain 74% fossil carbon on average. The analysis considered not only final products, such as durable plastics and bitumen, but also fossil carbon-based feedstocks that are used as intermediate products in different industries.

They found that in 2011, 9% of extracted fossil carbon accumulated in long-lasting products within the technosphere – if this same amount of carbon had been emitted as CO2, it would almost have equalled the EU’s emissions that year (3.7 Gt vs. 3.8 Gt of emitted CO2). Construction of buildings and infrastructure accounted for the largest accumulation of fossil carbon (34%). By product type, rubber and plastic products accounted for 30% of the accumulated fossil carbon, followed by bitumen (24%; a product used in roads and roofing) and machinery and equipment (16%).
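The CO2-equivalent figure follows from the standard conversion between a mass of carbon and the mass of CO2 it would form on combustion, via the molar-mass ratio 44/12. A minimal sketch (the 1.0 Gt carbon input is an illustrative assumption consistent with the reported 3.7 Gt CO2-equivalent, not a figure quoted in the article):

```python
# CO2 has a molar mass of ~44 g/mol and carbon ~12 g/mol, so each ton
# of carbon corresponds to 44/12 ≈ 3.67 tons of CO2 when fully oxidized.
CO2_PER_TON_OF_C = 44.0 / 12.0

carbon_accumulated_gt = 1.0  # Gt of carbon (illustrative assumption)
co2_equivalent_gt = carbon_accumulated_gt * CO2_PER_TON_OF_C
print(f"{co2_equivalent_gt:.1f} Gt CO2-equivalent")  # 3.7 Gt CO2-equivalent
```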

Next, the team extrapolated their 2011 findings, using monetary data, to estimate how much fossil carbon flowed into the technosphere between 1995 and 2019. Overall, they estimated that 8.4 billion tons of fossil carbon were added to the technosphere over that period, equivalent to around 93% of global CO2 emissions in 2019. The amount of carbon entering the technosphere increased every year from 1995 to 2019.

Many of these fossil carbon-based products end up in landfills or as litter and take decades to centuries to degrade. Based on the average lifetimes of buildings, infrastructure, and other products, the researchers estimate that 3.7 billion tons of fossil carbon were disposed of during that period: 1.2 billion tons went to landfills, 1.2 billion tons were incinerated, 1.1 billion tons were recycled, and the remainder ended up as litter.
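The litter share is simply the remainder of the disposal budget; a quick sketch of the arithmetic (figures in billions of tons, taken from the paragraph above):

```python
# Disposal fates of fossil carbon, 1995-2019, in billions of tons.
total_disposed = 3.7
landfill = 1.2
incinerated = 1.2
recycled = 1.1

# The litter share is whatever the three named fates do not account for.
litter = total_disposed - (landfill + incinerated + recycled)
print(f"~{litter:.1f} billion tons ended up as litter")  # ~0.2 billion tons
```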

“On the one hand, you can consider it as a form of carbon sequestration if this fossil carbon ends up sequestered in landfill, but on the other hand, it poses an environmental hazard, and if you burn it, you increase carbon emissions,” says coauthor and ecological economist Franco Ruzzenenti of the University of Groningen.

Increasing product lifetime and recycling rates are two ways to reduce the amount of fossil carbon entering waste streams, the researchers say. They also emphasize the importance of enacting policies to minimize the discharge of waste from landfills.

Looking ahead, the team plans to conduct a similar analysis of biogenic carbon (i.e., carbon derived from plant materials).

“For the next step, we plan to investigate the long-term potential of biogenic carbon sequestration in durables,” says first author Kaan Hidiroglu of the University of Groningen. “This will allow us to assess whether diversifying carbon sequestration strategies, such as relying on biogenic carbon in durables such as wood materials for construction, could be a viable option.”

###

Cell Reports Sustainability, Hidiroglu et al., “The extent and fate of fossil carbon accumulation in our technosphere” https://cell.com/cell-reports-sustainability/fulltext/S2949-7906(24)00426-9

Cell Reports Sustainability (@CellRepSustain), published by Cell Press, is a monthly gold open access journal that publishes high-quality research and discussion that contribute to understanding and responding to environmental, social-ecological, technological, and energy- and health-related challenges. Visit https://www.cell.com/cell-reports-sustainability/home. To receive Cell Press media alerts, contact press@cell.com.

 

First demonstration of quantum teleportation over busy Internet cables



Advance opens door for lightning-fast quantum applications without specialized infrastructure



Northwestern University




Northwestern University engineers are the first to successfully demonstrate quantum teleportation over a fiberoptic cable already carrying Internet traffic.

The discovery introduces the new possibility of combining quantum communication with existing Internet cables — greatly simplifying the infrastructure required for distributed quantum sensing or computing applications.

The study will be published on Friday (Dec. 20) in the journal Optica.

“This is incredibly exciting because nobody thought it was possible,” said Northwestern’s Prem Kumar, who led the study. “Our work shows a path towards next-generation quantum and classical networks sharing a unified fiberoptic infrastructure. Basically, it opens the door to pushing quantum communications to the next level.”

An expert in quantum communication, Kumar is a professor of electrical and computer engineering at Northwestern’s McCormick School of Engineering, where he directs the Center for Photonic Communication and Computing.

Only limited by the speed of light, quantum teleportation could make communications nearly instantaneous. The process works by harnessing quantum entanglement, a technique in which two particles are linked, regardless of the distance between them. Instead of particles physically traveling to deliver information, entangled particles exchange information over great distances — without physically carrying it.

“In optical communications, all signals are converted to light,” Kumar explained. “While conventional signals for classical communications typically comprise millions of particles of light, quantum information uses single photons.”

Before Kumar’s new study, conventional wisdom suggested that individual photons would drown in cables filled with the millions of light particles carrying classical communications. It would be like a flimsy bicycle trying to navigate through a crowded tunnel of speeding heavy-duty trucks.

Kumar and his team, however, found a way to help the delicate photons steer clear of the busy traffic. After conducting in-depth studies of how light scatters within fiberoptic cables, the researchers found a less crowded wavelength of light to place their photons. Then, they added special filters to reduce noise from regular Internet traffic.

“We carefully studied how light is scattered and placed our photons at a judicious point where that scattering mechanism is minimized,” Kumar said. “We found we could perform quantum communication without interference from the classical channels that are simultaneously present.”

To test the new method, Kumar and his team set up a 30-kilometer-long fiberoptic cable with a photon at either end. Then, they simultaneously sent quantum information and regular Internet traffic through it. Finally, they executed the teleportation protocol by making quantum measurements at the midpoint and measured the quality of the quantum information at the receiving end. The researchers found the quantum information was successfully transmitted – even with busy Internet traffic whizzing by.

Next, Kumar plans to extend the experiments over longer distances. He also plans to use two pairs of entangled photons — rather than one pair — to demonstrate entanglement swapping, another important milestone leading to distributed quantum applications. Finally, his team is exploring the possibility of carrying out experiments over real-world inground optical cables rather than on spools in the lab. But, even with more work to do, Kumar is optimistic.

“Quantum teleportation has the ability to provide quantum connectivity securely between geographically distant nodes,” Kumar said. “But many people have long assumed that nobody would build specialized infrastructure to send particles of light. If we choose the wavelengths properly, we won’t have to build new infrastructure. Classical communications and quantum communications can coexist.”

The study, “Quantum teleportation coexisting with classical communications in optical fiber,” was supported by the U.S. Department of Energy (grant number DE-AC02-07CH11359).