Tuesday, May 27, 2025

 

Is the ocean getting darker?


New research has found that over the past 20 years, 21% of the global ocean experienced a reduction in the depth of its lit zones, which are home to 90% of all marine life



University of Plymouth

Shifts in the global photic zones 

Image: A world map showing changes in global photic zones between 2003 and 2022. Reds indicate regions where the oceans are getting darker, while blues indicate regions where oceans are getting lighter and white indicates regions where there was no statistically significant change over the period.

Credit: University of Plymouth





More than one-fifth of the global ocean – an area spanning more than 75 million sq km – has experienced ocean darkening over the past two decades, according to new research.

Ocean darkening occurs when changes in the optical properties of the ocean reduce the depth of its photic zones, home to 90% of all marine life and places where sunlight and moonlight drive ecological interactions.

For the new study, published in Global Change Biology, researchers used a combination of satellite data and numerical modelling to analyse annual changes in the depth of photic zones all over the planet.

They found that between 2003 and 2022, 21% of the global ocean – including large expanses of both coastal regions and the open ocean – had become darker.

In addition, more than 9% of the ocean – an area of more than 32 million sq km, similar in size to the continent of Africa – saw photic zone depths reduced by more than 50 metres, while 2.6% saw reductions of more than 100 metres.

However, the picture is not solely one of a darkening ocean: around 10% of the ocean – more than 37 million sq km – became lighter over the past 20 years.

While the precise implications of the changes are not wholly clear, the researchers say they could affect huge numbers of the planet’s marine species and the ecosystem services provided by the ocean as a whole.

The study was conducted by researchers from the University of Plymouth and Plymouth Marine Laboratory, who have spent more than a decade examining the impact of artificial light at night (ALAN) on the world’s coasts and oceans.

They say artificial light at night is not directly connected to ocean darkening, however. Near the coasts, the changes are likely the result of a combination of nutrient, organic material and sediment loading, driven by factors such as agricultural runoff and increased rainfall.

In the open ocean, they believe the darkening is driven by factors such as changes in algal bloom dynamics and shifts in sea surface temperature, which have reduced light penetration into surface waters.

Dr Thomas Davies, Associate Professor of Marine Conservation at the University of Plymouth, said: “There has been research showing how the surface of the ocean has changed colour over the last 20 years, potentially as a result of changes in plankton communities. But our results provide evidence that such changes cause widespread darkening that reduces the amount of ocean available for animals that rely on the sun and the moon for their survival and reproduction. We also rely on the ocean and its photic zones for the air we breathe, the fish we eat, our ability to fight climate change, and for the general health and wellbeing of the planet. Taking all of that into account, our findings represent genuine cause for concern.”

Professor Tim Smyth, Head of Science for Marine Biogeochemistry and Observations at the Plymouth Marine Laboratory, added: “The ocean is far more dynamic than it is often given credit for. For example, we know the light levels within the water column vary massively over any 24-hour period, and animals whose behaviour is directly influenced by light are far more sensitive to its processes and change. If the photic zone is reducing by around 50m in large swathes of the ocean, animals that need light will be forced closer to the surface where they will have to compete for food and the other resources they need. That could bring about fundamental changes in the entire marine ecosystem.”

Assessing changes in the ocean’s photic zones

To assess changes in the photic zone, the researchers used data from NASA’s Ocean Colour Web, which breaks the global ocean down into a series of 9km pixels.

This satellite-derived data enabled them to observe changes at the ocean surface for each of these pixels, while an algorithm developed to measure light in seawater was used to estimate the depth of the photic zone at each location.
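The study’s exact optical algorithm is not reproduced here, but the core idea admits a standard first-order sketch: under the Beer-Lambert law, light decays exponentially with depth, so the depth at which surface light falls to 1% follows directly from the diffuse attenuation coefficient Kd. The coefficient values below are illustrative assumptions, not the study’s data.

```python
import numpy as np

def photic_zone_depth(kd):
    """First-order photic zone depth (m) from the diffuse attenuation
    coefficient Kd (1/m): the depth where light falls to 1% of its
    surface value. Beer-Lambert: E(z) = E0 * exp(-Kd * z), so
    z_1% = ln(100) / Kd."""
    return np.log(100.0) / np.asarray(kd, dtype=float)

# Clear open-ocean water (low Kd) has a much deeper photic zone
# than turbid coastal water (high Kd) -- illustrative values only.
print(photic_zone_depth(0.04))  # open ocean, roughly 115 m
print(photic_zone_depth(0.3))   # turbid coastal water, roughly 15 m
```

A ~50 m shallowing of the photic zone, as reported for 9% of the ocean, corresponds in this simple model to a substantial rise in Kd – that is, measurably murkier water.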

They also used solar and lunar irradiance models to examine particular changes that might impact marine species during daylight and moonlight conditions, demonstrating that changes in photic zone depth at night were small compared to daytime, but remained ecologically important.

A shifting global picture of ocean change

The most prominent changes in photic zone depth in the open ocean were observed at the top of the Gulf Stream, and around both the Arctic and Antarctic, areas of the planet experiencing the most pronounced shifts as a result of climate change.

Darkening is also widespread in coastal regions and enclosed seas – such as the Baltic Sea – where rainfall on land brings sediment and nutrients into the sea, stimulating plankton growth and reducing light availability.

 

Joining the dots for better health surveillance



King Abdullah University of Science & Technology (KAUST)





Combining data across mismatched maps is a key challenge in global health and environmental research. A powerful modeling approach has been developed to enable faster and more accurate integration of spatially misaligned datasets, with applications including air pollution prediction and disease mapping [1].

Datasets describing important socio-environmental factors, such as disease prevalence and pollution, are collected on a variety of spatial scales. These range from point data values for specific locations up to areal or lattice data, where values are aggregated over regions as large as countries.

Merging these geographically inconsistent datasets is a surprisingly difficult technical challenge, embraced by biostatistician Paula Moraga and her Ph.D. student Hanan Alahmadi at KAUST.

“Our group develops innovative methods for analyzing the geographical and temporal patterns of diseases, quantifying risk factors, and enabling the early detection of disease outbreaks,” says Moraga. “We need to combine spatial data that are available at different resolutions, such as pollutant concentrations measured at monitoring stations and by satellites, and health data reported at different administrative boundary levels.”

Alahmadi and Moraga developed their new model through a Bayesian approach, which is often used to integrate large spatial datasets. Bayesian inference is usually performed using Markov chain Monte Carlo (MCMC) algorithms, which explore datasets through a ‘random walk’. The algorithms decide on each next step based on the previous one until they get as close as possible to a target (or ‘posterior’) distribution. However, MCMC can take up a lot of computational time, so the researchers used a different framework called the Integrated Nested Laplace Approximation (INLA).

“Unlike MCMC, which relies on sampling, INLA uses deterministic approximations to estimate posterior distributions efficiently,” explains Alahmadi. “This makes INLA significantly faster while still providing accurate results.”
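INLA itself is a sophisticated framework (typically used via the R-INLA package), but the deterministic idea it builds on – approximating a posterior with a Gaussian centred at its mode rather than sampling from it – can be illustrated with a minimal Laplace approximation. The toy Poisson model, counts, and Exponential(1) prior below are invented for illustration and are not the authors’ model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

counts = np.array([3, 5, 4, 6, 2])  # hypothetical observed counts

def neg_log_post(lam):
    # Negative log posterior (up to a constant) for a Poisson rate lam
    # with an Exponential(1) prior: Sum(c)*log(lam) - n*lam - lam.
    if lam <= 0:
        return np.inf
    return -(counts.sum() * np.log(lam) - len(counts) * lam - lam)

# 1. Find the posterior mode deterministically -- no random walk.
mode = minimize_scalar(neg_log_post, bounds=(1e-6, 50.0),
                       method="bounded").x

# 2. The curvature at the mode gives the Gaussian approximation's
#    variance (finite-difference second derivative).
h = 1e-4
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h**2
sd = 1.0 / np.sqrt(curv)

print(f"Laplace approximation: lambda ~ N({mode:.3f}, {sd:.3f}^2)")
```

Unlike an MCMC chain, this takes a fixed, small number of function evaluations and returns the same answer every run, which is the efficiency Alahmadi describes.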

The researchers demonstrated the power of their model by integrating point and areal data in three case studies: the prevalence of malaria in Madagascar, air pollution in the United Kingdom, and lung cancer risk in Alabama, USA. In all three, the model improved the speed and accuracy of predictions while providing insight into the importance of different spatial scales.

“In general, our model gives more weight to point data because they offer higher spatial precision and are often more reliable for detailed predictions,” says Alahmadi. “In all studies, point data played a dominant role. However, the influence of areal data was greater in the air pollution study. This is primarily because the air pollution areal data had a finer resolution, which made them more informative and complementary to the point data.”

Overall, the project addresses the increasing need for data analysis tools that support evidence-based decisions in health and environmental policy. For example, if public health officials can quickly assess disease prevalence, then they can work more effectively to allocate resources and intervene in high-risk areas.

The new model could be adapted to capture dynamic changes over space and time and to address biases that may arise due to preferential sampling in certain areas. The researchers plan several other applications of their model, such as using satellite pollution data to estimate disease risks.

“We hope to combine satellite and ground-based temperature data to detect thermal extremes in Mecca, particularly during the Hajj season, where heat stress is a serious public health concern,” says Moraga. “We also intend to monitor air pollutants and track emissions, supporting Saudi Arabia’s journey toward its net-zero goals.”

 

Reference

  1. Alahmadi, H. & Moraga, P. Bayesian modelling for the integration of spatially misaligned health and environmental data. Stochastic Environmental Research and Risk Assessment 30, 1485–1499 (2025).

 

Germinated flours in breadmaking: Striking a balance between nutrition and quality



Escuela Superior Politecnica del Litoral





A recent study explores the potential of germinated flours as functional ingredients in breadmaking, highlighting both their nutritional benefits and their technological challenges.

The growing demand for healthier foods has inspired the scientific community to reexamine traditional ingredients through a modern lens. One such example is germinated flours, which are emerging as a promising alternative for enhancing the nutritional profile of bread, one of the most widely consumed foods worldwide.

At the Food Science and Technology Laboratory at ESPOL, researchers comprehensively reviewed the latest findings on the use of germinated flours in breadmaking. Germination—an ancient technique that activates a grain’s natural enzymes—transforms its components, increasing the levels of resistant starch, antioxidants, and bioactive compounds. It can also enhance the digestibility of certain nutrients.

Thanks to these transformations, germinated flours are considered functional ingredients with the potential to help prevent or mitigate public health issues such as constipation, obesity, and metabolic disorders. However, incorporating them into bread recipes presents notable challenges.

Nutrition Gains—Without Sacrificing Bread Quality

One of the study’s key findings is that improving a bread’s nutritional value often comes at the expense of its technological qualities. Adding germinated flours can alter the dough’s texture, volume, and structure, ultimately impacting the consumer’s eating experience.

These challenges are particularly pronounced when using gluten-free grains or grains whose proteins interfere with gluten network formation, such as certain germinated legumes or pseudocereals. High percentages of germinated flour can cause bread to lose volume and softness, making the final product less appealing commercially. As a result, the current recommendations suggest substitution rates between 5% and 20%.

Grains, Germination Time, and Substitution Levels: Every Variable Matters

The ESPOL review emphasizes that not all grains react similarly to germination. While cereals such as wheat and barley show clear nutritional improvements, legumes and pseudocereals may not consistently translate those gains into better bread quality. Variables such as the grain type, germination duration, and the percentage of flour substitution must be carefully calibrated to balance health benefits and baking performance. Each grain has an “optimal germination point” that maximizes its nutritional value without compromising bread quality.

Food Technology Solutions to Close the Gap

To address these limitations, the study proposes several technological strategies. One approach is the use of dough improvers, which help stabilize the bread structure when flours weaken the gluten network, an important tool for expanding the use of germinated legumes and pseudocereals.

Another promising strategy is optimizing the germination process itself. Fine-tuning parameters such as temperature, humidity, and germination time could enable higher levels of germinated flour inclusion without compromising the bread’s sensory qualities.

Taste Comes First: The Importance of Consumer Acceptance

Beyond nutritional and technological considerations, the study stresses the crucial role of consumer perception. For a bread product to succeed commercially, it must be healthy and delicious, with an appealing texture, color, and aroma. Understanding consumer preferences and sensory expectations will be essential for successfully adopting these innovations.

Conclusion: Science for Better Nutrition

Germinated flours offer a unique opportunity to innovate in functional breadmaking, delivering meaningful nutritional benefits. However, their successful implementation requires an integrated approach that considers technological challenges and consumer preferences.

Through this study, ESPOL aims to contribute to the development of healthier and more sustainable foods, combining scientific knowledge with technological innovation to meet the evolving demands of modern nutrition.

New study: Teen drivers safer with more practice




Virginia Tech
Image: A young driver practices with a licensed driver.

Credit: Photo by Eric Holbrook for Virginia Tech





The adage “practice makes perfect” may not always ring true, but research from the Virginia Tech Transportation Institute provides new evidence that practice does make teen driving safer.  

Published by the National Academies of Sciences, Engineering, and Medicine, the study is the first to find statistically significant evidence that increased driving practice during the learner’s permit phase can help reduce crash and near-crash events for teen drivers.

By analyzing data collected using four in-car cameras to monitor teens throughout both the permit and independent phases of driving, researchers found that teen drivers who practiced more had 30 percent fewer crash or near-crash incidents.

“Teens that practice driving, especially in a variety of environments, throughout their learner’s permit stage help reduce their crash risk and improve safety outcomes once they are out driving on their own,” said Charlie Klauer, a research scientist at the Virginia Tech Transportation Institute (VTTI), and lead author of the study. Klauer is also an associate professor in the Grado Department of Industrial and Systems Engineering.

The analysis drew on novel data from 82 teen drivers who were monitored for 22 months using the naturalistic driving study method pioneered at VTTI. The observation ran from 2011 to 2014 and involved installing four nonintrusive cameras in the teens’ vehicles during their learner’s permit phase and during the first 12 months of independent driving, the first time this population of drivers had been studied in this manner.

Done in collaboration with Johns Hopkins University and TransAnalytics Inc., a Philadelphia-based research company, the new analysis of the data revealed that teen drivers with less supervision had 17 crash or near-crash events per 1,000 hours of driving, while teens with more supervision had 12 crash or near-crash events per 1,000 hours of driving.
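These per-1,000-hour rates are consistent with the headline figure. A quick check, assuming the roughly 30 percent reduction was computed from the two rates reported above:

```python
# Crash or near-crash events per 1,000 hours of driving,
# as reported in the study.
less_supervised = 17
more_supervised = 12

# Relative reduction for the more-supervised group.
reduction = (less_supervised - more_supervised) / less_supervised
print(f"{reduction:.1%}")  # 29.4%, i.e. roughly 30 percent fewer events
```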

Additionally, the study found:

  • Girls with less varied driving practice had a higher rate of safety-critical events than boys.
  • Teens who shared a family car had fewer risky driving behaviors than those with their own vehicles.
  • Teens who engage in more supervised driving in diverse conditions, such as during nighttime hours and on unfamiliar or varied road types, experience fewer crashes once they begin driving independently. 
  • Many of the teens didn’t meet the required 45 hours of driving time during the permit phase in Virginia. 

In the study, which was funded by the National Academies of Sciences through the Behavioral Traffic Safety Cooperative Research Program, teenagers drove nearly 110,000 miles during the learner’s permit phase and 380,000 miles during the independent phase.

Educating teens and their parents about the importance of driver safety has never been more urgent as crash fatalities involving young drivers ages 15–20 continue to rise. According to new data from the National Highway Traffic Safety Administration, 5,588 people were killed in crashes involving a teen driver in 2023, a 4.2 percent increase from 2022. 

Overall, the study recommends that driver education programs, policymakers, and parents prioritize safety education alongside supervised driving experiences to help increase the level of safety for novice drivers.

 

Record-breaking performance in data security achieved with quantum mechanics



A new quantum random number generator is almost 1000 times faster than other generators and much smaller, promising to change data management and cybersecurity in several industries including health, finance, and defense



King Abdullah University of Science & Technology (KAUST)

Quantum random number generators for data security 

Image: Quantum random number generators made from micro-LEDs show unprecedented speed and promise exceptional data security.

Credit: Heno Hwang (KAUST)





A joint team of researchers led by scientists at King Abdullah University of Science and Technology (KAUST) and King Abdulaziz City for Science and Technology (KACST) has reported the fastest quantum random number generator (QRNG) to date based on international benchmarks. The QRNG, which passed the required randomness tests of the National Institute of Standards and Technology, can produce random numbers at a rate nearly a thousand times faster than other QRNGs.

“This is a significant leap for any industry that depends on strong data security,” said KAUST Professor Boon Ooi, who led the study, which is published in Optics Express. KAUST Professor Osman Bakr also contributed to the study. 

Random number generators are critical for industries that depend on security, such as health, finance, and defense. But the random number generators currently used are vulnerable because of an intrinsic flaw in their design.  

“Most random number generators are ‘pseudo random number generators’. In other words, they seem random, but in reality, they are complicated algorithms that can be solved. QRNGs do not suffer from this concern,” explained Ooi.  

The reason is that QRNGs use the principles of quantum mechanics to produce truly unpredictable random numbers. The high generation rate reported in the new study was the result of innovations the scientists made in the fabrication and post-processing algorithms of the device.
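The vulnerability Ooi describes is easy to demonstrate: a pseudo-random generator is an algorithm, so identical internal states yield identical "random" sequences. A minimal illustration using Python’s standard-library Mersenne Twister (chosen for convenience; it is not a generator any particular industry is claimed to use):

```python
import random

# Two pseudo-random generators started from the same seed share the
# same internal state, so they emit byte-for-byte identical output.
a = random.Random(42)
b = random.Random(42)

seq_a = [a.randint(0, 255) for _ in range(8)]
seq_b = [b.randint(0, 255) for _ in range(8)]

print(seq_a == seq_b)  # True: anyone who recovers the state can
                       # predict every future "random" value
```

A quantum generator has no such reproducible state: each output is drawn from a physically indeterminate process, which is why it cannot be "solved".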

The QRNG was constructed using micro-LEDs (light-emitting diodes) only a few micrometers in size, which reduces its energy demands and makes the device potentially portable, expanding the types of applications. The randomness benchmarks it passed come from the National Institute of Standards and Technology, which is recognized internationally for providing benchmarks to ascertain the quality of randomness.
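The NIST benchmarks comprise a battery of statistical tests. As an illustration of the kind of check involved, here is a sketch of the simplest one, the SP 800-22 frequency (monobit) test; it is only one test of many, and passing it alone would not certify a generator as random.

```python
import math

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: in a truly random
    bitstream the proportion of ones should be close to 1/2.
    Returns a p-value; p >= 0.01 passes at NIST's default level."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)     # +1 for 1-bits, -1 for 0-bits
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2.0))

# A perfectly alternating stream is balanced, so it passes this one
# test -- real suites run many complementary tests to catch patterns.
print(monobit_test([i % 2 for i in range(1000)]))  # p = 1.0

# A heavily biased stream fails decisively.
print(monobit_test([1] * 100))  # p far below 0.01
```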

Dr. Abdullah Almogbel, a contributor of the study who is also a researcher at the Microelectronics and Semiconductors Institute and director of the Center of Excellence for Solid-State Lighting at KACST, stated: “KACST, in its capacity as the national laboratory, is committed to advancing applied research that directly supports the objectives of Saudi Arabia’s Vision 2030—particularly in establishing global leadership across strategic sectors, including quantum-enabled innovations. Undertaking such research initiatives is expected to generate substantial value for a wide range of industries and further solidify their global standing.”