Thursday, May 18, 2023

Why do Japanese teachers seem unready to teach critical thinking in classrooms?

A study in Japan highlights how schoolteachers circumvent the influence of the globally promoted version of critical thinking in their emic (indigenous) teaching practice

Peer-Reviewed Publication

CHIBA UNIVERSITY

A typical classroom in a Japanese elementary school 

Image: A new study finds that Japanese schoolteachers have unintentionally modified the concept of critical thinking and practice indigenous versions while adhering to the national curriculum guidelines that strive to align with global education standards.

Credit: Tony Cassidy from Flickr (https://www.flickr.com/photos/63712423@N00/363713806, https://commons.wikimedia.org/wiki/File:JapaneseClassroom2.jpg)

Globally, critical thinking (CT) is regarded as a highly desirable cognitive skill that enables a person to question, analyze, and assess an idea or theory from multiple perspectives. CT has become an integral and mandatory part of global educational curricula, but its definition varies across contexts and cultural backgrounds.

 

To assess the implementation of CT, the Organization for Economic Cooperation and Development (OECD) conducts the Teaching and Learning International Survey (TALIS). In the 2018 survey (TALIS 2018), only 12.6% of lower secondary schoolteachers in Japan taught CT, compared with the global average of 58.1%. To understand this gap, Assistant Professor Kazuyuki Nomura from Chiba University, Japan, investigated the reasons underlying Japan’s poor showing in TALIS 2018. The study was made available online on March 31, 2023, and published in Educational Philosophy and Theory.

 

“Japan’s poor scoring in teaching CT puzzled me after I read the TALIS 2018 Results. Granted, CT varies from context to context, but little research has been conducted to understand what CT means in Japanese classrooms. Therefore, I took it on,” explains Dr. Nomura, an expert in intercultural education, speaking of his motivation for the study.

 

In this qualitative study, Dr. Nomura conducted semi-structured interviews with 12 certified Japanese schoolteachers from diverse backgrounds, including one from a school for children with special educational needs (SENs), between May and July 2022 to understand their perspectives on CT and the outcomes of TALIS 2018.

 

All participants agreed that the Japanese adjective "hihanteki," used for the English word "critical," carries a negative undertone, which makes most teachers hesitant to introduce CT in their classrooms. Most participants also felt that using "hihanteki shikoo" as a direct translation of "critical thinking" in TALIS 2018 was not appropriate. Many teachers were unaware that the national curriculum guidelines contain the concept of CT, and those who knew about it and had tried integrating it into their pedagogy were a small minority.

 

By contrast, multidimensional-multiperspective thinking (MMT), which is a facet of CT, enjoys more acceptance and popularity in Japan since the curriculum guidelines explicitly promote MMT. The study found that while most participants shied away from implementing CT, they were comfortable with MMT and used it regularly in their teaching. Additionally, almost all participants suggested that teaching MMT depends on socioeconomic status (SES), and that teaching MMT in low-SES schools was challenging.

 

Moreover, “empathy” is a hidden, central pillar of the Japanese school curriculum. All participants concurred that empathy is a core value in Japanese education, and most of them felt that it is impossible to practice CT or MMT without teaching empathy. Coupling MMT with empathy increased the autonomy of schoolteachers in Japan. They struck a delicate balance between the national curriculum, the school culture, and their teaching practice. As a result, they managed to circumvent the power of OECD-led global education.

 

Still, Dr. Nomura maintains that the merits of global CT can be combined with Japanese teachers’ emic understanding of CT to get the best of both worlds. Through CT skills and empathy, Japanese teachers can enable an increasingly diverse student population to think big and beyond the perimeters of Japan. Although CT and politics often go hand in hand, Japanese schools are required by law to maintain political neutrality and cannot teach politically controversial topics. To work around this, he recommends that teachers use non-Japanese or fictional examples when implementing CT in their classrooms.

 

“As Japan is now becoming increasingly multicultural, teachers could use CT skills to help children think about Japan’s future, enhance open-mindedness, and live together with others with equity and dignity,” suggests Dr. Nomura.

 

For future research, classroom observation would be a natural next step, given that MMT is a core educational goal in the current national curriculum. The results of this study could thus help spark a much-needed discussion about CT teaching in Japan and elsewhere.

 

About Assistant Professor Kazuyuki Nomura from Chiba University

Dr. Kazuyuki Nomura is currently an Assistant Professor at the Graduate School of Global and Interdisciplinary Studies, Chiba University, Japan. Dr. Nomura is an educational researcher whose research areas cover intercultural education and curriculum studies. He has published articles in reputed journals and has co-authored several books. Dr. Nomura is a member of many educational research associations and has received accolades for his research contributions.

A new tool for deforestation detection

Peer-Reviewed Publication

JOURNAL OF REMOTE SENSING

NLCD 1986-2019 forest disturbance product 

Image: Four Landsat path/row footprints in purple, which were selected for accuracy assessment, are overlaid on the National Land Cover Database (NLCD) forest disturbance date 1986-2019 science product.

Credit: Suming Jin, U.S. Geological Survey Earth Resources Observation and Science Center

Every second, the planet loses a stretch of forest equivalent to a football field due to logging, fires, insect infestation, disease, wind, drought, and other factors. In a recently published study, researchers from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center presented a comprehensive strategy to detect when and where forest disturbance happens at a large scale and provide a deeper understanding of forest change.

 

The study was published on Feb. 28 in the Journal of Remote Sensing.
 

“Our strategy leads to more accurate land cover mapping and updating,” said Suming Jin, a physical scientist with the EROS Center.

 

To understand the big picture of a changing landscape, scientists rely on the National Land Cover Database, which turns Earth-observation satellite (Landsat) images into pixel-by-pixel maps of specific features. Between 2001 and 2016, the database showed that nearly half of the land cover change in the contiguous United States involved forested areas.

 

“To ensure the quality of National Land Cover Database land cover and land cover change products, it is important to accurately detect the location and time of forest disturbance,” said Jin.

 

Jin and team developed a method to detect forest disturbance by year. The approach combines strengths from a time-series algorithm and a 2-date detection method to improve large-region operational mapping efficiency, flexibility, and accuracy. The new technique facilitates more effective forest management and policy, among other applications.

 

Landsat data have been widely used to detect forest disturbance because of their long history, high spatial and radiometric resolutions, free and open data policy, and suitability for creating continental or even global mosaic images for different seasons.

 

“We need algorithms that can create consistent large-region forest disturbance maps to assist in producing the multi-epoch National Land Cover Database,” said Jin. “We also need those algorithms to be scalable so we can track forest change over longer periods of time.”

 

A commonly employed method, “2-date forest change detection,” compares images from two different dates, while time-series algorithms work on yearly or even monthly Landsat observations.
 

In general, 2-date forest change detection algorithms are more flexible than time-series methods and use richer spectral information. The 2-date method can easily determine changes between image bands, indices, classifications, and combinations and, therefore, detect forest disturbances more accurately. However, the 2-date method only detects changes for one time period and usually requires additional information or further processing to separate forest changes from other land cover changes.

 

On the other hand, time-series-based forest change detection algorithms can use spectral and long-term temporal information and produce changes for multiple dates simultaneously. However, these methods usually require every step of the time series algorithm to be processed again when a new date is added, which can be cumbersome for continuous monitoring updates and lead to inconsistencies.

 

Previous studies proposed ensemble approaches to improve forest change mapping accuracy, including “stacking,” or combining the output of different mapping methods. While stacking reduces omission and commission error rates, the method is computationally intensive and requires reference data for training.

 

Jin and team’s approach combines the strengths of 2-date change detection methods with a continuous time-series change detection method called TSUN (Time-Series method Using the Normalized Spectral Distance, or NSD, index) to improve large-region operational mapping efficiency, flexibility, and accuracy. Using this combination, the researchers produced the NLCD 1986–2019 forest disturbance product, which shows the most recent forest disturbance date between 1986 and 2019 for every two-to-three-year interval.
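The release does not spell out how the NSD index is computed, but the general idea of flagging disturbance from spectral change across a yearly time series can be sketched as follows. This is an illustrative example only: the normalized_spectral_distance function, the band values, and the threshold are hypothetical stand-ins, not the TSUN implementation.

```python
import numpy as np

def normalized_spectral_distance(a, b):
    """Illustrative spectral distance between two per-pixel band stacks.
    The actual NSD index used in TSUN is not given in this release; a
    normalized Euclidean distance is used here as a stand-in."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return np.linalg.norm(a - b) / (np.linalg.norm(a) + np.linalg.norm(b) + 1e-9)

def first_disturbance_year(yearly_composites, years, threshold=0.15):
    """Scan a per-pixel series of yearly composites and return the first year
    whose spectral distance from the previous year exceeds the threshold."""
    for prev, curr, year in zip(yearly_composites, yearly_composites[1:], years[1:]):
        if normalized_spectral_distance(prev, curr) > threshold:
            return year
    return None

# Hypothetical pixel: [red, NIR, SWIR1] reflectance for five yearly composites.
pixel_series = [
    [0.04, 0.35, 0.18],  # healthy forest
    [0.05, 0.34, 0.19],
    [0.12, 0.20, 0.30],  # abrupt spectral change (e.g., harvest or fire)
    [0.10, 0.22, 0.28],
    [0.08, 0.27, 0.24],  # partial recovery
]
print(first_disturbance_year(pixel_series, years=[2015, 2016, 2017, 2018, 2019]))
```

In the operational product, such per-pixel decisions are made on full Landsat composites and combined with the 2-date detection results described above; the sketch only illustrates the thresholding idea.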

 

“The TSUN index detects multi-date forest land cover changes and was shown to be easily extended to a new date even when new images were processed in a different way than previous date images,” Jin said.

 

The research team plans to improve the tool by increasing the time frequency and producing an annual forest disturbance product from 1986 to the present.

 

“Our ultimate goal is to automatically produce forest disturbance maps with high accuracy with the capability of continually monitoring forest disturbance, hopefully in real-time,” Jin said.  

 

This work was supported by the USGS-NASA Landsat Science Team Program project “Toward Near Real-time Monitoring and Characterization of Land Surface Change for the Conterminous US.”

 

Other contributors include Jon Dewitz from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center; Congcong Li with ASRC Federal Data Solutions; Daniel Sorenson with the U.S. Geological Survey; Zhe Zhu with the University of Connecticut; Md Rakibul Islam Shogib, Patrick Danielson and Brian Granneman from KBR; Catherine Costello with the U.S. Geological Survey, Geosciences and Environmental Change Science Center; Adam Case with Innovate! Inc.; and Leila Gass with the U.S. Geological Survey, Western Geographic Science Center.

To avoid a battery crisis, more of us should share small, lightweight EVs

EVs are better than fossil fuel cars, but it's not sustainable for everyone to own their own electric SUVs

Peer-Reviewed Publication

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY

Smaller EVs better for the planet 

Image: Researchers say more of us should share cars, preferably smaller, electric ones, like these Nissan Leafs, shown here charging in front of St. Olavs Hospital in Trondheim, Norway.

Credit: Nancy Bazilchuk, NTNU

Most global scenarios and governmental targets for decarbonizing the transport sector consider battery-powered electric vehicles as a main part of the solution. Enormous amounts of raw materials are needed to build enough batteries and ensure a transition to low-emission vehicles.

Access to lithium is critical, as it is used in all types of EV batteries.

Future demand needs to decrease

“It seems very likely we'll have a shortage. The key lies in the demand. The demand needs to decrease to avoid long-term supply problems,” says Fernando Aguilar Lopez, a PhD candidate at the Norwegian University of Science and Technology's (NTNU) Department of Energy and Process Engineering.

There are huge deposits of this super-light, silver-white substance around the world. The problem is that it is not being mined fast enough to keep up with demand for lithium-ion batteries (LIBs) for electric vehicles.

As a result, we may face supply bottlenecks that could last for decades. 

Meet MATILDA — A model for supply scenarios

Fernando Aguilar Lopez is an expert in analysing global material flows. This means he studies raw materials cycles from extraction to production, use, and scrapping. He, postdoc Romain Billy and Professor Daniel B. Müller have developed a material flow analysis model named MATILDA (MATerIaL Demand and Availability). The model was recently introduced in the journal Resources, Conservation & Recycling.

The study investigates strategies to manage resource use in EV batteries. MATILDA helps to understand more of the critical factors affecting resource supply. Also, the model calculates how various interventions could mitigate the demand.
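MATILDA itself is not reproduced here, but the basic logic of a stock-driven material flow model - new vehicles enter the fleet, retire after a fixed lifetime, and each battery carries a set amount of lithium - can be sketched in a few lines. Every number below (fleet size, lifetime, pack size, lithium intensity) is an invented placeholder, not a MATILDA input.

```python
import numpy as np

# Minimal stock-driven cohort sketch of EV battery lithium demand.
# All numbers are illustrative assumptions, not values from the study.
years = np.arange(2025, 2051)
ev_fleet = np.linspace(5e6, 60e6, len(years))   # assumed EV stock (vehicles)
lifetime = 15                                   # assumed vehicle lifetime (years)
battery_kwh = 60.0                              # assumed average pack size (kWh)
g_li_per_kwh = 90.0                             # assumed lithium intensity (g/kWh)

inflow = np.zeros(len(years))   # new vehicles entering the fleet each year
outflow = np.zeros(len(years))  # vehicles retired each year (fixed lifetime)
for t in range(1, len(years)):
    retired = inflow[t - lifetime] if t >= lifetime else 0.0
    # Stock balance: inflow covers fleet growth and replaces retirements.
    inflow[t] = (ev_fleet[t] - ev_fleet[t - 1]) + retired
    outflow[t] = retired

primary_li_tonnes = inflow * battery_kwh * g_li_per_kwh / 1e6     # grams -> tonnes
recyclable_li_tonnes = outflow * battery_kwh * g_li_per_kwh / 1e6

for y, d, r in zip(years[::5], primary_li_tonnes[::5], recyclable_li_tonnes[::5]):
    print(f"{y}: demand {d:,.0f} t Li, end-of-life {r:,.0f} t Li")
```

Even this toy version reproduces one of the study's qualitative points: with vehicles lasting over a decade, almost no end-of-life lithium becomes available for recycling in the early years while primary demand keeps climbing.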

Most comprehensive model

“Society urgently needs systemic approaches for addressing supply problems,” Müller said.

MATILDA is the most comprehensive model to date for battery raw materials in the global vehicle fleet. Tools such as this can play a crucial role for industry and policy makers to develop strategies to ensure secure and resilient critical raw materials supply chains, Müller said. 

Explored more than 8 000 scenarios

The NTNU researchers have explored more than 8 000 scenarios to understand the key drivers of material use.

Reading this on your laptop and want to explore the details? Check out BATMAN's interactive visualization tool, where you can build your own scenario and check the resource use for the nine materials the researchers have investigated. http://129.241.153.168:8051/

This assessment showed that profound social and lifestyle changes are the most efficient measures to reduce material supply risks.

More sharing of smaller EVs

To avoid excessive demand for single metals, we need investments in a wider range of new battery technologies, the researchers said.

Nor do we need more, larger, heavier electric cars. On the contrary, more of us should share smaller cars with small battery packs.

“Extending the lifespan of vehicles and batteries by facilitating reuse and replacement will also be crucial in reducing the demand for raw materials,” says Aguilar Lopez.

Identifying problem shifting

Reducing the demand for certain materials could increase pressure on others. MATILDA shows what happens when this kind of problem shifting happens with cobalt, nickel and lithium. The model also offers alternative solutions.

The researchers say that problem shifting may become critical if the industry collectively shifts towards a new technology at a specific time. One example is the current trend toward lithium iron phosphate (LFP) batteries.

Battery trends could affect food prices

In 2021, manufacturers such as VW, Volvo and Tesla said they planned to adopt LFP. These batteries are free of problematic, costly substances such as cobalt and nickel.

The downside is that they require a lot of phosphorus, an essential raw material for the fertilizer industry. An increase in demand could thus trigger price shocks, potentially hurting small farmers and pushing up food prices.

Recycling not a solution in the near term

Another finding in the NTNU study is that although necessary, recycling will not significantly reduce the pressure on raw materials in the coming decade. Our EVs are still reasonably new, and not enough of them will be scrapped for recycling until 10 to 15 years from now.

However, MATILDA shows that primary demand can be reduced by improving efficiencies in lithium, aluminium, manganese, and phosphorus recycling. These materials are currently uneconomical to recycle and are not included in the latest EU battery regulations.

Extending lifetime can be crucial

Without incentives to recover these materials, they are likely to be lost to the environment, Aguilar Lopez said.

He adds that a key issue is that the proposed EU regulations only target the supply side, but not demand. This means that badly needed changes are not actively encouraged.

Warp speed into bottlenecks

Half of new car sales in the USA are predicted to be electric by 2030. This is also the target in a recent Executive Order issued by US President Joe Biden. In the EU, all new car sales must be electric by 2035.

Several car manufacturers have said that they will be able to make the transition several years before this deadline.

This means if we don’t take measures quickly to increase the production of battery materials, we will be driving at warp speed right into the supply bottleneck.

Bigger cars mean bigger batteries

According to this Climate and Community Project report, the average battery pack in the US has nearly tripled in capacity since the first Nissan Leaf hit the road a decade ago.

“We’re driving ever larger, heavier cars with massive battery packs. At the same time, we only use these vehicles actively about five per cent of the time. The rest of the time, they’re parked. Only a few of us drive further than 45 kilometres daily,” Aguilar Lopez said.

Small is beautiful

He sees a lot of indicators that fewer people need to own their own car, and that more of us could be OK with sharing smaller, lighter vehicles. In other words, we need more 30-40 kWh Nissan Leaf-type batteries instead of the 60-100 kWh batteries that are found in most Teslas and SUVs.

“Norwegian politicians have chosen to no longer subsidize the largest and most expensive electric cars. This is one example of a powerful measure that indirectly favours smaller cars,” he said.

The researcher believes that moves like this are essential and can inspire both individual countries and the EU.

More than 300 new mines needed

Analyses show that the world needs more than 300 new lithium mines by 2035 to keep up with demand. Prices have risen by several hundred per cent in a few years. There is great interest in extraction, but disputes have erupted over new mines in a number of places.

The newest lithium mine in Europe opened around 10 years ago. Some reports say it may take up to 20 years to get a new one up and running. Many initiatives fail and are shut down.

“Ideally, we should have started preparing for this situation 20 years ago,” Aguilar Lopez said.

Buses and ferries worsen bottlenecks

As if that wasn't bad news enough, the new NTNU study only considers resource consumption by private vehicles. But buses, ferries, and other large vessels are also being electrified using batteries.

Machinery is another area where changes are afoot. Müller said the entire mining sector is completely shifting towards electrification and automation.

These additional demands could quickly worsen supply bottlenecks for raw materials.

“Achieving the goals set by the EU and individual countries could thus be problematic. This would also create a significant threat to the climate goals,” Aguilar Lopez said.

Promising technologies require more lithium

Although the development of highly efficient solid-state batteries is promising, they do not solve the lithium supply crisis either.

“Actually, solid-state batteries can worsen the situation, requiring more lithium per kilowatt hour — almost twice as much in some cases,” Aguilar Lopez said.

Challenges everywhere

Hydrogen fuel cells are becoming a mature technology, but they are expensive. They will be far more effective in ferries and other heavy vessels than in cars.

Lithium-free sodium-ion batteries are another promising alternative. They are in an early R&D stage, so it will take a while before they are road-ready.

“No matter where we turn, we encounter challenges,” Aguilar Lopez said.

Cities for people — not cars

So, what would be the best option to get us out of the lithium supply crunch?

Aguilar Lopez’s immediate answer is that cities should be designed for people, not cars.

“Look at Zürich, Vienna, Paris, and Oslo. Major moves are being made in many places to create more pleasant urban areas and entice more people to leave their cars. And quite a lot are doing just that. We need regulations – and of course people must accept them,” Aguilar Lopez said.

BATMAN project proposes new measures

Norway was a first mover in terms of electrifying the vehicle fleet. Thus, it is expected to be the first country to have larger amounts of spent batteries once the first EVs reach end-of-life.

Müller, Aguilar Lopez and Romain Billy were part of the research team for the BATMAN project, which ran from 2019 to 2022. The project was designed to investigate opportunities for the Norwegian industry to engage in battery reuse and recycling.

Based on the main findings, the scientists suggested several strong actions:

  • Facilitate and finance a broader range of battery research geared towards breakthroughs in chemistries and recycling practices.
  • Step up and streamline the development of new lithium mines, also in the EU.
  • Increase car sharing and better public transport to reduce overall vehicle ownership.
  • More small cars with smaller battery packs.
  • Greater numbers of and more efficient charging infrastructure.
  • Facilitate the recycling of all materials beyond the ones that are already regulated.
  • Extend car and battery life via increased reuse and easier repair and replacement of batteries.

Reference: Aguilar Lopez, F.; Billy, R.G.; Müller, D.B. Evaluating strategies for managing resource use in lithium-ion batteries for electric vehicles using the global MATILDA model. Resources, Conservation and Recycling.

Space missions set to improve solar storm forecasts

Peer-Reviewed Publication

UNIVERSITY OF READING

Satellites launched into outer space could send back improved warnings of dangerous solar storms thanks to a breakthrough in the way scientists use space weather measurements.   

Experts from the University of Reading have found that satellite data that is less reliable but returned to Earth rapidly can improve the accuracy of forecasts of the solar wind - harmful streams of charged particles sent from the sun - by nearly 50 per cent.

Their research, published today (Thursday, 18 May) in Space Weather, could pave the way for agencies, such as the Met Office, to provide more accurate forecasts for severe space weather, which can cause blackouts and harm human health.

Lead researcher Harriet Turner, from the University of Reading’s Department of Meteorology, said: “We know lots about how to prepare for storms that form on Earth, but we need to improve our forecasts of the dangerous weather we get from space. Space weather threatens our technology-focused way of life as it can cause power grids to fail, damage satellites, such as GPS, and even make astronauts ill. 

“Our research has shown that using rapid satellite measurements to forecast space weather is effective. By sending spacecraft far from Earth, we can use this new technique to get better solar storm predictions and ensure we are prepared for what’s to come.”

Simon Machin, Met Office Space Weather Manager, said: “This is a great example of the value that can result through our collaboration with academia. By pulling through scientific research into the operational domain, improved space weather forecasting will ultimately enhance our nation's ability to prepare for and mitigate against space weather events.”

Old dogs and new tricks

To predict space weather, scientists need to forecast the solar wind conditions at Earth. To do this, they combine computer simulations with observations from space to estimate what space weather will be like. This is known as data assimilation. The highest quality observations only become available many days after they are made, as they are processed and ‘cleaned’ on the ground, meaning forecasts take longer to produce.

To obtain forecasts faster, the research team tried using near-real-time (NRT) data. NRT data undergoes no processing or cleaning, meaning it is less accurate but can be made available within a couple of hours. The research team found that forecasts produced using NRT data are still reliable and allow greater warning time. This could enable authorities to better prepare for power failures that could cost up to 2.1 trillion dollars over a century in the USA and Europe.
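The assimilation step can be pictured as an uncertainty-weighted blend of a model forecast with an observation. The sketch below is a generic scalar example with made-up numbers, not the Reading team's scheme; it only illustrates why a noisier NRT observation, available within hours, can still shift a forecast usefully.

```python
# Minimal sketch of a scalar data-assimilation update (optimal interpolation).
# Numbers are illustrative assumptions, not values from the study.

def assimilate(forecast, forecast_var, obs, obs_var):
    """Blend a model forecast with an observation, weighting by uncertainty."""
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

model_speed = 420.0        # km/s, prior solar wind speed from the model
model_var = 60.0 ** 2      # model uncertainty (variance)

# Science-quality data: accurate but only available days later.
science_obs, science_var = 505.0, 20.0 ** 2
# Near-real-time data: noisier but available within hours.
nrt_obs, nrt_var = 515.0, 40.0 ** 2

print(assimilate(model_speed, model_var, science_obs, science_var))
print(assimilate(model_speed, model_var, nrt_obs, nrt_var))
```

Because the update weights each observation by its error variance, the noisier NRT value pulls the forecast less strongly than the science-quality value would, but it still corrects most of the model error days earlier.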

To the stars

The scientists behind this new study say using this new technique with upcoming space missions will enable better forecasts.

The European Space Agency (ESA) will launch ‘Vigil’ in the mid-2020s, a first-of-its-kind mission that will monitor potentially hazardous solar activity using a number of UK-built instruments. 

By launching the spacecraft into a position 60 degrees behind Earth in longitude, the Met Office will be able to improve space weather forecasts by using data assimilation of the NRT solar wind data. 

It is hoped the unique location of Vigil will allow scientists to see the solar wind that will later arrive at Earth, maximising forecast accuracy and warning time.

 

Perfect ‘pathogen’ storm: Vibrio bacteria, Sargassum and plastic marine debris

Genomics study in Caribbean, Sargasso Seas represents first vibrio assembled from plastic debris

Peer-Reviewed Publication

FLORIDA ATLANTIC UNIVERSITY

Beached Sargassum, brown macroalga 

Image: Some cultivation-based data show beached Sargassum appear to harbor high amounts of Vibrio bacteria.

Credit: Brian Lapointe, FAU Harbor Branch

A new study uncovers how the interplay between Sargassum spp., plastic marine debris and Vibrio bacteria creates the perfect “pathogen” storm that has implications for both marine life and public health. Vibrio bacteria are found in waters around the world and are the dominant cause of death in humans from the marine environment. For example, Vibrio vulnificus, sometimes referred to as flesh-eating bacteria, can cause life-threatening foodborne illnesses from seafood consumption as well as disease and death from open wound infections.

Since 2011, Sargassum, free-living populations of brown macroalga, have been rapidly expanding in the Sargasso Sea and other parts of the open ocean such as the Great Atlantic Sargassum Belt, including frequent and unprecedented seaweed accumulation events on beaches. Plastic marine debris, first found in surface waters of the Sargasso Sea, has become a worldwide concern, and is known to persist decades longer than natural substrates in the marine environment.

Currently, little is known about the ecological relationship of vibrios with Sargassum. Moreover, genomic and metagenomic evidence has been lacking as to whether vibrios colonizing plastic marine debris and Sargassum could potentially infect humans. As summer kicks into high gear and efforts are underway to find innovative solutions to repurpose Sargassum, could these substrates pose a triple threat to public health?

Researchers from Florida Atlantic University and collaborators fully sequenced the genomes of 16 Vibrio cultivars isolated from eel larvae, plastic marine debris, Sargassum, and seawater samples collected from the Caribbean and Sargasso seas of the North Atlantic Ocean. What they discovered is Vibrio pathogens have the unique ability to “stick” to microplastics and that these microbes might just be adapting to plastic. 

“Plastic is a new element that’s been introduced into marine environments and has only been around for about 50 years,” said Tracy Mincer, Ph.D., corresponding lead author and an assistant professor of biology at FAU’s Harbor Branch Oceanographic Institute and Harriet L. Wilkes Honors College. “Our lab work showed that these Vibrio are extremely aggressive and can seek out and stick to plastic within minutes. We also found that there are attachment factors that microbes use to stick to plastics, and it is the same kind of mechanism that pathogens use.”

The study, published in the journal Water Research, illustrates that open ocean vibrios represent a previously undescribed group of microbes, some representing potential new species, possessing a blend of pathogenic and low nutrient acquisition genes, reflecting their pelagic habitat and the substrates and hosts they colonize. Using a metagenome-assembled genome (MAG), this study reports the first Vibrio spp. genome assembled from plastic debris.

The study highlighted vertebrate pathogen genes closely related to cholera and non-cholera bacterial strains. Phenotype testing of cultivars confirmed rapid biofilm formation, hemolytic and lipophospholytic activities, consistent with pathogenic potential.

Researchers also discovered that zonula occludens toxin or “zot” genes, first described in Vibrio cholerae, which is a secreted toxin that increases intestinal permeability, were some of the most highly retained and selected genes in the vibrios they found. These vibrios appear to be getting in through the gut, getting stuck in the intestines and infecting that way.

“Another interesting thing we discovered is a set of genes called ‘zot’ genes, which causes leaky gut syndrome,” said Mincer. “For instance, if a fish eats a piece of plastic and gets infected by this Vibrio, which then results in a leaky gut and diarrhea, it’s going to release waste nutrients such as nitrogen and phosphate that could stimulate Sargassum growth and other surrounding organisms.”

Findings show some Vibrio spp. in this environment have an ‘omnivorous’ lifestyle targeting both plant and animal hosts in combination with an ability to persist in oligotrophic conditions. With increased human-Sargassum-plastic marine debris interactions, associated microbial flora of these substrates could harbor potent opportunistic pathogens. Importantly, some cultivation-based data show beached Sargassum appear to harbor high amounts of Vibrio bacteria.

“I don’t think at this point, anyone has really considered these microbes and their capability to cause infections,” said Mincer. “We really want to make the public aware of these associated risks. In particular, caution should be exercised regarding the harvest and processing of Sargassum biomass until the risks are explored more thoroughly.”

Study co-authors represent the NIOZ Royal Netherlands Institute for Sea Research, the Japan Agency for Marine-Earth Science and Technology, the Ludwig Maximilian University of Munich, Germany, Emory University, the University of Amsterdam and the Marine Biological Laboratory.

This research was supported by the National Science Foundation (NSF) (grant OCE-1155671 awarded to Mincer), FAU World Class Faculty and Scholar Program (awarded to Mincer), NSF (grant OCE-1155571 awarded to Linda A. Amaral-Zettler, Ph.D., corresponding author, NIOZ), NSF (grant OCE-1155379 awarded to Erik R. Zettler, Ph.D., co-author, NIOZ), NSF TUES grant (DUE-1043468 awarded to Linda Zettler and Erik Zettler).


- FAU -

About Florida Atlantic University:
Florida Atlantic University, established in 1961, officially opened its doors in 1964 as the fifth public university in Florida. Today, the University serves more than 30,000 undergraduate and graduate students across six campuses located along the southeast Florida coast. In recent years, the University has doubled its research expenditures and outpaced its peers in student achievement rates. Through the coexistence of access and excellence, FAU embodies an innovative model where traditional achievement gaps vanish. FAU is designated a Hispanic-serving institution, ranked as a top public university by U.S. News & World Report and a High Research Activity institution by the Carnegie Foundation for the Advancement of Teaching. For more information, visit www.fau.edu.

 

A better way to study ocean currents

A new machine-learning model makes more accurate predictions about ocean currents, which could help with tracking plastic pollution and oil spills, and aid in search and rescue

Reports and Proceedings

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Fluid currents 

Image: Computer scientists at MIT joined forces with oceanographers to develop a machine-learning model that incorporates knowledge from fluid dynamics to generate more accurate predictions about the velocities of ocean currents. This figure shows drifting buoy trajectories in the Gulf of Mexico superimposed on surface currents. The red dots mark the buoys’ positions on March 9, 2016, and the tails are 14 days long.

Credit: Edward Ryan and Tamay Özgökmen from the University of Miami

CAMBRIDGE, Mass. -- To study ocean currents, scientists release GPS-tagged buoys in the ocean and record their velocities to reconstruct the currents that transport them. These buoy data are also used to identify “divergences,” which are areas where water rises up from below the surface or sinks beneath it.

By accurately predicting currents and pinpointing divergences, scientists can more precisely forecast the weather, approximate how oil will spread after a spill, or measure energy transfer in the ocean. A new model that incorporates machine learning makes more accurate predictions than conventional models do, a new study reports.

A multidisciplinary research team including computer scientists at MIT and oceanographers has found that a standard statistical model typically used on buoy data can struggle to accurately reconstruct currents or identify divergences because it makes unrealistic assumptions about the behavior of water. 

The researchers developed a new model that incorporates knowledge from fluid dynamics to better reflect the physics at work in ocean currents. They show that their method, which only requires a small amount of additional computational expense, is more accurate at predicting currents and identifying divergences than the traditional model.

This new model could help oceanographers make more accurate estimates from buoy data, which would enable them to more effectively monitor the transportation of biomass (such as Sargassum seaweed), carbon, plastics, oil, and nutrients in the ocean. This information is also important for understanding and tracking climate change.

“Our method captures the physical assumptions more appropriately and more accurately. In this case, we know a lot of the physics already. We are giving the model a little bit of that information so it can focus on learning the things that are important to us, like what are the currents away from the buoys, or what is this divergence and where is it happening?” says senior author Tamara Broderick, an associate professor in MIT’s Department of Electrical Engineering and Computer Science (EECS) and a member of the Laboratory for Information and Decision Systems and the Institute for Data, Systems, and Society.

Broderick’s co-authors include lead author Renato Berlinghieri, an electrical engineering and computer science graduate student; Brian L. Trippe, a postdoc at Columbia University; David R. Burt and Ryan Giordano, MIT postdocs; Kaushik Srinivasan, an assistant researcher in atmospheric and ocean sciences at the University of California at Los Angeles; Tamay Özgökmen, professor in the Department of Ocean Sciences at the University of Miami; and Junfei Xia, a graduate student at the University of Miami. The research will be presented at the International Conference on Machine Learning.

Diving into the data

Oceanographers use data on buoy velocity to predict ocean currents and identify “divergences” where water rises to the surface or sinks deeper. 

To estimate currents and find divergences, oceanographers have used a machine-learning technique known as a Gaussian process, which can make predictions even when data are sparse. To work well in this case, the Gaussian process must make assumptions about the data to generate a prediction.
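As a rough illustration of what Gaussian-process interpolation of sparse buoy data looks like (not the authors' model), the sketch below fits a standard GP to synthetic drifter velocities with scikit-learn and predicts one velocity component, with uncertainty, away from the buoys. The positions, velocities, and kernel settings are all invented for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic "buoy" observations: positions (lon, lat) and one velocity component.
positions = rng.uniform(0.0, 10.0, size=(25, 2))
true_u = np.sin(positions[:, 0] / 2.0) * np.cos(positions[:, 1] / 3.0)
observed_u = true_u + rng.normal(0.0, 0.05, size=25)   # noisy drifter velocities

# Standard (non-Helmholtz) GP: each velocity component modeled independently.
kernel = RBF(length_scale=2.0) + WhiteKernel(noise_level=0.05 ** 2)
gp = GaussianProcessRegressor(kernel=kernel).fit(positions, observed_u)

# Predict the current component, with uncertainty, away from the buoys.
query = np.array([[5.0, 5.0], [9.0, 1.0]])
mean, std = gp.predict(query, return_std=True)
print(mean, std)
```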

A standard way of applying a Gaussian process to oceans data assumes the latitude and longitude components of the current are unrelated. But this assumption isn’t physically accurate. For instance, this existing model implies that a current’s divergence and its vorticity (a whirling motion of fluid) operate on the same magnitude and length scales. Ocean scientists know this is not true, Broderick says. The previous model also assumes the frame of reference matters, which means fluid would behave differently in the latitude versus the longitude direction.

“We were thinking we could address these problems with a model that incorporates the physics,” she says.

They built a new model that uses what is known as a Helmholtz decomposition to accurately represent the principles of fluid dynamics. This method models an ocean current by breaking it down into a vorticity component (which captures the whirling motion) and a divergence component (which captures water rising or sinking).
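The study builds this split directly into the Gaussian-process prior; the press release gives no further detail, so the sketch below only illustrates the two quantities themselves, computing divergence and vorticity of a synthetic gridded velocity field with finite differences.

```python
import numpy as np

# Illustrative only: divergence and vorticity of a gridded 2D velocity field,
# computed with finite differences on a synthetic current.
x = np.linspace(0.0, 2 * np.pi, 64)
y = np.linspace(0.0, 2 * np.pi, 64)
X, Y = np.meshgrid(x, y, indexing="xy")

# Synthetic current: a vortex plus a weak divergent (upwelling-like) component.
u = -np.sin(Y) + 0.1 * np.sin(X)        # eastward velocity
v = np.sin(X) + 0.1 * np.sin(Y)         # northward velocity

dx = x[1] - x[0]
dy = y[1] - y[0]
du_dx = np.gradient(u, dx, axis=1)
du_dy = np.gradient(u, dy, axis=0)
dv_dx = np.gradient(v, dx, axis=1)
dv_dy = np.gradient(v, dy, axis=0)

divergence = du_dx + dv_dy               # water rising or sinking
vorticity = dv_dx - du_dy                # whirling motion

print(float(divergence.max()), float(vorticity.max()))
```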

In this way, they give the model some basic physics knowledge that it uses to make more accurate predictions. 

This new model utilizes the same data as the old model. And while their method can be more computationally intensive, the researchers show that the additional cost is relatively small. 

Buoyant performance

They evaluated the new model using synthetic and real ocean buoy data. Because the synthetic data were fabricated by the researchers, they could compare the model’s predictions to ground-truth currents and divergences. But simulation involves assumptions that may not reflect real life, so the researchers also tested their model using data captured by real buoys released in the Gulf of Mexico.

In each case, their method demonstrated superior performance for both tasks, predicting currents and identifying divergences, when compared to the standard Gaussian process and another machine-learning approach that used a neural network. For example, in one simulation that included a vortex adjacent to an ocean current, the new method correctly predicted no divergence while the previous Gaussian process method and the neural network method both predicted a divergence with very high confidence.

The technique is also good at identifying vortices from a small set of buoys, Broderick adds.

Now that they have demonstrated the effectiveness of using a Helmholtz decomposition, the researchers want to incorporate a time element into their model, since currents can vary over time as well as space. In addition, they want to better capture how noise impacts the data, such as winds that sometimes affect buoy velocity. Separating that noise from the data could make their approach more accurate.

“Our hope is to take this noisily observed field of velocities from the buoys, and then say what is the actual divergence and actual vorticity, and predict away from those buoys, and we think that our new technique will be helpful for this,” she says.

This research is supported, in part, by the Office of Naval Research, a National Science Foundation (NSF) CAREER Award, and the Rosenstiel School of Marine, Atmospheric, and Earth Science at the University of Miami.

###

Written by Adam Zewe, MIT News Office

Additional background

Paper: “Gaussian Processes at the Helm(holtz): A More Fluid Model for Ocean Currents”

https://arxiv.org/pdf/2302.10364.pdf