If we start hearing coral reef techno, we might have another problem entirely.
By Cassidy Ward
Coral reef scenery with Red Sea bannerfish (Heniochus intermedius), golden butterflyfish (Chaetodon semilarvatus), orange face or hooded butterflyfish (Chaetodon larvatus) and lyretail anthias or goldies (Pseudanthias squamipinnis).
Photo: Georgette Douwma/Getty Images
The animated world of the Trolls universe hinges on the titular characters’ love for music. In the sequel, Trolls World Tour, we learned that Trolls are a diverse bunch, with disparate groups living in separate regions of their world, each with its own genre preferences. Among them, living in an aquatic environment, are the Techno Trolls, who make their music at Techno Reef.
The bombastic electronic musical stylings of the Techno Trolls were seemingly at odds with the peaceful silence we often think of when we consider coral reef ecosystems in the real world. It turns out, we might have been wrong about that. Not only do coral reef systems make sounds, but scientists can use those sounds to gauge the overall health of the ecosystem. Now, new research has taken the music of the reef and given it a decidedly electronic bent by introducing artificial intelligence.
Ben Williams from the College of Life and Environmental Sciences at the University of Exeter, and colleagues, trained an artificial intelligence to listen to audio recordings of reef systems and determine from the sounds alone whether or not the reef is healthy. Their findings were published in the journal Ecological Indicators.
To be clear, the corals themselves aren’t making much noise (and if they are, our hydrophones aren’t picking it up), but the communities they support are alive with the sound of music.
“We hear snapping shrimp everywhere, it’s like the crackling of a campfire in the background,” Williams told SYFY WIRE. “On a thriving reef, there will be a lot of fish sounds. They make all kinds of whoops and grunts. Sometimes they even chorus, you’ll hear them for minutes or hours at a time, producing the same sounds across the reef.”
All of those sounds are an indication of the reef’s health. As a reef’s health starts to decline, many of those sounds disappear. Scientists might still pick up the sound of snapping shrimp, but all of those extra layers, the sounds of fish communicating and interacting with the environment, are gone.
In the past, scientists have analyzed recordings of reefs manually in a process that requires a certain amount of expertise and a lot of patience. For this study, they wanted to pass off some of that work to a computer. Perhaps surprisingly, the artificial intelligence learned to correctly differentiate between healthy and unhealthy reefs almost immediately.
“We didn’t need to feed it many recordings. We had about 150 minutes from our healthy and degraded reefs, split about 50/50, and that’s all we needed. Then we started trying new recordings and it was able to get about 92% accuracy,” Williams said.
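The study’s actual model isn’t described here, but the workflow Williams outlines — train on a small labeled set of clips split between healthy and degraded reefs, then score held-out recordings — can be sketched generically. Everything below is illustrative: the two-number “acoustic features” (think shrimp-crackle energy versus fish-call energy) and the simple nearest-centroid classifier are stand-ins, not the authors’ method.

```python
import random

random.seed(0)

def fake_features(healthy, n=75):
    """Stand-in for per-clip acoustic features: (shrimp-band energy,
    fish-call energy). Degraded reefs keep the crackle but lose fish calls."""
    base = (0.8, 0.9) if healthy else (0.8, 0.2)
    return [(random.gauss(base[0], 0.1), random.gauss(base[1], 0.1))
            for _ in range(n)]

# ~150 one-minute clips, split roughly 50/50 between the two reef states
healthy = fake_features(True)
degraded = fake_features(False)

def centroid(points):
    """Mean feature vector of a set of clips."""
    return tuple(sum(c) / len(points) for c in zip(*points))

# Train on part of each set, hold out the rest for evaluation
c_healthy = centroid(healthy[:50])
c_degraded = centroid(degraded[:50])

def classify(p):
    """Label a clip by whichever training centroid it sits closer to."""
    d_h = sum((a - b) ** 2 for a, b in zip(p, c_healthy))
    d_d = sum((a - b) ** 2 for a, b in zip(p, c_degraded))
    return "healthy" if d_h < d_d else "degraded"

held_out = ([(p, "healthy") for p in healthy[50:]] +
            [(p, "degraded") for p in degraded[50:]])
accuracy = sum(classify(p) == label for p, label in held_out) / len(held_out)
print(f"held-out accuracy: {accuracy:.0%}")
```

The point of the sketch is proportion, not technique: with well-separated acoustic signatures, even a tiny labeled set supports accurate binary classification, which matches Williams’ account of needing only about 150 minutes of audio.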
That means researchers could drop hydrophones around reef systems all over the world, collect data over the course of days or months, then play those recordings to the A.I. and track the health of the reef over time.
Scientists hope they might be able to use this system to monitor restoration efforts and get an idea of when reefs reach a tipping point at which the A.I. starts to recognize them as healthy again. At present, it’s unclear precisely what the system is listening for to make its determinations.
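Once a classifier is emitting a label per recording, spotting that tipping point becomes a simple scan over the time series. The snippet below is a hypothetical illustration: the monthly labels are invented, and the rule that a reef “counts” as recovered only after several consecutive healthy readings is an assumption, not something from the study.

```python
# Hypothetical monthly classifier outputs for a reef under restoration
labels = ["degraded"] * 8 + ["healthy"] * 4  # one label per month

def tipping_point(labels, sustain=3):
    """Return the index of the first 'healthy' reading that begins a run
    of at least `sustain` consecutive healthy months, or None."""
    run = 0
    for i, lab in enumerate(labels):
        run = run + 1 if lab == "healthy" else 0
        if run == sustain:
            return i - sustain + 1
    return None

print("reef first reads as healthy at month", tipping_point(labels))
```

Requiring a sustained run rather than a single healthy reading guards against one noisy clip flipping the verdict.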
“With artificial intelligence, it’s kind of a black box. It does its job really well, but we don’t always know what patterns it has learned. Currently, it’s a binary healthy or unhealthy signal,” Williams said.
Going forward, scientists want to pair their artificially intelligent listening system with visual surveys to try to pinpoint exactly when and why a reef moves from unhealthy to healthy or vice versa. It could be the return of a specific species of fish, or coral growth reaching a certain density, or something we’re not even aware of yet, but a better listening system could help unlock those mysteries.
“Typically, when we’re monitoring, we’re only getting a snapshot of the time when we’re there. A real bonus of this is we can just drop the hydrophone in the water, disappear, and come back later. That allows us to get long-term continuous data sets,” Williams said.
Marine ecosystems are particularly well-suited for this sort of work because water is a perfect medium for sound to travel in, but there is potential to take this eavesdropping A.I. into terrestrial environments as well. Animal ecosystems in rainforests and grasslands might also give us hints as to their health based on the richness of their song. We just need to sit back and listen.