Coral reefs all around the world make ‘music’. That is, they create their own symphony of sounds that can be picked up by a hydrophone – an underwater microphone. The symphony members include all the creatures living on and around the reef: fish and corals as well as many other types of animals.
Corals are, in fact, animals that attach themselves to something solid and unmoving, spending their entire lives anchored to one spot. When a coral dies, it leaves behind a rock-like skeleton that adds to the size and mass of the reef. Young corals attach themselves to this skeleton, and the cycle repeats.
All the life forms inhabiting a reef contribute to the sounds that can be heard and recorded in the vicinity.
Researchers discovered that the ‘song’ of a reef gives a good indication of its overall health. This makes sense: the healthier the reef, the more animals live there, and the more ‘music’ a hydrophone picks up. However, the recordings are complex and difficult to interpret, and because there are so many recordings of different coral reefs in conditions ranging from badly damaged to healthy and restored, it’s extremely challenging for humans to recognize or detect patterns in them.
To tackle this monumental task, an artificial intelligence (AI) algorithm was created and ‘taught’ to recognize the difference between the sounds of a healthy reef and a damaged reef. The AI succeeds with a 92% accuracy rate – much better than humans. Using underwater sound recordings and AI is also far more cost-effective, accurate, and efficient than sending human divers to the reefs to estimate reef health visually: many reef animals hide during the day, so visual surveys tend to undercount them.
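To give a sense of how an AI can ‘learn’ the difference between reef soundscapes, here is a minimal sketch of one possible approach. It is an illustrative assumption, not the researchers’ actual method: it summarizes each hydrophone recording as a set of audio features (MFCCs) and trains an off-the-shelf classifier to label recordings as healthy or degraded. The file paths, labels, and model choice are all hypothetical.

```python
# Illustrative sketch only: classify reef recordings as healthy vs. degraded.
# Feature choice (MFCCs) and model (random forest) are assumptions,
# not the published pipeline from the reef study.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def extract_features(wav_path):
    # Load the hydrophone recording and summarize it as average MFCC values
    audio, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

def train_reef_classifier(recordings):
    # recordings: list of (path, label) pairs, where label 1 = healthy, 0 = degraded
    X = np.array([extract_features(path) for path, _ in recordings])
    y = np.array([label for _, label in recordings])
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)
    # Report accuracy on recordings the model has never heard before
    print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
    return model
```

The key idea is the same as in the study: the computer is shown many labelled recordings and learns which sound patterns go with healthy reefs, so it can then judge new recordings on its own.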
“What Are Coral Reefs And What’s Their Purpose?” (4:43)