“Exploring the Potential of AI to Conserve Coral Reefs: Google takes a closer look”

Google has developed a new AI tool to help marine biologists better understand coral reef ecosystems and their health, which can aid in conservation efforts. The project began by inviting the public to listen to reef sounds via the web. By crowdsourcing this activity, Google was able to create a library of new fish sounds that were used to fine-tune the AI tool, SurfPerch. Although bird songs and fish sounds are very different, the team found that the model was able to learn from common patterns between them. The project continues today, as new audio is added to the Calling in Our Corals website, which will help to further train the AI model, Google says.

Google has recently announced the development of a groundbreaking AI tool, SurfPerch, that can provide valuable insights into the health and ecosystems of coral reefs. This innovative tool aims to aid in conservation efforts by allowing marine biologists to “hear reef health from the inside.”

“This allows us to analyze new datasets with far more efficiency than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and conservation of these,” notes a Google blog post about the project.

The inspiration for this project came from Google’s “Calling in Our Corals” website, where the public was invited to listen to thousands of hours of reef audio and identify fish sounds. Through this crowdsourced effort, Google was able to create a comprehensive “bioacoustic” data set focused on reef health. This data set was then used to fine-tune SurfPerch’s capabilities, allowing it to quickly detect and analyze new reef sounds.

Co-authored by Steve Simpson, a professor of Marine Biology at the University of Bristol, and Ben Williams, a marine biologist at University College London, the Google blog post highlights the potential impact of this tool. Both researchers have dedicated their careers to studying coral ecosystems, with a focus on areas such as climate change and restoration.

The team also discovered that they could improve SurfPerch’s model performance by incorporating bird recordings into the training process. Despite the obvious differences between bird songs and fish sounds, the model was able to identify common patterns. This development has further expanded SurfPerch’s capabilities and opened up new possibilities for understanding reef communities.

After combining the “Calling in Our Corals” data with SurfPerch in initial trials, researchers were able to uncover differences between protected and unprotected reefs in the Philippines, track restoration outcomes in Indonesia, and gain a better understanding of the fish communities on the Great Barrier Reef.

The project is an ongoing effort, with new audio continuously being added to the “Calling in Our Corals” website. Google believes that this will further enhance and refine SurfPerch’s AI model, leading to even more valuable insights for marine biologists studying coral reef ecosystems.

Image Credits: Google

Kira Kim

Kira Kim is a science journalist with a background in biology and a passion for environmental issues. She is known for her clear and concise writing, as well as her ability to bring complex scientific concepts to life for a general audience.
