Mapping Coral Reef Health with Sound

Mark Altaweel


Coral reefs are essential to marine ecosystems. Found in shallow, tropical waters around the globe, they provide abundant food and shelter for thousands of marine species.

Threats facing coral reefs

Coral reefs face significant threats from climate change and ocean acidification, in addition to ongoing human activities. Traditionally, monitoring the health of coral reefs required travel or remote sensing techniques. Now, scientists can utilize a new artificial intelligence (AI) tool that analyzes ocean sound recordings to assess reef health.

Innovative AI tool for coral reef monitoring

Introduction to SurfPerch

Google Research and DeepMind, in partnership with scientists at University College London and the University of Bristol, have created a new tool called SurfPerch.[1]

Coral reef fish assemblage, Kealakekua Bay Marine Life Conservation District, Hawai’i Island. Photo: Tim Grabowski, USGS, public domain.

How SurfPerch works – Passive acoustic monitoring

The basic idea is that ocean sounds can indicate how healthy a coral reef colony is. Passive acoustic monitoring (PAM) involves recording the sounds made by marine species living around a reef; these recordings show whether a coral reef is actively used by fish and other species.
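To illustrate the kind of signal PAM works with, the sketch below computes a toy acoustic index from a one-second clip using NumPy. The band boundaries and the index itself are hypothetical, chosen only to illustrate the idea of summarizing reef soundscapes numerically; this is not SurfPerch's actual feature pipeline.

```python
import numpy as np

def band_energy_ratio(waveform, sample_rate, low_band=(50, 1200), high_band=(1200, 8000)):
    """Toy acoustic index: ratio of low-frequency energy (e.g., fish calls)
    to high-frequency energy (e.g., snapping shrimp). Illustrative only."""
    spectrum = np.abs(np.fft.rfft(waveform)) ** 2
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate)
    low = spectrum[(freqs >= low_band[0]) & (freqs < low_band[1])].sum()
    high = spectrum[(freqs >= high_band[0]) & (freqs < high_band[1])].sum()
    return low / (high + 1e-12)

# Synthetic one-second clip: a 300 Hz "fish call" tone plus broadband noise
sr = 16_000
t = np.arange(sr) / sr
rng = np.random.default_rng(0)
clip = np.sin(2 * np.pi * 300 * t) + 0.1 * rng.standard_normal(sr)
print(f"band energy ratio: {band_energy_ratio(clip, sr):.1f}")
```

In this synthetic example the low band dominates because the tone sits at 300 Hz; on real recordings, indices like this are far noisier, which is part of why learned classifiers such as SurfPerch outperform hand-built features.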


Training the AI model with ReefSet

Using thousands of hours of crowdsourced sound recordings, the new tool has been trained to detect specific sounds that indicate the health status of coral reef areas. What makes this powerful is that sound carries over a wide area, so the overall health of a reef can be determined more completely than with other observation methods, which are often limited to very small areas.

The sound dataset, called ReefSet, consists of annotated recordings that train the classifier to recognize what a healthy reef sounds like. Interestingly, because some bird sound patterns resemble fish and other marine sounds as indicators of an area's general ecological health, bird recordings were also used to supplement the training data for classifying overall reef health.

Testing with and without the supplementary recordings showed that this cross-domain mixing approach appears to strengthen the overall classification of coral health. This suggests that combining sounds from very different terrestrial and marine environments could potentially be used to monitor the health of varied ecosystems.[2]
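To make the cross-domain mixing idea concrete, here is a minimal sketch in NumPy. The "embeddings" below are randomly generated stand-ins for the fixed-size vectors a pretrained audio backbone would produce, not real SurfPerch features: reef and bird examples are pooled into one training set as positive examples of biologically rich soundscapes, and a simple logistic-regression head is trained on the mix.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical pre-computed audio embeddings (8-dimensional stand-ins).
# Labels: 1 = biologically rich sound, 0 = background noise.
reef_X = rng.normal(loc=1.0, size=(200, 8))    # marine positives
bird_X = rng.normal(loc=0.8, size=(200, 8))    # cross-domain positives
noise_X = rng.normal(loc=-1.0, size=(400, 8))  # negatives

# Cross-domain mixing: pool reef and bird examples into one training set
X = np.vstack([reef_X, bird_X, noise_X])
y = np.concatenate([np.ones(400), np.zeros(400)])

# Minimal logistic-regression head trained by gradient descent
w, b = np.zeros(8), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

# Check how well the mixed-data head recognizes the reef examples
reef_p = 1.0 / (1.0 + np.exp(-(reef_X @ w + b)))
acc = (reef_p > 0.5).mean()
print(f"accuracy on reef examples: {acc:.2f}")
```

The point of the sketch is the pooling step: the bird examples enlarge the positive class without any reef-specific labeling effort, which mirrors (in a highly simplified way) how the researchers leveraged bird audio to improve transfer learning.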

Warming ocean waters stress corals and cause coral bleaching. Colonies of “blade fire coral” that have lost their symbiotic algae, or “bleached,” on a reef off of Islamorada, Florida. Photo: Kelsey Roberts, USGS. Public domain.

Putting such a project together was not easy, particularly when it came to gathering the training data required for the deep learning classification model. The researchers created a website that encouraged members of the public to upload sound recordings from coral reefs, which yielded a large number of recordings.[3] Key datasets used for training include those from the Virgin Islands, Mozambique, Florida, Tanzania, and Kenya.[4]

Applications and future potential of SurfPerch

Practical uses of SurfPerch

SurfPerch is free for the public to access and can also be improved with new sound recordings. The tool could also be fine-tuned to monitor specific aspects that scientists may determine as important for coral reef health, such as specific marine species that may swim near reefs.

Success stories

So far, the tool has been able to distinguish between protected and unprotected reefs in areas such as the Philippines, determine how effective reef restoration efforts have been in Indonesia, and better characterize fish communities on the Great Barrier Reef in Australia.

A map of the sea surface temperature (SST) anomalies off the northeast coast of Australia on March 14, 2022 – many areas were more than 2°C (3.6°F) warmer than normal. The higher temperatures led to a mass bleaching event in the Great Barrier Reef. Map: NASA, public domain.

This tool could be used to study other major coral reefs, and additional data collection is planned to further improve the model's classification accuracy. What is vital about this effort is that larger areas of a coral reef can now be monitored more effectively than was possible before.

Earlier approaches focused on remote sensing data, whereas PAM data are far more effective at capturing overall coral reef health. Machine learning classification was also used in those earlier efforts to monitor and assess how healthy a reef was.[5]

The Importance of Sound-Based Monitoring

Advantages of Passive Acoustic Monitoring

Creating passive tools, or tools that depend on sound recordings rather than sight-based monitoring, might provide the best long-term way to monitor our increasingly endangered corals. Scientists can more easily cover larger areas and distinguish the sounds that indicate whether a given reef is attracting the types of marine species that mark it as healthy.

Broader Implications for Marine Conservation

Potentially, other marine habitats could also be monitored this way. For now, gaining a better understanding of how our coral reefs are changing, and whether conservation efforts are working, is critical to our efforts to save and protect them.


[1]    The SurfPerch deep learning neural network used to help classify coral reefs and their health is available on Kaggle: Google | SurfPerch. (n.d.).

[2]    A popular article about SurfPerch can be found here: Perez, S. (2024, June 6). Google looks to AI to help save the coral reefs. Yahoo Movies Canada.

[3]    Google hosts a website where data recordings can be uploaded to help train the deep learning classification SurfPerch model.

[4]    The academic discussion of how SurfPerch works and steps for training can be found here: Williams, Ben, Bart van Merriënboer, Vincent Dumoulin, Jenny Hamer, Eleni Triantafillou, Abram B. Fleishman, Matthew McKown, et al. 2024. “Leveraging Tropical Reef, Bird and Unrelated Sounds for Superior Transfer Learning in Marine Bioacoustics.” arXiv. DOI: 10.48550/ARXIV.2404.16436.

[5]    For more on an approach using remote sensing data to monitor coral reefs, see: Da Silveira, Camila Brasil Louro, Gil Marcelo Reuss Strenzel, Mauro Maida, Ana Lídia Bertoldi Gaspar, and Beatrice Padovani Ferreira. 2021. “Coral Reef Mapping with Remote Sensing and Machine Learning: A Nurture and Nature Analysis in Marine Protected Areas.” Remote Sensing 13 (15): 2907. DOI: 10.3390/rs13152907.

About the author
Mark Altaweel
Mark Altaweel is a Reader in Near Eastern Archaeology at the Institute of Archaeology, University College London, having held previous appointments and joint appointments at the University of Chicago, University of Alaska, and Argonne National Laboratory. Mark has an undergraduate degree in Anthropology and Masters and PhD degrees from the University of Chicago’s Department of Near Eastern Languages and Civilizations.