Author
Listed:
- Ben Williams
- Santiago M Balvanera
- Sarab S Sethi
- Timothy AC Lamont
- Jamaluddin Jompa
- Mochyudho Prasetya
- Laura Richardson
- Lucille Chapuis
- Emma Weschke
- Andrew Hoey
- Ricardo Beldade
- Suzanne C Mills
- Anne Haguenauer
- Frederic Zuberer
- Stephen D Simpson
- David Curnick
- Kate E Jones
Abstract
Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole-soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to the detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole-soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show that supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data: it requires orders of magnitude fewer computational resources whilst achieving near-comparable performance to the T-CNN, with significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat.
Author summary: Artificial intelligence has the potential to revolutionise bioacoustic monitoring of coral reefs. So far, only a limited body of work has used machine learning to train detectors for specific sounds such as individual fish species. However, building detectors is a time-consuming process that involves manually annotating large amounts of audio, followed by complicated model training. This process must then be repeated for any new dataset. Instead, we explore machine learning techniques for whole-soundscape analysis, comparing the acoustic properties of raw recordings from the entire habitat. We identify multiple machine learning methods for whole-soundscape analysis and rigorously test these using datasets from Indonesia, Australia and French Polynesia. Our findings reveal that a neural network pretrained on 5.2 million hours of unrelated YouTube audio offers a powerful tool for producing compressed representations of reef audio data, conserving the data's key properties whilst being executable on a standard personal laptop. These representations can then be used to explore patterns in reef soundscapes using unsupervised machine learning, which is effective at grouping similar recordings together. We show these groupings hold relationships with ground-truth ecological data, including coral cover, the fish community and depth.
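To make the pipeline described in the author summary concrete, the sketch below shows one way to turn reef recordings into whole-soundscape embeddings with a pretrained audio CNN and then group them with unsupervised clustering. It is illustrative only and not the authors' code: the VGGish model from TensorFlow Hub is assumed as a stand-in for the paper's P-CNN, k-means is used purely as an example of unsupervised clustering, and the file names are placeholders.

```python
# Illustrative sketch only -- not the authors' pipeline. Assumptions:
# the pretrained audio CNN is Google's VGGish (via TensorFlow Hub) and
# k-means stands in for the clustering method; file names are hypothetical.
import numpy as np
import librosa                      # audio loading / resampling
import tensorflow_hub as hub        # pretrained VGGish model
from sklearn.cluster import KMeans  # example unsupervised clustering

# Pretrained CNN producing a 128-d embedding per ~0.96 s frame of audio.
vggish = hub.load("https://tfhub.dev/google/vggish/1")

def embed_recording(path):
    """Average the frame-level embeddings into one vector per recording."""
    waveform, _ = librosa.load(path, sr=16000, mono=True)  # VGGish expects 16 kHz mono
    frames = vggish(waveform).numpy()                       # shape: (n_frames, 128)
    return frames.mean(axis=0)

# Placeholder paths to reef soundscape recordings.
paths = ["reef_site_A_001.wav", "reef_site_B_001.wav", "reef_site_C_001.wav"]
X = np.stack([embed_recording(p) for p in paths])

# Unsupervised grouping of whole-soundscape embeddings; the resulting clusters
# can then be compared against ecological classes such as coral cover or depth.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)
```

Averaging frame-level embeddings into a single vector per recording is one simple pooling choice; the key point the paper makes is that such pretrained embeddings can be computed cheaply (on a standard laptop) and still retain enough information for ecological groupings to emerge from unsupervised analysis.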
Suggested Citation
Ben Williams & Santiago M Balvanera & Sarab S Sethi & Timothy AC Lamont & Jamaluddin Jompa & Mochyudho Prasetya & Laura Richardson & Lucille Chapuis & Emma Weschke & Andrew Hoey & Ricardo Beldade & Su, 2025.
"Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out,"
PLOS Computational Biology, Public Library of Science, vol. 21(4), pages 1-18, April.
Handle:
RePEc:plo:pcbi00:1013029
DOI: 10.1371/journal.pcbi.1013029
Corrections
All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pcbi00:1013029. See general information about how to correct material in RePEc.
If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.
We have no bibliographic references for this item. You can help by adding them using this form.
If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.
For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ploscompbiol (email available below). General contact details of provider: https://journals.plos.org/ploscompbiol/ .
Please note that corrections may take a couple of weeks to filter through the various RePEc services.