IAP-25-064

Sounding out the deep: using AI and passive acoustic monitoring to reveal spatial and temporal patterns in deep-sea ecosystems

Deep-sea habitats such as cold-water coral reefs, coral gardens, and sponge grounds are structurally complex ecosystems that serve as hotspots of biodiversity and nutrient cycling. They provide essential habitat for a wide range of sessile and motile species, including commercially valuable fish and invertebrates [1]. Because they are slow-growing and long-lived, they are particularly vulnerable to human activities and climate change. Climate change models predict major decreases in suitable habitat for deep-sea species: a 79% decrease by 2100 is projected, for example, for the dominant reef-forming coral in the Atlantic Ocean, Desmophyllum pertusum (also known as Lophelia pertusa) [2].

Understanding how these threats affect the deep sea is critical to developing management measures, yet it remains a challenging research topic. Because collecting data in these remote and wild places is difficult, expensive and sometimes extractive, only sporadic data have often been collected, over short timescales (seconds to hours). Underwater passive acoustic monitoring (PAM) offers a transformative, novel, cost-effective and non-lethal method [3] to monitor deep-sea habitats at high temporal resolution and over long periods (months to years) [4].

PAM devices deployed on benthic moorings provide a unique opportunity to “listen in” on deep-sea ecosystems over extended periods of time. These long-term acoustic datasets are transforming how we monitor marine biodiversity, offering new insights into how communities differ among habitats and how they respond to natural variability, human disturbance, and climate change [e.g. 4,5,6]. Yet the sheer volume of recordings makes manual analysis for species presence impractical. Artificial intelligence now offers powerful ways to automate this process, enabling researchers to detect, classify, and interpret biological sounds more efficiently [e.g. 7,8]. Despite rapid progress in shallow-water environments, applications in deep-sea benthic habitats remain largely unexplored [3], representing an opportunity for innovation in both marine ecology and computational acoustics.

Project Aims:
1. Characterise deep-sea soundscapes across distinct habitats.
Investigate and compare the acoustic environments of three key deep-sea habitats – cold-water coral reefs, coral gardens, and sponge grounds. The student will build a comprehensive sound library of these habitats and describe their unique acoustic signatures using a suite of spectral and soundscape metrics.

2. Develop and apply an AI-driven acoustic analysis pipeline.
Design and optimise deep learning models for automated detection and classification of biological sounds. The student will refine cluster- and object-based detection approaches to efficiently process large acoustic datasets and reveal patterns in deep-sea biological activity.

3. Explore spatial and temporal dynamics in deep-sea soundscapes.
Use advanced univariate and multivariate statistical analyses to identify how soundscapes vary across habitats and over time, and determine the key environmental and biological drivers of these patterns. Available biodiversity data [9], or data newly extracted from time-lapse imagery, will also be used to understand the relationships between the acoustic communities and those observed in the images.

The student will benefit from Dr De Clippele’s expertise in deep-sea habitat biodiversity, PAM and time-lapse image analysis, habitat use, and machine learning; Dr Oedekoven’s strengths in PAM and spatial and temporal ecology; Dr Vlaar’s specialisation in the mathematics of deep learning and climate change AI; and Dr Smith’s expertise in image analysis and interactive machine learning.

The PhD student will have the opportunity to join a research expedition to Iceland and to be part of international projects (Use of passive acoustics to quantify fish biodiversity and habitat use, DFO Canada; and ALONGate, Senckenberg Institute) that focus on understanding, at high temporal resolution, the importance of deep-sea habitats for biodiversity and commercially important species. This project will train the student to become a future leader at the intersection of AI and marine science, equipping them with both field experience and computational skills.


Image Captions

Icelandic cold-water coral reef at 600 m depth. Credit: Prof Saskia Brix, IceAGE3 expedition 2020.

Methodology

1. Characterise deep-sea soundscapes across distinct habitats.
PAM data from cold-water coral reefs in Scotland, Norway and Iceland, the Gully sea pen coral gardens (Canada) and the Sambro Bank sponge grounds (Canada) will be used. The student will have the opportunity to join a month-long research expedition collecting the Icelandic cold-water coral reef data in the summer of 2027. Audio recordings will be downsampled and clipped into defined temporal units for analysis. The soundscapes will be characterised in terms of their sound pressure levels, power spectral density and acoustic complexity indices, using software such as PAMGuide, R and/or Matlab. Spectrograms will be visually and computationally inspected in Raven Pro to identify key biotic (fish calls, crustacean rasps) and anthropogenic (ship noise, sonar) components of the soundscape. A deep-sea habitat sound library will be created, along with a dichotomous key as a resource for future identifications. Together, these datasets and resources will establish a foundational reference for interpreting and comparing deep-sea soundscapes across diverse habitats.
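
To make the processing chain concrete, the sketch below is a minimal Python example of the downsample, clip and characterise steps, using only NumPy and SciPy. The file name, target sample rate, clip length and calibration offset are illustrative assumptions rather than project specifications; a real analysis would apply the hydrophone's calibrated sensitivity (as tools such as PAMGuide do) to obtain absolute sound pressure levels.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import decimate, welch

def soundscape_metrics(wav_path, target_fs=4000, clip_s=60, cal_db=0.0):
    """Broadband level and power spectral density for fixed-length clips.

    cal_db is a placeholder end-to-end calibration offset; left at 0.0,
    the returned levels are relative rather than absolute SPL.
    """
    fs, x = wavfile.read(wav_path)
    x = x.astype(np.float64)
    # Anti-aliased downsampling to the analysis band of interest.
    q = fs // target_fs  # integer factor; assumes fs divides evenly
    if q > 1:
        x = decimate(x, q)  # for q > 13, stage this over multiple calls
        fs //= q
    n = int(clip_s * fs)
    results = []
    # Clip the recording into non-overlapping temporal units for analysis.
    for i in range(0, len(x) - n + 1, n):
        clip = x[i:i + n]
        rms = np.sqrt(np.mean(clip ** 2))            # broadband RMS level
        level_db = 20 * np.log10(rms + 1e-12) + cal_db
        f, psd = welch(clip, fs=fs, nperseg=fs)      # ~1 Hz resolution PSD
        results.append({"level_db": level_db, "freq_hz": f,
                        "psd_db": 10 * np.log10(psd + 1e-12) + cal_db})
    return results
```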

2. Develop and apply an AI-driven acoustic analysis pipeline.
An automated pipeline will be developed to detect and classify biological sounds in deep-sea acoustic recordings. The student will evaluate and compare multiple AI-based approaches, including cluster-based unsupervised learning methods and single- and multi-object detection algorithms based on existing deep learning architectures. The pipeline will be trained and validated using pre-annotated acoustic datasets containing known biological sounds (e.g., fish calls, crustacean rasps). Existing models (e.g. [7,8]) will be fine-tuned and adapted to existing and newly collected deep-sea habitat data through transfer learning. Model performance will be assessed using metrics such as precision, recall and F1-score. The most effective model(s) will be integrated into an automated processing workflow capable of handling large PAM datasets. Together, these methodological improvements will provide a robust framework for scalable, accurate, and efficient detection of biological sounds in deep-sea environments.
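
As an illustration of the transfer-learning and evaluation steps described above, the following sketch fine-tunes an ImageNet-pretrained ResNet-18 on spectrogram clips and reports per-class precision, recall and F1. It assumes PyTorch/torchvision and scikit-learn are available; the three-class label set, frozen-layer choice and learning rate are placeholders, not the project's chosen configuration.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights
from sklearn.metrics import precision_recall_fscore_support

NUM_CLASSES = 3  # placeholder: e.g. fish call, crustacean rasp, background

# Load an ImageNet-pretrained backbone and replace the classification head.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Freeze early layers and fine-tune only the last block and the new head,
# a common transfer-learning recipe when annotated clips are scarce.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith(("layer4", "fc"))

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def train_epoch(loader):
    """One pass over (spectrogram, label) batches; spectrograms here are
    3-channel images, e.g. replicated log-mel spectrograms, (B, 3, H, W)."""
    model.train()
    for spec, label in loader:
        optimizer.zero_grad()
        loss = criterion(model(spec), label)
        loss.backward()
        optimizer.step()

@torch.no_grad()
def evaluate(loader):
    """Per-class precision, recall and F1 on a held-out validation set."""
    model.eval()
    preds, labels = [], []
    for spec, label in loader:
        preds.append(model(spec).argmax(dim=1))
        labels.append(label)
    return precision_recall_fscore_support(
        torch.cat(labels).numpy(), torch.cat(preds).numpy(), average=None)
```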

3. Explore spatial and temporal dynamics in deep-sea soundscapes.
Manual annotations (Objective 1) and automated detections (Objective 2) will be used to quantify spatial and temporal patterns in biological sound production across deep-sea habitats. Acoustic metrics (e.g., sound pressure levels) and biological sound occurrence rates will be analysed across diel, seasonal, and habitat scales. To identify environmental and ecological drivers of soundscape variation, both univariate (e.g., generalised linear and random forest models) and multivariate (e.g., Redundancy Analysis) statistical approaches will be applied. Environmental variables such as temperature, current speed, depth, and food supply will be included in these analyses. Existing [9] and/or newly annotated biodiversity data from time-lapse images may also be incorporated to explain soundscape patterns, using tools such as BIIGLE and interactive machine-learning tools like RootPainter to automate species annotation [10]. These integrated analyses will provide a comprehensive understanding of how biotic and abiotic factors shape deep-sea acoustic environments through space and time.
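
The univariate side of this analysis could look like the hedged Python sketch below, which fits a Poisson GLM and a random forest to a hypothetical clip-level table; the file name and column names are placeholders. Multivariate ordinations such as Redundancy Analysis would more typically be run in R (e.g. with the vegan package).

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Hypothetical clip-level table: one row per temporal unit, with an acoustic
# response (here, fish calls counted per clip) and candidate drivers.
df = pd.read_csv("clip_metrics.csv")  # placeholder file name
drivers = ["temperature", "current_speed", "depth", "food_supply"]
X, y = df[drivers], df["call_count"]

# Route 1: Poisson GLM, appropriate for count responses such as call counts.
glm = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson()).fit()
print(glm.summary())

# Route 2: random forest with permutation importance, which ranks drivers
# without assuming linear relationships.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
imp = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)
for name, score in sorted(zip(drivers, imp.importances_mean),
                          key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```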

Project Timeline

Year 1

– Conduct a literature review to establish which species may be associated with the deep-sea habitats, and explore what proportion of these species make, or may make, sound using the Global Library of Underwater Biological Sounds (GLUBS) database.
– Characterise the deep-sea soundscapes across the different habitats.
– Manually annotate for the presence of different types of biological and anthropogenic sounds.
– Receive training in acoustic processing and annotation.
– Attend and present at the MASTS Annual Science Meeting.
– Write up publication.

Year 2

– Develop cluster-based unsupervised deep learning models.
– Develop single-and/or multi-object deep learning models.
– Receive training in machine learning methods.
– Write up publication.
– Complete a sea survival course.
– Join a month-long research expedition to the Icelandic cold-water coral reef.
– Attend and present at the MASTS Annual Science Meeting.

Year 3

– Annotate additional time-lapse image data, which can be used as a variable to explain patterns in the soundscape.
– Conduct uni- and multivariate analyses to understand what drives differences in the soundscapes and the acoustic communities associated with the different deep-sea habitats.
– Receive training in using BIIGLE, RootPainter and advanced statistical methods.
– Attend and present at the MASTS Annual Science Meeting.
– Attend and present at the International Symposium on Deep-Sea Corals.

Year 3.5

– Write up PhD and publications

Training & Skills

• Gain taxonomic expertise in species associated with cold-water coral reefs and gardens, through receiving training from Dr De Clippele.
• Become proficient in processing and annotating PAM and image data, through receiving training from Drs De Clippele, Oedekoven and Smith.
• Become proficient in using a variety of manual and cutting-edge machine learning tools to analyse large image datasets, through receiving training from Drs Vlaar and Smith and accessing University of Glasgow courses such as “Python programming” and “Data Mining and Machine Learning – Supervised and Unsupervised Learning”.
• Develop expertise in using a variety of temporal and spatial statistical approaches to analyse complex long-term species and community datasets, through receiving training from Drs De Clippele and Oedekoven and through access to a range of University of Glasgow and PGR courses, such as “Data Analysis and Visualisation in R”.
• Gain experience in sea-going expeditions and in deploying a variety of monitoring tools, through participating in a sea survival course and a research expedition to Iceland with Dr De Clippele.
• Develop strong organisational and data management skills, through working with and collecting new acoustic data sets, with support of all supervisors.
• Develop skills to communicate across disciplines and to integrate interdisciplinary datasets, through working with international partners and with support from all supervisors.
• Learn to articulate findings clearly and effectively, through preparing the thesis and scientific manuscripts for publication in peer-reviewed journals, PGR training on topics such as “Creating a Thesis”, and support from all supervisors.
• Gain experience in public speaking and scientific communication, through presenting at academic conferences and public engagement events, engaging with PGR courses on topics such as “Polishing Presentation Skills” and “Communicating Visually with Data”, and through support from all supervisors.
• Build project management skills, including planning timelines and troubleshooting technical challenges, through leading a project involving large datasets from multiple locations, being involved in expedition planning, and with support from all supervisors.

References & further reading

[1] Kutti, T., Bergstad, O.A., Fosså, J.H. and Helle, K., 2014. Cold-water coral mounds and sponge-beds as habitats for demersal fish on the Norwegian shelf. Deep Sea Research Part II: Topical Studies in Oceanography, 99, pp.122-133.

[2] Morato, T., González-Irusta, J.M., Dominguez-Carrió, C., Wei, C.L., Davies, A., Sweetman, A.K., Taranto, G.H., Beazley, L., García-Alegre, A., Grehan, A. and Laffargue, P., 2020. Climate-induced changes in the suitable habitat of cold-water corals and commercially important deep-sea fishes in the North Atlantic. Global Change Biology, 26(4), pp.2181-2202.

[3] Havlik, M.N., Predragovic, M. and Duarte, C.M., 2022. State of play in marine soundscape assessments. Frontiers in Marine Science, 9, p.919418.

[4] De Clippele, L.H. and Risch, D., 2021. Measuring sound at a cold-water coral reef to assess the impact of COVID-19 on noise pollution. Frontiers in Marine Science, 8, p.674702.

[5] Lamont, T.A., Williams, B., Chapuis, L., Prasetya, M.E., Seraphim, M.J., Harding, H.R., May, E.B., Janetski, N., Jompa, J., Smith, D.J. and Radford, A.N., 2022. The sound of recovery: Coral reef restoration success is detectable in the soundscape. Journal of Applied Ecology, 59(3), pp.742-756.

[6] Bolgan, M., Di Iorio, L., Dailianis, T., Catalan, I.A., Lejeune, P., Picciulin, M. and Parmentier, E., 2022. Fish acoustic community structure in Neptune seagrass meadows across the Mediterranean basin. Aquatic Conservation: Marine and Freshwater Ecosystems, 32(2), pp.329-347.

[7] Mouy, X., Archer, S.K., Dosso, S., Dudas, S., English, P., Foord, C., Halliday, W., Juanes, F., Lancaster, D., Van Parijs, S. and Haggarty, D., 2024. Automatic detection of unidentified fish sounds: A comparison of traditional machine learning with deep learning. Frontiers in Remote Sensing, 5, p.1439995.

[8] Best, P., Paris, S., Glotin, H. and Marxer, R., 2023. Deep audio embeddings for vocalisation clustering. PLoS ONE, 18(7), p.e0283396.

[9] De Clippele, L.H., Nozères, C., Xu, J., MacDonald, B., Lirette, C., Phelan, K., Staniforth, C., Whoriskey, F., Wolff, G.A., Blackbird, S. and Mohn, C., 2025. Fish use of deep-sea sponge habitats evidenced by long-term high-resolution monitoring. Scientific Reports, 15(1), p.17656.

[10] Clark, H.P., Smith, A.G., McKay Fletcher, D., Larsson, A.I., Jaspars, M. and De Clippele, L.H., 2024. New interactive machine learning tool for marine image analysis. Royal Society Open Science, 11(5), p.231678.
