IAP-25-063
From Wandering Albatrosses to Hedgehogs: Using AI and Citizen Science to Improve Biodiversity Monitoring under Ground Truth Uncertainty
Technological advances have led ecology into its big data era. While this offers exciting opportunities for biodiversity monitoring, the sheer volume of data requiring manual processing poses a significant challenge. Popular solutions to this challenge, such as deep learning and citizen science, each have limitations, but their combination may offer an effective and reliable way forward, e.g. for counting wildlife in aerial survey images (Torney et al., 2019) and in camera-trap data (Green et al., 2020). Humans are highly data-efficient compared to deep learning approaches, and there remain fundamental differences between human and machine perception, e.g. in the types of errors made (Geirhos et al., 2021) and in sensitivity to data corruption (Geirhos et al., 2018).
In this project we will compare the performance and biases of classifications obtained through deep learning and citizen science approaches. In particular, we will focus on the setting where there is uncertainty in the labels assigned by volunteers, i.e. where different annotators have provided different labels for the same image (Bowler et al., 2020; Hockerts et al., 2025). The British Antarctic Survey (BAS) “Wildlife from Space” group has prepared a satellite image dataset consisting of citizen science annotations of wandering albatrosses (Diomedea exulans) across 11,839 image tiles from South Georgia in the South Atlantic Ocean. This dataset offers an ideal opportunity to study the integration of probabilistic AI and citizen science approaches when dealing with uncertainty in detection. Better quantifying and understanding this uncertainty is essential when interpreting satellite survey data for albatrosses and other species (e.g. penguins, seals, whales and walrus), in order to inform monitoring, conservation and management plans.
We will then investigate whether our findings generalize to other settings, specifically to camera-trap data provided through the National Hedgehog Monitoring Programme, which tracks the substantial decline of hedgehog (Erinaceus europaeus) numbers in the UK, and to the corresponding citizen science classifications obtained through the MammalWeb platform. In addition, depending on the student’s interests, fieldwork can be conducted to collect drone data, which can feed into the student’s own Zooniverse project and be used for further analysis.
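As a rough illustration of the label-uncertainty setting described above, the sketch below shows one simple way to turn disagreeing volunteer annotations into soft training targets for a classifier in PyTorch. The class count, stand-in model and tensor shapes are hypothetical placeholders for illustration only, not the project's actual data or pipeline.

```python
# Illustrative sketch only: aggregating disagreeing volunteer labels into soft
# targets and training against them. All names and shapes are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES = 2  # e.g. "albatross present" vs "absent"

def soft_labels_from_votes(votes: torch.Tensor, num_classes: int = NUM_CLASSES) -> torch.Tensor:
    """Turn one image's annotator votes (shape [n_annotators]) into a probability
    vector via normalised vote counts; disagreement yields a non-degenerate target."""
    counts = torch.bincount(votes, minlength=num_classes).float()
    return counts / counts.sum()

def soft_cross_entropy(logits: torch.Tensor, target_probs: torch.Tensor) -> torch.Tensor:
    """Cross-entropy against soft targets instead of a single hard label."""
    return -(target_probs * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()

# Example: three volunteers disagree on one image tile (two say present, one says absent).
votes = torch.tensor([1, 1, 0])
target = soft_labels_from_votes(votes)                    # tensor([0.3333, 0.6667])

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, NUM_CLASSES))  # stand-in model
image_batch = torch.randn(1, 3, 64, 64)                   # stand-in for a satellite image tile
loss = soft_cross_entropy(model(image_batch), target.unsqueeze(0))
loss.backward()
```

In practice the project may use more sophisticated annotation models than simple vote fractions; this is only meant to show how label disagreement can be carried into training rather than collapsed to a single hard label.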
Image Captions
A hedgehog (Erinaceus europaeus) detected by a camera trap and highlighted within a green box (© National Hedgehog Monitoring Programme).
Wandering albatross (Diomedea exulans), South Georgia, Bird Island (© BAS).
Methodology
The research team has expertise in AI for biodiversity monitoring; the use of citizen science to inform biodiversity conservation; Bayesian deep learning; transfer learning and the robustness of machine learning methods under distribution shifts; sustainable AI; collecting, processing and analysing different types of data, including satellite and camera-trap data; computer vision for detecting and segmenting objects in imagery; and multi-modal representation learning that combines information from different sensors and spatial scales.
Research visits: 1x per year to visit co-supervisors at Durham and BAS.
Additional Placement Year 2: One month at the British Antarctic Survey (BAS) to work with co-supervisor Dr Ellen Bowler, the rest of the AI lab at BAS, and the Wildlife from Space research group.
Collaborator: Dr Peter Stewart (currently a Postdoc, University of Glasgow, School of Mathematics & Statistics, https://peter-stewart.github.io). Ecologist and conservationist with expertise in camera trap and citizen science approaches. He created the Prickly Pear Project Kenya on Zooniverse (https://www.zooniverse.org/projects/peter-dot-stewart/prickly-pear-project-kenya), which offers another project-relevant dataset for the student to engage with (Hockerts et al., 2025).
Collaborator: Dr Paul Eizenhöfer (Senior Lecturer, University of Glasgow, School of Geographical & Earth Sciences, https://www.gla.ac.uk/schools/ges/staff/pauleizenhofer/ ). Expertise in UAV remote sensing techniques.
Fieldwork: The student will have the opportunity to collect multispectral and photogrammetric drone data under the guidance of Dr Eizenhöfer, which can subsequently be used for analysis with support from the supervisors and Dr Eizenhöfer.
Project Timeline
Year 1
– Comprehensive literature review.
– Complete relevant training and courses (see training & skills section below).
– Develop a working understanding of the wandering albatross satellite dataset.
– Compare the performance of citizen scientists and AI models, focusing in particular on AI model performance on images that humans find difficult to classify (see the illustrative sketch after this list).
– Submit a workshop paper based on initial results, e.g. to the Climate Change AI NeurIPS workshop or the Computer Vision for Ecology workshop.
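As a rough illustration of the Year 1 comparison, the sketch below stratifies model accuracy by how strongly volunteers disagreed on each image, which is one simple way to examine AI performance on images that are hard for humans. The arrays are hypothetical placeholders rather than real project data.

```python
# Illustrative sketch only: stratify model accuracy by volunteer disagreement.
# All values below are made-up placeholders.
import numpy as np

# Per-image fraction of volunteers voting "present", and the model's predictions.
vote_fraction = np.array([1.0, 0.9, 0.6, 0.5, 0.4, 0.1, 0.0])
consensus_label = (vote_fraction >= 0.5).astype(int)   # majority vote as reference label
model_pred = np.array([1, 1, 0, 1, 0, 0, 0])

# Disagreement is highest when the vote is split 50/50, zero when unanimous.
disagreement = 1.0 - np.abs(vote_fraction - 0.5) * 2.0

for name, mask in [("easy (low disagreement)", disagreement < 0.5),
                   ("hard (high disagreement)", disagreement >= 0.5)]:
    acc = (model_pred[mask] == consensus_label[mask]).mean()
    print(f"{name}: n={mask.sum()}, accuracy vs. majority vote = {acc:.2f}")
```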
Year 2
– Study how AI model performance and biases are affected by model size and various optimization choices, and what this implies for developing more sustainable AI.
– Study individual human annotators’ performance and biases, and how these evolve over time.
– Explore and enhance probabilistic AI approaches for dealing with uncertainty (see the illustrative sketch after this list).
– 1 month research visit to BAS.
– Submit a paper to a suitable venue (e.g. CVPR or Remote Sensing in Ecology and Conservation).
– Fieldwork to collect drone data.
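As one illustrative option for the probabilistic-AI strand of Year 2, the sketch below uses Monte Carlo dropout to obtain a simple per-image uncertainty score that could flag tiles for human review. The model architecture, tile size and threshold are hypothetical placeholders, and MC dropout is only one of several approaches the project could explore.

```python
# Illustrative sketch only: Monte Carlo dropout as a simple predictive-uncertainty
# estimate. Model, shapes and threshold are hypothetical.
import torch
import torch.nn as nn

class SmallClassifier(nn.Module):
    def __init__(self, num_classes: int = 2, p_drop: float = 0.5):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Dropout(p=p_drop),                     # kept active at test time for MC dropout
            nn.Linear(16, num_classes),
        )

    def forward(self, x):
        return self.backbone(x)

@torch.no_grad()
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20):
    """Run several stochastic forward passes with dropout enabled; return the mean
    class probabilities and their standard deviation as an uncertainty score."""
    model.train()  # keeps dropout stochastic; in practice only dropout layers need train mode
    probs = torch.stack([model(x).softmax(dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0), probs.std(dim=0)

model = SmallClassifier()
tiles = torch.randn(4, 3, 64, 64)                     # stand-in batch of image tiles
mean_probs, uncertainty = mc_dropout_predict(model, tiles)
flag_for_review = uncertainty.max(dim=-1).values > 0.2  # hypothetical review threshold
```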
Year 3
– Study whether the findings generalize to different settings and types of data, specifically camera-trap data from the National Hedgehog Monitoring Programme, and begin analysis of the drone data.
– Possible extension to data from Prickly Pear Project Kenya.
– Reflect on limitations of metrics used to evaluate models.
– Prepare next paper for publication.
– Research dissemination.
– Start thesis write-up.
Year 3.5
– Finalise papers in progress and complete any remaining analysis.
– Continued research dissemination.
– Thesis write-up and submission.
Training & Skills
The student will receive training and develop skills on computer vision for ecology, probabilistic AI, coding (Python; PyTorch), GitHub, and working with the different types of data considered in this project.
– The student will receive peer support by being part of a vibrant PGR community at the University of Glasgow, including students from the Leverhulme DTP on Ecological Data Science, who are based in the same School, run a regular data science reading group, and whose management committee includes the first supervisor. Further, there will be group meetings with the supervisor’s other students working on related topics.
– The student will receive support in developing communication and public engagement skills from the supervisor team and University of Glasgow workshops. The student will be encouraged to present in group meetings (including those from collaborating groups), the PGR seminar, the annual BAS student symposium, and later on at conferences.
– The Glasgow University Graduate School has a Doctoral Training Programme which supports students in developing transferable skills, including courses on research integrity and data management. There is also an annual progress review process, where the panel consists of two non-supervisor academic staff members.
– The student will be encouraged to attend seminars, conferences, and summer schools where relevant to their project, e.g. the Probabilistic AI for Environmental Science conference and the Student Conference on Conservation Science in Cambridge (https://www.sccs-cam.org).
– The student will have the opportunity to apply for the NERC BAS Advanced Training Short Course in Svalbard (https://www.bas.ac.uk/science/science-and-students/nerc-doctoral-training-opportunities/bas-advanced-training-short-course/).
References & further reading
Bowler, Fretwell, French, Mackiewicz. Using deep learning to count albatrosses from space: Assessing results in light of ground truth uncertainty. Remote Sensing, 12(12):2026, 2020.
Hockerts, Stewart, Vlaar. Camera-Trap Classification With Deep Learning Under Ground Truth Uncertainty. Tackling Climate Change with ML Workshop NeurIPS, 2025.
Torney et al. A comparison of deep learning and citizen science techniques for counting wildlife in aerial survey images. Methods in Ecology and Evolution, 10:779-787, 2019.
Green, Rees, Stephens, Hill, Giordano. Innovations in Camera Trapping Technology and Approaches: The Integration of Citizen Science and Artificial Intelligence. Animals, 10(1):132, 2020.
Geirhos et al. Partial success in closing the gap between human and machine vision. NeurIPS, 2021.
Geirhos et al. Generalisation in humans and deep neural networks. NeurIPS, 2018.
The National Hedgehog Monitoring Programme: https://ptes.org/campaigns/hedgehogs/nhmp/ and statement of intent: https://www.the-ies.org/analysis/lens-wild-innovations-wildlife-0
BAS News story, https://www.bas.ac.uk/media-post/albatrosses-from-space-wildlife-detectives-needed/
