Putting eagle rays on the map by coupling aerial video-surveys and deep learning
Abstract
Reliable and efficient techniques are urgently needed to monitor elasmobranch populations that face increasing threats worldwide. Aerial video-surveys provide precise and verifiable observations for the rapid assessment of species distribution and abundance in coral reefs, but the manual processing of videos is a major bottleneck for timely conservation applications. In this study, we applied deep learning for the automated detection and mapping of vulnerable eagle rays from aerial videos. A light aircraft dedicated to tourist flights allowed us to collect 42 h of aerial video footage over a shallow coral lagoon in New Caledonia (Southwest Pacific). We extracted images from the videos at a rate of one frame per second before annotating them, yielding 314 images with eagle rays. We then trained a convolutional neural network with 80% of the eagle ray images and evaluated its accuracy on the remaining 20% (an independent test set). Our deep learning model detected 92% of the annotated eagle rays across a diversity of habitats and acquisition conditions in the studied coral lagoon. Our study offers a potential breakthrough for the monitoring of ray populations in coral reef ecosystems by providing a fast and accurate alternative to the manual processing of aerial videos. Our deep learning approach can be extended to the detection of other elasmobranchs and applied to systematic aerial surveys to not only detect individuals but also estimate species density in coral reef habitats.
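The data-preparation steps summarized above (sampling video at one frame per second, then splitting the 314 annotated images 80/20 into training and evaluation sets) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling scheme, function names, and fixed random seed are assumptions.

```python
import random

def frame_indices_at_1fps(total_frames: int, fps: float) -> list:
    """Indices of frames kept when sampling a video at one image per second.

    Assumed scheme: keep one frame per whole second of footage.
    """
    step = max(1, round(fps))
    return list(range(0, total_frames, step))

def split_train_test(items, train_frac=0.8, seed=42):
    """Shuffle annotated images and split them into train/test subsets.

    The 80/20 ratio follows the study; the shuffle and seed are assumptions.
    """
    items = list(items)
    rng = random.Random(seed)
    rng.shuffle(items)
    cut = int(len(items) * train_frac)
    return items[:cut], items[cut:]

# Example: 314 annotated eagle-ray images (hypothetical filenames), split 80/20.
images = ["img_%03d.jpg" % i for i in range(314)]
train, test = split_train_test(images)
print(len(train), len(test))  # → 251 63
```

In practice the frame extraction itself would read each video with a tool such as OpenCV or ffmpeg; the helper above only computes which frame indices to keep for a given frame rate (e.g. every 25th frame for 25 fps footage).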