Large-scale imaging techniques are used increasingly for ecological surveys. However, manual analysis can be prohibitively expensive, creating a bottleneck between collected images and desired data products. This bottleneck is particularly severe for benthic surveys, where millions of images are obtained each year. Recent automated annotation methods may provide a solution, but reflectance images do not always contain sufficient information for adequate classification accuracy. In this work, the FluorIS, a low-cost modified consumer camera, was used to capture wide-band, wide-field-of-view fluorescence images during a field deployment in Eilat, Israel. The fluorescence images were registered with standard reflectance images, and an automated annotation method based on convolutional neural networks was developed. Our results demonstrate a 22% reduction in classification error rate when using both image types compared to using reflectance images alone. The improvements were particularly large for the coral genera Platygyra, Acropora, and Millepora, where classification recall improved by 38%, 33%, and 41%, respectively. We conclude that convolutional neural networks can combine reflectance and fluorescence imagery to significantly improve automated annotation accuracy and reduce the manual annotation bottleneck.
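The core idea of combining the two registered modalities can be sketched as early fusion: stacking the reflectance and fluorescence channels of a co-registered patch into a single multi-channel input for the network. The sketch below is illustrative only; the abstract does not specify the paper's architecture or fusion strategy, and the function name and patch size are assumptions.

```python
import numpy as np

def fuse_patches(reflectance, fluorescence):
    """Stack registered reflectance and fluorescence patches along the
    channel axis (early fusion), yielding one multi-channel CNN input.
    Both patches must share the same pixel grid, i.e. be registered."""
    assert reflectance.shape[:2] == fluorescence.shape[:2], \
        "patches must be registered to the same pixel grid"
    return np.concatenate([reflectance, fluorescence], axis=-1)

# Hypothetical 64x64 RGB patches centred on an annotation point.
refl = np.random.rand(64, 64, 3).astype(np.float32)
fluo = np.random.rand(64, 64, 3).astype(np.float32)
patch = fuse_patches(refl, fluo)
print(patch.shape)  # (64, 64, 6)
```

A 6-channel patch like this can feed the first convolutional layer of any standard CNN by setting its input-channel count to 6; registration is the prerequisite, since misaligned modalities would place corresponding features at different pixel locations.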
Bibliographical note

Funding Information:
This work was supported in part by the US National Science Foundation (NSF) Division of Ocean Sciences (OCE) grant No. 09-41760 to BGM and DK, by the Israel Science Foundation (ISF) grant No. 341/12 to YL, by The Leona M. and Harry B. Helmsley Charitable Trust and The Maurice Hatter Foundation to TT, and by the National Oceanic and Atmospheric Administration (NOAA) grant No. NA10OAR4320156 to OB. We thank Ben Herzberg and Tali Mass for help with dives and Charles H. Mazel for valuable advice on the design of the FluorIS, and gratefully acknowledge the support of NVIDIA Corporation through its donation of the Tesla K40 GPU used in this research.