Neural network recognition of marine benthos and corals

Alina Raphael, Zvy Dubinsky, David Iluz, Nathan S. Netanyahu

Research output: Contribution to journal › Review article › peer-review

32 Scopus citations

Abstract

In this review, we present the developments in the application of deep learning (DL) to coral research, point out their current limitations, and outline the field's timelines and unique potential. To do so, we introduce the methods used in each of the advances that took place between 2016 and 2018. DL has the unique capability of streamlining the description, analysis, and monitoring of coral reefs, saving time and achieving higher reliability and accuracy than error-prone human performance. Coral reefs are the most diverse and complex of marine ecosystems, and they are undergoing a severe decline worldwide resulting from the adverse synergistic influences of global climate change, ocean acidification, and seawater warming, exacerbated by anthropogenic eutrophication and pollution. DL is an extension of concepts originating in machine learning that joins several multilayered neural networks. Machine learning refers to algorithms that automatically detect patterns in data; in the case of corals, these data are underwater photographic images. Based on "learned" patterns, such programs can recognize new images. The novelty of DL lies in its use of state-of-the-art computerized image-analysis technologies and its fully automated methodology for handling large sets of images. Automated image recognition refers to technologies that identify and detect objects or attributes in a digital video or image automatically, classifying data into selected categories out of many. We show that neural network methods are already reliable in distinguishing corals from other benthos and non-coral organisms. Automated recognition of live coral cover is a powerful indicator of reef response to slow and transient changes in the environment. With improving automated recognition of coral species, DL methods can already detect declines in coral diversity due to natural and anthropogenic stressors, and diversity indicators can document the effectiveness of reef bioremediation initiatives. We explore the current applications of deep learning to coral and benthic image classification by discussing the most recent studies, and we also discuss a few future research directions in the field. Future needs include the age detection of single species, in order to track trends in their population recruitment, decline, and recovery. Fine resolution, at the polyp level, is still to be developed in order to allow separation of species with similar macroscopic features; that refinement of DL will allow such comparisons and their analyses. We conclude that future, more refined automatic identification will allow reef comparison and the tracking of long-term changes in species diversity. The hitherto unused addition of intraspecific coral color parameters will allow the inclusion of physiological coral responses to environmental conditions and changes thereof. The core aim of this review is to underscore the strength and reliability of the DL approach for documenting coral reef features, based on an evaluation of the currently available published uses of this method. We expect that this review will encourage researchers from the computer vision and marine communities to collaborate on similar long-term joint ventures.
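To make the classification step concrete, the following is a minimal illustrative sketch of the kind of pipeline the reviewed studies build on: a convolutional network pretrained on generic images and fine-tuned to label benthic image patches. This is not the authors' method or any specific reviewed system; the class count, folder layout, and hyperparameters are assumptions chosen only for illustration, written with PyTorch/torchvision.

# Minimal sketch (illustrative, not from the reviewed studies): transfer
# learning for benthic patch classification. Paths, class count, and
# hyperparameters are hypothetical.
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets

NUM_CLASSES = 3  # e.g. hard coral, soft coral, other benthos (assumed)

# Standard ImageNet preprocessing expected by the pretrained backbone
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: coral_patches/train/<class_name>/<image>.jpg
train_set = datasets.ImageFolder("coral_patches/train", transform=preprocess)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained CNN; replace the final layer with a benthos classifier head
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # short illustrative fine-tuning loop
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

After fine-tuning, the same network can be applied to new underwater images to estimate, for example, live coral cover or per-class abundance, which is the automated monitoring use case emphasized in the review.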

Original language: English
Article number: 29
Journal: Diversity
Volume: 12
Issue number: 1
DOIs
State: Published - 1 Jan 2020

Bibliographical note

Publisher Copyright:
© 2020 by the authors.

Keywords

  • Classification
  • Coral reef
  • Coral species
  • Deep learning
  • Marine ecosystem
