CoralSeg: Learning coral segmentation from sparse annotations

Iñigo Alonso, Matan Yuval, Gal Eyal, Tali Treibitz, Ana C. Murillo

Research output: Contribution to journal › Article › peer-review

53 Scopus citations

Abstract

Robotic advances and developments in sensors and acquisition systems facilitate the collection of survey data in remote and challenging scenarios. Semantic segmentation, which attempts to provide per-pixel semantic labels, is an essential task when processing such data. Recent advances in deep learning approaches have boosted this task's performance. Unfortunately, these methods need large amounts of labeled data, which is usually a challenge in many domains. In many environmental monitoring instances, such as the coral reef example studied here, data labeling demands expert knowledge and is costly. Therefore, many data sets often present scarce and sparse image annotations or remain untouched in image libraries. This study proposes and validates an effective approach for learning semantic segmentation models from sparsely labeled data. Based on augmenting sparse annotations with the proposed adaptive superpixel segmentation propagation, we obtain similar results as if training with dense annotations, significantly reducing the labeling effort. We perform an in-depth analysis of our labeling augmentation method as well as of different neural network architectures and loss functions for semantic segmentation. We demonstrate the effectiveness of our approach on publicly available data sets of different real domains, with the emphasis on underwater scenarios—specifically, coral reef semantic segmentation. We release new labeled data as well as an encoder trained on half a million coral reef images, which is shown to facilitate the generalization to new coral scenarios.
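The approach summarized above, turning sparse point annotations into dense training masks by propagating labels through superpixels, can be sketched as follows. This is only a minimal illustration under simplifying assumptions (a single SLIC over-segmentation and per-superpixel majority voting); the paper's adaptive superpixel segmentation propagation instead adapts the segmentation level to the annotations, and the function name and parameters below are hypothetical.

# Minimal sketch of sparse-to-dense label propagation via superpixels.
# Not the CoralSeg implementation: a single SLIC pass with majority voting,
# with hypothetical names and parameters.
import numpy as np
from skimage.segmentation import slic

def propagate_sparse_labels(image, sparse_points, n_segments=1000):
    # image: HxWx3 RGB array; sparse_points: iterable of (row, col, class_id).
    # Returns an HxW label map, with -1 where no annotation reaches a superpixel.
    segments = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    dense = np.full(segments.shape, -1, dtype=np.int32)
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        # Labels of the annotated points that fall inside this superpixel.
        votes = [label for row, col, label in sparse_points if mask[row, col]]
        if votes:
            dense[mask] = np.bincount(votes).argmax()  # majority vote
    return dense

The resulting dense masks can then serve as targets for a standard semantic segmentation network, which is the role such propagated labels play in the training pipeline described in the abstract.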

Original language: English
Pages (from-to): 1456-1477
Number of pages: 22
Journal: Journal of Field Robotics
Volume: 36
Issue number: 8
DOIs
State: Published - 1 Dec 2019

Bibliographical note

Publisher Copyright:
© 2019 Wiley Periodicals, Inc.

Funding

The authors would like to thank NVIDIA Corporation for the donation of the Titan Xp GPUs used in this study. We thank Aviad Avni for fieldwork assistance and the Interuniversity Institute for Marine Sciences in Eilat for making their facilities available to us. This project was partially funded by the Spanish Government project PGC2018-098817-A-I00, the Aragón Regional Government (DGA T45_17R/FSE), and the European Union's Horizon 2020 Research and Innovation Programme under Marie Skłodowska-Curie grant agreement no. 796025 to G. E.; T. T. was supported by the Israel Ministry of National Infrastructures, Energy, and Water Resources (Grant 218-17-008) and the Israel Science Foundation (Grant 680/18); M. Y. was supported by PADI Foundation application no. 32618 and the Murray Foundation for student research.

Funders and funder numbers
Spanish Government: PGC2018-098817-A-I00
Aragón Regional Government: DGA T45_17R/FSE
European Union Horizon 2020, Marie Skłodowska-Curie grant agreement: 796025
Israel Ministry of National Infrastructures, Energy, and Water Resources: 218-17-008
Israel Science Foundation: 680/18
PADI Foundation: 32618
Murray Foundation
NVIDIA Corporation (Titan Xp GPU donation)

Keywords

• coral reefs
• learning
• machine learning
• perception
• underwater robotics
