Training deep neural-networks based on unreliable labels

Alan Joseph Bekker, Jacob Goldberger

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

139 Scopus citations

Abstract

In this study we address the problem of training a neural network based on data with unreliable labels. We introduce an extra noise layer by assuming that the observed labels were created from the true labels by passing through a noisy channel whose parameters are unknown. We propose a method that simultaneously learns both the neural network parameters and the noise distribution. The proposed method is compared to standard back-propagation neural-network training that ignores the existence of wrong labels. The improved classification performance of the method is illustrated on several standard classification tasks. In particular we show that in some cases our approach can be beneficial even when the labels are set manually and assumed to be error-free.
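The core idea in the abstract, a noise layer that maps the network's estimate of the true label to a distribution over the observed (possibly corrupted) label via an unknown transition matrix, can be sketched as follows. This is an illustrative NumPy forward pass, not the authors' implementation; the linear base network, the softmax parameterization of the noise matrix, and all variable names are assumptions for the sketch.

```python
import numpy as np

def softmax(logits, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(logits - logits.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
num_classes, dim, n = 3, 5, 4

# Base network: p(z = true label | x); here just a linear layer + softmax.
W = rng.normal(size=(dim, num_classes))
x = rng.normal(size=(n, dim))
p_true = softmax(x @ W)                      # shape (n, num_classes)

# Noise layer: row-stochastic matrix T with T[i, j] = p(observed = j | true = i).
# Parameterizing it as a row-wise softmax of unconstrained logits keeps each
# row a valid distribution, so T can be learned jointly with W.
noise_logits = rng.normal(size=(num_classes, num_classes))
T = softmax(noise_logits, axis=1)

# Distribution over the *observed* labels: marginalize out the true label.
p_obs = p_true @ T                           # shape (n, num_classes)

# Cross-entropy against the observed (possibly wrong) labels; gradients of
# this loss would flow into both W and the noise parameters.
y_obs = rng.integers(0, num_classes, size=n)
loss = -np.log(p_obs[np.arange(n), y_obs]).mean()
```

Training the standard way on `y_obs` would force the base network to fit the label noise; placing the noise layer after the softmax lets the cross-entropy loss explain corruption through `T` instead, which is what allows the noise distribution and the network weights to be estimated simultaneously.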

Original language: English
Title of host publication: 2016 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2682-2686
Number of pages: 5
ISBN (Electronic): 9781479999880
DOIs
State: Published - 18 May 2016
Event: 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016 - Shanghai, China
Duration: 20 Mar 2016 - 25 Mar 2016

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2016-May
ISSN (Print): 1520-6149

Conference

Conference: 41st IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2016
Country/Territory: China
City: Shanghai
Period: 20/03/16 - 25/03/16

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • back-propagation
  • deep-learning
  • noisy labels
