We show how deep learning methods can be applied to crowdsourcing and unsupervised ensemble learning. First, we prove that the popular model of Dawid and Skene, which assumes that all classifiers are conditionally independent, is equivalent to a Restricted Boltzmann Machine (RBM) with a single hidden node. Hence, under this model, the posterior probabilities of the true labels can instead be estimated via a trained RBM. Next, to address the more general case, where classifiers may strongly violate the conditional independence assumption, we propose to apply an RBM-based Deep Neural Network (DNN). Experimental results on various simulated and real-world datasets demonstrate that our proposed DNN approach outperforms other state-of-the-art methods, in particular when the data violate the conditional independence assumption.
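To illustrate the equivalence claimed in the abstract, the following sketch computes the posterior of the true label under an RBM with a single hidden node: a logistic function of a weighted sum of the classifiers' binary predictions. The weights and bias below are hypothetical placeholders for illustration only; in the paper's setting they would be learned by training the RBM, and the specific values here are not taken from the source.

```python
import numpy as np

def rbm_posterior(x, w, b):
    """P(y = +1 | x) for an RBM with one hidden node: a logistic
    function of the weighted sum of classifier outputs x in {-1, +1}."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

# Three classifiers voting +1, +1, -1; equal weights are an assumption
# standing in for learned per-classifier reliabilities.
x = np.array([1.0, 1.0, -1.0])
w = np.array([0.8, 0.8, 0.8])
p = rbm_posterior(x, w, b=0.0)  # majority votes +1, so p > 0.5
```

Under the Dawid-Skene assumption, each weight grows with the corresponding classifier's accuracy, so the posterior reduces to a reliability-weighted vote passed through a sigmoid.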
|Title of host publication||33rd International Conference on Machine Learning, ICML 2016|
|Editors||Maria Florina Balcan, Kilian Q. Weinberger|
|Publisher||International Machine Learning Society (IMLS)|
|Number of pages||14|
|State||Published - 2016|
|Event||33rd International Conference on Machine Learning, ICML 2016 - New York City, United States|
Duration: 19 Jun 2016 → 24 Jun 2016
|Name||33rd International Conference on Machine Learning, ICML 2016|
|Conference||33rd International Conference on Machine Learning, ICML 2016|
|City||New York City|
|Period||19/06/16 → 24/06/16|
|Bibliographical note||Funding Information:|
The authors thank George Linderman, Alex Cloninger, Tingting Jiang, Raphy Coifman, Sahand Negahban, Andrew Barron, Alex Kovner, Shahar Kovalsky, Maria Angelica Cueto, Jason Morton, and Bernd Sturmfels for their help. This research was funded by the Intel Collaborative Research Institute for Computational Intelligence (B.N.) and by NIH grant 1R01HG008383-01A1 (Y.K. and B.N.).