Abstract
In this paper, we present a deep neural network (DNN) training approach called the “DeepMimic” training method. Enormous amounts of data are available for training nowadays, yet only a tiny fraction of these data is manually labeled, while the vast majority remains unlabeled. The proposed training approach exploits the unlabeled data to the fullest, in a very simple manner, to achieve strong classification results. Our DeepMimic method uses a small portion of labeled data and a large amount of unlabeled data for training, as would be expected in a real-world scenario. It consists of a mentor model and a student model. Using a mentor model trained on the small labeled portion and then feeding it only unlabeled data, we show how to obtain a (simplified) student model that reaches the same accuracy and loss as the mentor on the same test set, without using any of the original data labels to train the student. Our experiments demonstrate that even on challenging classification tasks, the student network architecture can be simplified significantly with only a minor effect on performance; in particular, the mentor’s original network architecture need not even be known. In addition, training the student model to the mentor’s performance level takes less time, owing to the simplified architecture and the larger amount of available data. The proposed method highlights the disadvantages of standard supervised training and demonstrates the benefits of a less traditional training approach.
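The mentor-student pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual setup: it substitutes plain logistic-regression models in NumPy for the DNNs, and the synthetic two-blob dataset, the `train_logreg` helper, and all hyperparameters are assumptions made for the sake of a runnable example. The three numbered steps mirror the method: train a mentor on a small labeled subset, let it produce soft targets for a large unlabeled pool, and train the student only on those mentor outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Clip to avoid overflow warnings for extreme logits.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60, 60)))

def train_logreg(X, y, epochs=500, lr=0.5):
    # Gradient-descent logistic regression; y may be hard (0/1) or soft targets.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad = p - y                      # dBCE/dz works for soft targets too
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b

# Synthetic 2-D data: two Gaussian blobs (a stand-in for a real dataset).
X_labeled = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y_labeled = np.array([0] * 20 + [1] * 20)
X_unlabeled = np.vstack([rng.normal(-2, 1, (500, 2)), rng.normal(2, 1, (500, 2))])

# 1. Mentor: trained only on the small labeled subset.
w_m, b_m = train_logreg(X_labeled, y_labeled)

# 2. Mentor labels the unlabeled pool with soft predictions.
soft_targets = sigmoid(X_unlabeled @ w_m + b_m)

# 3. Student: trained only on mentor outputs, never on a true label.
w_s, b_s = train_logreg(X_unlabeled, soft_targets)

# On held-out points the student should mimic the mentor's decisions.
X_test = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
mentor_pred = sigmoid(X_test @ w_m + b_m) > 0.5
student_pred = sigmoid(X_test @ w_s + b_s) > 0.5
agreement = (mentor_pred == student_pred).mean()
```

Because the student here shares the mentor's model class, near-perfect agreement is expected; the paper's point is that even a *simplified* student architecture can reach the mentor's performance when trained this way.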
Original language | English |
---|---|
Title of host publication | Artificial Neural Networks and Machine Learning – ICANN 2019 |
Subtitle of host publication | Workshop and Special Sessions - 28th International Conference on Artificial Neural Networks, Proceedings |
Editors | Vera Kurková, Igor V. Tetko, Pavel Karpov, Fabian Theis |
Publisher | Springer Verlag |
Pages | 440-455 |
Number of pages | 16 |
ISBN (Print) | 9783030304928 |
DOIs | |
State | Published - 2019 |
Event | 28th International Conference on Artificial Neural Networks, ICANN 2019 - Munich, Germany. Duration: 17 Sep 2019 → 19 Sep 2019 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 11731 LNCS |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | 28th International Conference on Artificial Neural Networks, ICANN 2019 |
---|---|
Country/Territory | Germany |
City | Munich |
Period | 17/09/19 → 19/09/19 |
Bibliographical note
Publisher Copyright: © Springer Nature Switzerland AG 2019.