Abstract
In this paper, we revisit the recurrent back-propagation (RBP) algorithm (Almeida, 1987; Pineda, 1987), discuss the conditions under which it applies as well as how to satisfy them in deep neural networks. We show that RBP can be unstable and propose two variants based on conjugate gradient on the normal equations (CG-RBP) and Neumann series (Neumann-RBP). We further investigate the relationship between Neumann-RBP and back-propagation through time (BPTT) and its truncated version (TBPTT). Our Neumann-RBP has the same time complexity as TBPTT but only requires constant memory, whereas TBPTT's memory cost scales linearly with the number of truncation steps. We examine all RBP variants, along with BPTT and TBPTT, in three different application domains: associative memory with continuous Hopfield networks, document classification in citation networks using graph neural networks, and hyperparameter optimization for fully connected networks. All experiments demonstrate that RBPs, especially the Neumann-RBP variant, are efficient and effective for optimizing convergent recurrent neural networks.
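The abstract describes approximating the RBP linear system with a truncated Neumann series so that the gradient can be computed with constant memory. Below is a minimal sketch, assuming a PyTorch-style autograd API, of that idea; the function name `neumann_rbp_grad`, its signature, and the choice of `K` are illustrative assumptions, not the authors' released code.

```python
import torch

def neumann_rbp_grad(F, h_star, params, dL_dh, K=20):
    """Approximate the RBP gradient with a K-term Neumann series.

    Sketch only (assumed names/signature). F is the recurrent update map,
    h_star an (approximate) fixed point with h_star ~ F(h_star),
    dL_dh the gradient of the loss w.r.t. h_star.
    """
    h_star = h_star.detach().requires_grad_(True)
    f = F(h_star)  # one application of the update map at the fixed point

    # Neumann series: (I - J^T)^{-1} g ~ sum_{k=0}^{K} (J^T)^k g,
    # where J = dF/dh at h_star and g = dL/dh_star.
    v = dL_dh.detach()
    z = v.clone()
    for _ in range(K):
        # vector-Jacobian product (J^T v) via reverse-mode autodiff
        (v,) = torch.autograd.grad(f, h_star, grad_outputs=v, retain_graph=True)
        z = z + v

    # Chain the adjoint through the parameters: dL/dw = z^T dF/dw
    grads = torch.autograd.grad(f, params, grad_outputs=z)
    return grads
```

Only the two vectors `v` and `z` are kept across iterations, which is why the memory cost stays constant in `K`, while the `K` vector-Jacobian products give the same time complexity as K-step TBPTT.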
Original language | English |
---|---|
Title of host publication | 35th International Conference on Machine Learning, ICML 2018 |
Editors | Jennifer Dy, Andreas Krause |
Publisher | International Machine Learning Society (IMLS) |
Pages | 4807-4820 |
Number of pages | 14 |
ISBN (Electronic) | 9781510867963 |
State | Published - 2018 |
Externally published | Yes |
Event | 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden (Duration: 10 Jul 2018 → 15 Jul 2018) |
Publication series
Name | 35th International Conference on Machine Learning, ICML 2018 |
---|---|
Volume | 7 |
Conference
Conference | 35th International Conference on Machine Learning, ICML 2018 |
---|---|
Country/Territory | Sweden |
City | Stockholm |
Period | 10/07/18 → 15/07/18 |
Bibliographical note
Publisher Copyright: © 2018 by the author(s).