Abstract
A back-propagation learning algorithm is examined numerically for feedforward multilayer networks with one hidden layer that function as a parity machine or as a committee machine of the internal representations of the hidden units. It is found that the maximal known theoretical capacity is saturated and that the convergence time does not grow exponentially with the size of the system. The results also indicate the possibility of a replica-symmetry-breaking phase in the absence of local minima.
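The two one-hidden-layer architectures named in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the network sizes, random weights, and function names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden_signs(weights, x):
    """Signs of the hidden-unit fields: the internal representation."""
    return np.sign(weights @ x)

def parity_output(signs):
    """Parity machine: the output is the product of the hidden-unit signs."""
    return int(np.prod(signs))

def committee_output(signs):
    """Committee machine: the output is the majority vote of the hidden signs."""
    return int(np.sign(np.sum(signs)))

# Illustrative sizes (assumptions): K hidden units, N binary inputs.
# An odd K avoids ties in the committee vote.
K, N = 3, 11
W = rng.standard_normal((K, N))      # hidden-layer weights
x = rng.choice([-1.0, 1.0], size=N)  # one binary input pattern

sigma = hidden_signs(W, x)           # internal representation
print(parity_output(sigma), committee_output(sigma))
```

Both machines share the same hidden layer; they differ only in the fixed Boolean function applied to the internal representation, which is why the same back-propagation setup can be studied for either.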
Original language | English |
---|---|
Pages (from-to) | 501-506 |
Number of pages | 6 |
Journal | EPL |
Volume | 21 |
Issue number | 4 |
DOIs | |
State | Published - 1 Feb 1993 |