Abstract
The learning time of a simple neural-network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second-order properties of the objective function in the space of coupling coefficients. The results are generic for symmetric matrices obtained by summing outer products of random vectors. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
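As a minimal numerical sketch of the abstract's central object (not the paper's own analytic computation), the following Python snippet builds a symmetric matrix by summing outer products of random vectors and compares the eigenvalue spread for centered versus biased inputs; the dimensions `N`, `P` and the bias value are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 400  # hypothetical dimensions, chosen only for illustration

def hessian_spectrum(mean):
    """Eigenvalues of H = (1/P) * sum_p x_p x_p^T for random inputs with a given mean."""
    X = rng.standard_normal((P, N)) + mean   # P random vectors in N dimensions
    H = (X.T @ X) / P                        # symmetric sum of outer products
    return np.linalg.eigvalsh(H)

for label, mean in [("centered (zero-mean)", 0.0), ("biased (mean 0.5)", 0.5)]:
    ev = hessian_spectrum(mean)
    # The eigenvalue spread bounds the range of usable learning rates;
    # a large lambda_max / lambda_min ratio slows gradient-descent learning.
    print(f"{label:22s} lambda_max/lambda_min = {ev.max() / ev.min():8.1f}")
```

Running this shows the biased (nonzero-mean) case producing a single large outlier eigenvalue and hence a much larger eigenvalue ratio, which is consistent with the abstract's point in favor of centered state variables.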
Original language | American English
---|---
Journal | Physical Review Letters
Volume | 66
Issue number | 18
State | Published - 1991