Abstract
We study model feed-forward networks as time series predictors in the stationary limit. The focus is on complex, yet nonchaotic, behavior. The main question we address is whether the asymptotic behavior is governed by the architecture, regardless of the details of the weights. We find hierarchies among classes of architectures with respect to the attractor dimension of the long-term sequences they are capable of generating; a larger number of hidden units can generate higher-dimensional attractors. In the case of a perceptron, we develop the stationary solution for general weights and show that the flow is typically one-dimensional. The relaxation time from an arbitrary initial condition to the stationary solution is found to scale linearly with the size of the network. In multilayer networks, the number of hidden units bounds the number and dimension of the possible attractors. We conclude that long-term prediction (in the nonchaotic regime) with such models is governed by attractor dynamics related to the architecture.
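The setup the abstract describes can be illustrated with a minimal sketch: a perceptron iterated on its own output as a sequence generator, s_t = tanh(Σ_j w_j s_{t-j}), whose long-term behavior is an attractor. The cosine weight profile, the gain A, and the Fourier-peak readout below are illustrative assumptions, not the authors' code; they are chosen so that the trivial fixed point is unstable and the stationary sequence is (quasi-)periodic, i.e., effectively one-dimensional.

```python
# A minimal sketch (illustrative, not the paper's implementation) of a
# perceptron used as a time series generator, iterated on its own output.
import numpy as np

def generate(w, s0, steps):
    """Iterate the perceptron map over a delay window (most recent first)."""
    window = list(s0)
    out = []
    for _ in range(steps):
        nxt = np.tanh(np.dot(w, window))   # single-layer prediction
        out.append(nxt)
        window = [nxt] + window[:-1]       # shift the delay window
    return np.array(out)

def dominant_freq(x):
    """Location of the largest Fourier peak, in cycles per step."""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    return np.argmax(spec) / len(x)

N, k, A = 64, 5, 1.5                        # window size, weight mode, gain (assumed)
j = np.arange(1, N + 1)
w = (2.0 * A / N) * np.cos(2.0 * np.pi * k * j / N)

rng = np.random.default_rng(0)
a = generate(w, rng.uniform(-1, 1, N), steps=4096)
b = generate(w, rng.uniform(-1, 1, N), steps=4096)

# Two arbitrary initial conditions relax to the same stationary attractor:
# both late-time spectra peak near the frequency k/N set by the weights.
print(dominant_freq(a[-2048:]), dominant_freq(b[-2048:]), k / N)
```

In this toy version, the stationary frequency is fixed by the weights while the initial condition only sets the transient, matching the abstract's claim that the asymptotic behavior is governed by the network rather than the starting point; a multilayer generator with more hidden units would be the analogous knob for reaching higher-dimensional attractors.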
| Field | Value |
|---|---|
| Original language | American English |
| Title of host publication | Advances in Neural Information Processing Systems 10 |
| Editors | Jordan MI, Kearns MJ, Solla SA |
| Place of Publication | Cambridge, MA |
| Publisher | The MIT Press |
| Pages | 315–321 |
| State | Published - 1998 |