Abstract
We consider an ensemble of K single-layer perceptrons exposed to random inputs and investigate the conditions under which the couplings of these perceptrons can be chosen such that prescribed correlations between the outputs occur. A general formalism is introduced using a multiperceptron cost function that allows one to determine the maximal number of random inputs as a function of the desired values of the correlations. Replica-symmetric results for K=2 and K=3 are compared with properties of two-layer networks with tree structure and a fixed Boolean function between hidden units and output. The results show which correlations in the hidden layer of multilayer neural networks are crucial for the value of the storage capacity.
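To make the setup concrete, the following minimal sketch illustrates an ensemble of K sign-activation perceptrons evaluated on random ±1 inputs and the resulting pairwise output correlations. It is a numerical illustration under assumed choices (dimensions N, K, P and independent Gaussian couplings are placeholders), not the paper's replica calculation, which instead asks for which prescribed correlation values such couplings can exist at all.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # input dimension (illustrative)
K = 3     # number of perceptrons in the ensemble (illustrative)
P = 1000  # number of random input patterns (illustrative)

# Random +/-1 input patterns and independent Gaussian couplings;
# the couplings here are a placeholder, not an optimized choice.
patterns = rng.choice([-1.0, 1.0], size=(P, N))
couplings = rng.standard_normal((K, N))

# Perceptron outputs sigma_k = sign(J_k . xi) for each pattern.
outputs = np.sign(patterns @ couplings.T)  # shape (P, K)

# Empirical pairwise output correlations <sigma_k sigma_l> over the patterns.
corr = outputs.T @ outputs / P
print(np.round(corr, 2))
```

For independent random couplings the off-diagonal correlations estimated this way are close to zero; the question studied in the paper is how many random inputs can be handled when the couplings are instead required to produce prescribed nonzero correlations.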
| Original language | English |
| --- | --- |
| Pages (from-to) | 7369-7378 |
| Number of pages | 10 |
| Journal | Physical Review E |
| Volume | 55 |
| Issue number | 6 |
| DOIs | |
| State | Published - 1997 |