On the equivalence of two-layered perceptrons with binary neurons.

M. Blatt, E. Domany, I. Kanter

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

We consider two-layered perceptrons consisting of N binary input units, K binary hidden units and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map, up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly independent. A (single-layered) perceptron is said to be irreducible if its output depends on every one of its input units, and a two-layered perceptron is irreducible if the K + 1 perceptrons that constitute such a network are irreducible. By global symmetries we mean, for instance, permuting the labels of the hidden units. Hence, two irreducible regular two-layered perceptrons that implement the same Boolean function must have the same number of hidden units, and must be composed of equivalent perceptrons.
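The following is a minimal illustrative sketch (not taken from the article) of the architecture the abstract describes: a two-layered perceptron with N binary inputs, K binary hidden units and one binary output, using ±1 neurons and sign activations. The weights, thresholds and toy sizes are arbitrary assumptions chosen for illustration; the snippet only demonstrates one of the "obvious global symmetries" mentioned above, namely that permuting the labels of the hidden units leaves the implemented Boolean function unchanged.

```python
import numpy as np

def sign(x):
    # Binary neuron: map to +1 / -1 (ties at 0 resolved to +1 for definiteness).
    return np.where(x >= 0, 1, -1)

def two_layer_perceptron(x, W, w_out, theta_hidden, theta_out):
    """x: (N,) binary input; W: (K, N) input-to-hidden weights;
    w_out: (K,) hidden-to-output weights; theta_*: thresholds."""
    h = sign(W @ x - theta_hidden)        # K binary hidden units
    return sign(w_out @ h - theta_out)    # single binary output unit

rng = np.random.default_rng(0)
N, K = 8, 3                               # toy sizes (the paper assumes N >> K >= 1)
W = rng.normal(size=(K, N))               # generic rows are linearly independent ("regular")
w_out = rng.normal(size=K)
th_h = rng.normal(size=K)
th_o = rng.normal()

# Global symmetry: relabel the hidden units (permute rows of W together with the
# corresponding output weights and hidden thresholds); the input-output map is unchanged.
perm = rng.permutation(K)
for _ in range(5):
    x = rng.choice([-1, 1], size=N)
    y_original = two_layer_perceptron(x, W, w_out, th_h, th_o)
    y_permuted = two_layer_perceptron(x, W[perm], w_out[perm], th_h[perm], th_o)
    assert y_original == y_permuted
print("Permuting hidden-unit labels yields the same Boolean function.")
```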

Original language: English
Pages (from-to): 225-231
Number of pages: 7
Journal: International Journal of Neural Systems
Volume: 6
Issue number: 3
DOIs
State: Published - Sep 1995
Externally published: Yes
