Training a perceptron in a discrete weight space

Michal Rosen-Zvi, Ido Kanter

Research output: Contribution to journal › Article › peer-review

Abstract

Learning in a perceptron having a discrete weight space, where each weight can take $2L+1$ different values, is examined analytically and numerically. The learning algorithm is based on the training of the continuous perceptron and on prediction with the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as $e^{-K\alpha^2}$ in the case of on-line learning with binary activation functions and as $e^{-e^{|\lambda|\alpha}}$ in the case of on-line learning with a continuous one, where $\alpha$ is the number of examples divided by $N$, the size of the input vector, and $K$ is a positive constant. For finite $N$ and $L$, perfect agreement between the discrete student and the teacher is obtained for $\alpha \propto \sqrt{L\ln(NL)}$. A crossover to the generalization error $\propto 1/\alpha$, characterizing continuous weights with binary output, is obtained for synaptic depth $L > O(\sqrt{N})$.
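The scheme summarized above (train a continuous perceptron, predict with its clipped weights) can be sketched in a few lines. The snippet below is a minimal illustration under stated assumptions, not the authors' exact procedure: the standard on-line perceptron rule stands in for whichever training rule the paper analyzes, the rescaling inside clip_weights is one simple choice, and all names (N, L, alpha_max, clip_weights, overlap) are hypothetical.

```python
import numpy as np

# Sketch: teacher/student perceptrons with binary output sigma = sign(w . x).
# The student is trained as a *continuous* perceptron; predictions are read
# off its clipped copy, whose weights are rounded onto the 2L+1 integer
# levels {-L, ..., 0, ..., L}.

rng = np.random.default_rng(0)

N = 500          # input dimension
L = 2            # synaptic depth: clipped weights take 2L+1 values
alpha_max = 20   # number of examples presented = alpha_max * N

w_teacher = rng.standard_normal(N)
w_student = rng.standard_normal(N)

def clip_weights(w, L):
    """Round continuous weights onto the 2L+1 levels -L..L (one rescaling choice)."""
    scale = L / np.max(np.abs(w))
    return np.clip(np.round(w * scale), -L, L)

def overlap(a, b):
    """Normalized overlap (order parameter) between two weight vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

for step in range(alpha_max * N):
    x = rng.standard_normal(N)
    sigma_t = np.sign(w_teacher @ x)       # teacher's binary label
    sigma_s = np.sign(w_student @ x)       # continuous student's output
    if sigma_s != sigma_t:                 # perceptron rule: update on error only
        w_student += sigma_t * x / np.sqrt(N)

w_clipped = clip_weights(w_student, L)
print("overlap teacher/continuous student:", overlap(w_teacher, w_student))
print("overlap teacher/clipped student:   ", overlap(w_teacher, w_clipped))
```

Tracking the teacher/clipped overlap as $\alpha$ (examples divided by $N$) grows gives the discrete counterpart of the order parameters the abstract describes.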

Original language: English
Pages (from-to): 9
Number of pages: 1
Journal: Physical Review E
Volume: 64
Issue number: 4
State: Published - 2001
