TY - JOUR
T1 - Training a perceptron in a discrete weight space
AU - Rosen-Zvi, Michal
AU - Kanter, Ido
PY - 2001
Y1 - 2001
N2 - Learning in a perceptron having a discrete weight space, where each weight can take 2L+1 different values, is examined analytically and numerically. The learning algorithm is based on training the continuous perceptron and predicting with the clipped weights. The learning is described by a new set of order parameters, composed of the overlaps between the teacher and the continuous/clipped students. Different scenarios are examined, among them on-line learning with discrete and continuous transfer functions. The generalization error of the clipped weights decays asymptotically as exp(-Kα²) in the case of on-line learning with binary activation functions and as exp(-e^(|λ|α)) in the case of on-line learning with a continuous one, where α is the number of examples divided by N, the size of the input vector, and K is a positive constant. For finite N and L, perfect agreement between the discrete student and the teacher is obtained for α ∝ √(L ln(NL)). A crossover to the generalization error ∝ 1/α, which characterizes continuous weights with binary output, is obtained for synaptic depth L > O(√N).
UR - http://www.scopus.com/inward/record.url?scp=85035266871&partnerID=8YFLogxK
U2 - 10.1103/PhysRevE.64.046109
DO - 10.1103/PhysRevE.64.046109
M3 - Article
AN - SCOPUS:85035266871
SN - 1063-651X
VL - 64
SP - 046109
JO - Physical Review E
JF - Physical Review E
IS - 4
ER -
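
The abstract above describes a concrete procedure: train a continuous-weight perceptron on-line against a teacher, but make predictions with the weights clipped onto 2L+1 discrete levels. As a rough illustration of that teacher-student setup, here is a minimal Python sketch. It is a hedged approximation, not the paper's algorithm: the perceptron update rule, the clip_weights discretization, and the parameter choices (eta, L, N) are all generic assumptions introduced for this example.

import numpy as np

rng = np.random.default_rng(0)

N = 500         # input dimension
L = 2           # synaptic depth: clipped weights take 2L+1 values in {-L, ..., L}
steps = 20 * N  # number of on-line examples, i.e. alpha = 20
eta = 1.0       # learning rate (illustrative choice)

def clip_weights(w, L):
    """Map continuous weights onto the 2L+1 integer levels {-L, ..., L}
    after rescaling to unit variance (a generic discretization, assumed here)."""
    return np.clip(np.round(w / w.std()), -L, L)

# Teacher with discrete weights drawn uniformly from {-L, ..., L}
w_teacher = rng.integers(-L, L + 1, size=N).astype(float)

# Continuous student, trained on-line; predictions use its clipped copy
w_student = rng.normal(size=N)

for _ in range(steps):
    x = rng.choice([-1.0, 1.0], size=N)      # random binary input pattern
    label = np.sign(w_teacher @ x)           # teacher's binary output
    if np.sign(w_student @ x) != label:      # classic perceptron rule: update on errors
        w_student += (eta / np.sqrt(N)) * label * x

# Estimate the generalization error of the clipped student on fresh examples
w_clipped = clip_weights(w_student, L)
X = rng.choice([-1.0, 1.0], size=(10_000, N))
err = np.mean(np.sign(X @ w_clipped) != np.sign(X @ w_teacher))
print(f"clipped-student generalization error ~ {err:.4f}")

In the paper's analysis the central quantities are the overlaps between the teacher and the continuous/clipped students; in this sketch the clipped overlap could be tracked during training as (w_teacher @ w_clipped) / N.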