Neuronal normalization provides effective learning through ineffective synaptic learning rules

Gal Chechik, Isaac Meilijson, Eytan Ruppin

Research output: Contribution to journal › Conference article › peer-review

Abstract

We revisit here the classical neuroscience paradigm of Hebbian learning, showing that a necessary requirement for effective associative memory learning is that the efficacies of the incoming synapses be uncorrelated. This is difficult to achieve in a robust manner by Hebbian synaptic learning alone, since it depends on network-level information. Effective learning can nevertheless be achieved by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This normalization drastically improves the memory capacity of associative networks, from an essentially bounded capacity to one that scales linearly with the network's size. Such neuronal normalization can be carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which has recently been observed in cortical tissue. Our findings thus suggest that effective associative learning with Hebbian synapses alone is biologically implausible, and that Hebbian synapses must be continuously remodeled by neuronally driven regulatory processes in the brain. (C) 2000 Elsevier Science B.V. All rights reserved.
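To make the mechanism concrete, the following is a minimal sketch, not the paper's actual model: a sparse binary associative network where patterns are stored with a simple additive Hebbian rule, and neuronal normalization is approximated by subtracting each neuron's mean incoming efficacy so that its incoming weights sum to zero. The network size, coding level, quantile-based firing threshold, and all names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P, p = 300, 40, 0.1                              # neurons, stored patterns, coding level (assumed)
patterns = (rng.random((P, N)) < p).astype(float)   # sparse {0,1} memory patterns

# Simple additive Hebbian rule: with sparse {0,1} patterns, all efficacies
# are nonnegative, so a neuron's incoming synapses are positively correlated.
W = patterns.T @ patterns
np.fill_diagonal(W, 0.0)

# Neuronal normalization: subtract each row's mean (row i = synapses onto
# neuron i) so the off-diagonal incoming efficacies sum exactly to zero.
W_norm = W - W.sum(axis=1, keepdims=True) / (N - 1)
np.fill_diagonal(W_norm, 0.0)

def recall(W, probe, steps=20):
    """Iterate threshold dynamics; the threshold keeps ~p*N units active."""
    s = probe.copy()
    for _ in range(steps):
        h = W @ s
        theta = np.quantile(h, 1 - p)   # top p fraction of inputs fire
        s = (h >= theta).astype(float)
    return s

# Cue the network with a degraded version of the first stored memory
probe = patterns[0].copy()
noise = rng.choice(N, size=N // 10, replace=False)
probe[noise] = 1 - probe[noise]

for label, M in [("raw Hebbian", W), ("zero-sum normalized", W_norm)]:
    match = (recall(M, probe) == patterns[0]).mean()
    print(f"{label}: fraction of units recalled correctly = {match:.2f}")
```

The zero-sum shift removes the shared positive component across a neuron's inputs, which is the decorrelation condition the abstract identifies as necessary for effective retrieval; only the normalized network's capacity is expected to keep growing with N.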

Original language: English
Pages (from-to): 345-351
Number of pages: 7
Journal: Neurocomputing
Volume: 32-33
State: Published - Jun 2000
Externally published: Yes
Event: The 8th Annual Computational Neuroscience Meeting (CNS'99) - Pittsburgh, PA, USA
Duration: 18 Jul 1999 - 22 Jul 1999

Keywords

  • Hebbian learning
  • Neuronal regulation
  • Synaptic plasticity
  • Weight normalization
