Abstract
We revisit the classical neuroscience paradigm of Hebbian learning and show that a necessary requirement for effective associative memory learning is that the efficacies of the incoming synapses be uncorrelated. This is difficult to achieve in a robust manner by Hebbian synaptic learning, since it depends on network-level information. Effective learning can nevertheless be achieved by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This normalization drastically improves the memory capacity of associative networks, from an essentially bounded capacity to one that scales linearly with the network's size. Such neuronal normalization can be successfully carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which was recently observed in cortical tissue. Thus, our findings suggest that effective associative learning with Hebbian synapses alone is biologically implausible, and that Hebbian synapses must be continuously remodeled by neuronally driven regulatory processes in the brain. (C) 2000 Elsevier Science B.V. All rights reserved.
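The zero-sum normalization described in the abstract can be illustrated with a minimal sketch: a Hopfield-style associative memory trained with the Hebbian outer-product rule, after which each neuron shifts its incoming efficacies so that they sum to zero. The network size, pattern count, and dynamics below are illustrative assumptions, not the paper's actual simulation setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10  # network size and stored-pattern count (illustrative values)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product learning, no self-connections
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# Neuronally driven zero-sum normalization: each neuron subtracts the mean
# of its N-1 incoming efficacies, so its incoming synapses sum to zero.
row_mean = W.sum(axis=1, keepdims=True) / (N - 1)
W_norm = W - row_mean
np.fill_diagonal(W_norm, 0.0)

def recall(W, cue, steps=10):
    """Synchronous sign-threshold retrieval dynamics."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1  # break ties deterministically
    return s

# Retrieve the first stored pattern from a cue with 10% of its bits flipped
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
overlap = recall(W_norm, cue) @ patterns[0] / N  # 1.0 means perfect recall
```

Note that the normalization uses only information local to each neuron (its own incoming weights), which is what makes it a plausible neuronal, rather than synaptic, mechanism.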
| Original language | English |
| --- | --- |
| Pages (from-to) | 345-351 |
| Number of pages | 7 |
| Journal | Neurocomputing |
| Volume | 32-33 |
| DOIs | |
| State | Published - Jun 2000 |
| Externally published | Yes |
| Event | The 8th Annual Computational Neuroscience Meeting (CNS'99), Pittsburgh, PA, USA. Duration: 18 Jul 1999 → 22 Jul 1999 |
Keywords
- Hebbian learning
- Neuronal regulation
- Synaptic plasticity
- Weight normalization