Hebbian dreaming for small datasets

Elena Agliari, Francesco Alemanno, Miriam Aquaro, Adriano Barra, Fabrizio Durante, Ido Kanter

Research output: Contribution to journal › Article › peer-review


Abstract

The dreaming Hopfield model generalizes the Hebbian paradigm for neural networks: it performs on-line learning when “awake” and also accounts for off-line “sleeping” mechanisms. The latter have been shown to enhance storage in such a way that, in the long sleep-time limit, the model reaches the maximal storage capacity achievable by networks equipped with symmetric pairwise interactions. In this paper, we inspect the minimal amount of information that must be supplied to such a network to guarantee successful generalization, and we test it both on random synthetic datasets and on standard structured ones (i.e., MNIST, Fashion-MNIST and Olivetti). By comparing these minimal information thresholds with those required by the standard (i.e., always “awake”) Hopfield model, we prove that the present network can save up to ∼90% of the dataset size while preserving the same performance as its standard counterpart. This suggests that sleep may play a pivotal role in explaining the gap between the large volumes of data required to train artificial neural networks and the relatively small volumes needed by their biological counterparts. Further, we prove that the model’s Cost function (typically used in statistical mechanics) admits a representation in terms of a standard Loss function (typically used in machine learning), which allows us to analyze its emergent computational skills both theoretically and computationally: a quantitative picture of its capabilities as a function of its control parameters is achieved, and consistency between the two approaches is highlighted. The resulting network is an associative memory for pattern-recognition tasks that learns from examples on-line, generalizes correctly (in suitable regions of its control parameters) and optimizes its storage capacity by off-line sleeping: such a reduction of the training cost can be inspiring for sustainable AI and for situations where data are relatively sparse.
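The Cost–Loss correspondence mentioned in the abstract can be illustrated, for the standard Hopfield case, by a short derivation; this is a hedged sketch of the kind of rewriting referred to, not the paper’s exact statement. With Mattis magnetizations $m_\mu = \frac{1}{N}\sum_i \xi_i^\mu \sigma_i$ and quadratic losses $L_\mu^\pm = \frac{1}{2N}\sum_i (\xi_i^\mu \mp \sigma_i)^2 = 1 \mp m_\mu$, the Hebbian Cost function becomes, up to an additive constant, a sum of products of Losses:

\[
H_N(\sigma \mid \xi) \simeq -\frac{N}{2}\sum_{\mu=1}^{K} m_\mu^2
= -\frac{N}{2}\sum_{\mu=1}^{K}\bigl(1 - L_\mu^+ L_\mu^-\bigr),
\]

so minimizing the statistical-mechanics Cost amounts to minimizing the machine-learning Losses.

The sleeping mechanism itself can be sketched in code. The snippet below is a minimal NumPy illustration, assuming the dreaming coupling matrix $J(t) = \frac{1}{N}\,\xi^\top (1+t)(\mathbb{I} + tC)^{-1}\,\xi$, with pattern-overlap matrix $C_{\mu\nu} = \frac{1}{N}\sum_i \xi_i^\mu \xi_i^\nu$, known from the dreaming-Hopfield literature this paper builds on; the sizes and the sleep time t_sleep are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N, K = 200, 40        # neurons and stored patterns (illustrative sizes)
t_sleep = 100.0       # sleep time; t_sleep = 0 recovers the pure Hebbian rule

# Random binary patterns xi[mu, i] in {-1, +1}
xi = rng.choice([-1.0, 1.0], size=(K, N))

# Pattern-overlap matrix C and the dreaming kernel (1+t)(I + tC)^(-1)
C = xi @ xi.T / N
kernel = (1.0 + t_sleep) * np.linalg.inv(np.eye(K) + t_sleep * C)

# Dreaming coupling matrix: Hebbian at t=0, pseudo-inverse rule as t -> infinity
J = xi.T @ kernel @ xi / N
np.fill_diagonal(J, 0.0)

def relax(sigma, J, sweeps=20):
    """Zero-temperature asynchronous dynamics: align each spin with its local field."""
    for _ in range(sweeps):
        for i in rng.permutation(sigma.size):
            h = J[i] @ sigma
            sigma[i] = 1.0 if h >= 0 else -1.0
    return sigma

# Cue the network with a 10%-corrupted version of the first pattern
cue = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1.0

sigma = relax(cue, J)
print("overlap with the stored pattern:", sigma @ xi[0] / N)  # close to 1 on retrieval
```

At this load (K/N = 0.2), a purely Hebbian network (t_sleep = 0) sits above its critical capacity and retrieval degrades, whereas the “slept” couplings typically recover the pattern almost perfectly: this is the storage enhancement the abstract describes.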

Original language: English
Article number: 106174
Journal: Neural Networks
Volume: 173
DOIs
State: Published - May 2024

Bibliographical note

Publisher Copyright:
© 2024 The Author(s)

Funding

The Authors acknowledge financial support from the Ministero per gli Affari Esteri e la Collaborazione Internazionale (MAECI, Italy) and from the Ministry of Science, Technology and Space (MOST, Israel) through the shared grant for Scientific and Technical Collaboration between Israel and Italy, BULBUL (Brain Inspired Ultra-fast and Ultra-sharp machines for health-care), Project n. F85F21006230001. EA and MA are at Sapienza University of Rome, which does not belong to the BULBUL project, and they are grateful to Sapienza University of Rome (RM120172B8066CB0, RM12117A8590B3FA) for financial support. EA acknowledges financial support from the PNRR MUR project PE0000013-FAIR. FD and FA are at the University of Salento but do not belong to the BULBUL project; they are grateful to MUR for the PRIN grants “Stochastic Methods for Complex Systems” (n. 2017JFFHS) and “Statistical Mechanics of Learning Machines: from algorithmic and information-theoretical limits to new biologically inspired paradigms” (n. 20229T9EAT).

Funders and funder numbers:

• Ministero per gli Affari Esteri e la Collaborazione Internazionale
• PNRR MUR: PE0000013-FAIR
• Ministry of Science, Technology and Space: F85F21006230001
• Ministero dell’Istruzione, dell’Università e della Ricerca: 2017JFFHS, 20229T9EAT
• Sapienza Università di Roma: RM12117A8590B3FA, RM120172B8066CB0

Keywords

• Hebbian learning
• Hopfield model
• Sleeping phenomena
• Statistical mechanics
