Learning on tree architectures outperforms a convolutional feedforward network

Yuval Meir, Itamar Ben-Noam, Yarden Tzach, Shiri Hodassman, Ido Kanter

Research output: Contribution to journal › Article › peer-review


Abstract

Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, currently extended to hundreds, and are far from their biological realization. Their biologically implausible dynamics rely on the backpropagation technique, which changes a weight in a non-local manner, since the number of routes between an output unit and a weight is typically large. Here, a 3-layer tree architecture inspired by experimentally observed dendritic tree adaptations is developed and applied to offline and online learning of the CIFAR-10 database. The proposed architecture outperforms the achievable success rates of the 5-layer convolutional LeNet. Moreover, the highly pruned tree backpropagation of the proposed architecture, in which a single route connects an output unit and a weight, represents an efficient form of dendritic deep learning.
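To make the single-route idea concrete, the following is a minimal sketch, not the authors' Tree-3 implementation: a readout layer whose hidden units are partitioned into disjoint branches, one branch per output unit, so that every weight lies on exactly one route to an output unit. The class name TreeReadout, the branch sizes, and the use of PyTorch are illustrative assumptions.

    # Minimal sketch (assumed names and sizes, not the paper's exact architecture):
    # a "tree" readout in which each output unit reads a disjoint slice of the input,
    # so every weight sits on a single route to an output unit.
    import torch
    import torch.nn as nn

    class TreeReadout(nn.Module):
        def __init__(self, num_inputs: int, num_outputs: int):
            super().__init__()
            assert num_inputs % num_outputs == 0, "inputs must split evenly into branches"
            self.num_outputs = num_outputs
            self.branch_size = num_inputs // num_outputs
            # One weight vector per branch; no weight is shared between output units.
            self.weights = nn.Parameter(torch.randn(num_outputs, self.branch_size) * 0.01)
            self.bias = nn.Parameter(torch.zeros(num_outputs))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, num_inputs) -> (batch, num_outputs, branch_size)
            branches = x.view(x.shape[0], self.num_outputs, self.branch_size)
            # Each output unit sums only its own branch: one route per weight.
            return (branches * self.weights).sum(dim=-1) + self.bias

    # Usage: 640 hidden features split into 10 disjoint branches (one per CIFAR-10 class).
    readout = TreeReadout(num_inputs=640, num_outputs=10)
    logits = readout(torch.randn(32, 640))  # shape (32, 10)

Because no weight is shared across routes, backpropagating an output error touches only the weights on that single route, which is the efficiency property the abstract attributes to the pruned tree backpropagation.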

Original language: English
Article number: 962
Journal: Scientific Reports
Volume: 13
Issue number: 1
DOIs
State: Published - 30 Jan 2023

Bibliographical note

Publisher Copyright:
© 2023, The Author(s).

Funding

I.K. acknowledges partial financial support from the Israel Ministry of Science and Technology via the grant "Brain-inspired ultra-fast and ultra-sharp machines for AI-assisted healthcare", a collaboration between Italy and Israel. S.H. acknowledges the support of the Israel Ministry of Science and Technology.

Funders: Ministry of Science and Technology, Israel
