Gator: Customizable Channel Pruning of Neural Networks with Gating

Eli Passov, Eli O. David, Nathan S. Netanyahu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


The rise of neural network (NN) applications has prompted increased interest in compression, with a particular focus on channel pruning, which requires no additional hardware. Most pruning methods employ either single-layer operations or global schemes to determine which channels to remove, followed by fine-tuning of the network. In this paper we present Gator, a channel-pruning method that temporarily adds learned gating mechanisms for pruning individual channels. Gator is trained with an auxiliary loss aimed at reducing computational cost in terms of memory, theoretical speedup (measured in FLOPs), and practical, hardware-specific speedup. Gator introduces a new formulation of dependencies between NN layers which, in contrast to most previous methods, enables pruning of non-sequential parts, such as layers on ResNet's highway, and even the removal of entire ResNet blocks. Gator's pruning of ResNet-50 trained on ImageNet produces state-of-the-art (SOTA) results, such as a 50% FLOPs reduction with only a 0.4% drop in top-5 accuracy. Gator also outperforms previous pruning models in terms of GPU latency, running 1.4 times faster. Furthermore, Gator achieves improved top-5 accuracy compared to MobileNetV2 and SqueezeNet at similar runtimes.
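The abstract describes learned per-channel gates trained against a FLOPs-aware auxiliary loss. The following is a minimal illustrative sketch of that general idea, not the paper's implementation; all function and variable names here are hypothetical, and a real system would train the gate logits by backpropagation alongside the network weights.

```python
import numpy as np

def gate_channels(features, logits):
    """Scale each channel of a (C, H, W) feature map by a sigmoid gate.

    A gate near 0 marks the channel as prunable; a gate near 1 keeps it.
    """
    gates = 1.0 / (1.0 + np.exp(-logits))  # one learned gate per channel
    return features * gates[:, None, None], gates

def flops_aux_loss(gates, flops_per_channel, lam=1e-9):
    """Auxiliary cost: expected FLOPs of channels still open, scaled by lam.

    Adding this term to the task loss pressures low-value gates toward 0.
    """
    return lam * float(np.sum(gates * flops_per_channel))

# Toy 4-channel feature map; two gates driven open, two driven shut.
features = np.ones((4, 8, 8))
logits = np.array([5.0, -5.0, 5.0, -5.0])
gated, gates = gate_channels(features, logits)
loss = flops_aux_loss(gates, np.full(4, 1e6))
```

After training, channels whose gates settle near zero can be removed outright, and the gating modules themselves are discarded before fine-tuning, so no extra cost remains at inference time.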

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2021 – 30th International Conference on Artificial Neural Networks, Proceedings
Editors: Igor Farkaš, Paolo Masulli, Sebastian Otte, Stefan Wermter
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 13
ISBN (Print): 9783030863791
State: Published - 2021
Event: 30th International Conference on Artificial Neural Networks, ICANN 2021 - Virtual, Online
Duration: 14 Sep 2021 – 17 Sep 2021

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12894 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 30th International Conference on Artificial Neural Networks, ICANN 2021
City: Virtual, Online

Bibliographical note

Publisher Copyright:
© 2021, Springer Nature Switzerland AG.


