Neuron-based Network Pruning Based on Majority Voting
Ali Alqahtani, Xianghua Xie, Ehab Essa and Mark Jones
Abstract
The success of neural networks across a variety of applications has been accompanied by a dramatic increase in computational cost and memory requirements. In this paper, we propose an efficient method to simultaneously identify the critical neurons and prune the model during training, without any pre-training or fine-tuning procedures. Unlike existing methods, which accomplish this task in a greedy fashion, we propose a majority voting technique that compares activation values among neurons and assigns each neuron a voting score to quantitatively evaluate its importance. This mechanism effectively reduces model complexity by eliminating the less influential neurons, identifying during training a subset of the full model that can represent the reference network with far fewer parameters. Experimental results show that majority voting efficiently compresses the network with no drop in accuracy, pruning more than 79% of the original model parameters on CIFAR-10 and more than 91% on MNIST. Moreover, we show that with our proposed method, sparse models can be pruned further into even smaller models, removing more than 60% of the parameters whilst preserving the reference model's accuracy.
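To make the idea concrete, below is a minimal NumPy sketch of one plausible reading of the voting mechanism described in the abstract. The function names (majority_vote_scores, prune_mask), the per-sample median threshold, and the prune_fraction parameter are illustrative assumptions, not the authors' exact formulation; consult the paper for the precise comparison and scoring rules.

import numpy as np

def majority_vote_scores(activations):
    """Accumulate one vote per sample for every neuron whose activation
    exceeds that sample's layer-wise median (illustrative criterion).

    activations: (num_samples, num_neurons) array of post-activation values.
    Returns: (num_neurons,) array of vote counts.
    """
    # Per-sample median across the neurons of the layer.
    thresholds = np.median(activations, axis=1, keepdims=True)
    # Each sample votes for the neurons it activates above threshold.
    return (activations > thresholds).sum(axis=0)

def prune_mask(votes, prune_fraction=0.5):
    """Mask out the neurons with the fewest votes."""
    k = int(len(votes) * prune_fraction)
    order = np.argsort(votes)          # ascending: least-voted first
    mask = np.ones(len(votes), dtype=bool)
    mask[order[:k]] = False            # drop the k least influential neurons
    return mask

# Toy usage: 128 samples through a 64-neuron layer.
acts = np.maximum(np.random.randn(128, 64), 0.0)   # ReLU-like activations
votes = majority_vote_scores(acts)
mask = prune_mask(votes, prune_fraction=0.6)
print(f"kept {mask.sum()} of {len(mask)} neurons")

Since the paper scores and prunes during training, a mask like this would be applied and updated as training proceeds rather than computed once after the fact.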
DOI
10.1109/ICPR48806.2021.9412897
https://dx.doi.org/10.1109/ICPR48806.2021.9412897
Citation
Ali Alqahtani, Xianghua Xie, Ehab Essa and Mark Jones, Neuron-based Network Pruning Based on Majority Voting, 2020 25th International Conference on Pattern Recognition (ICPR), 2021, pp. 3090-3097. https://dx.doi.org/10.1109/ICPR48806.2021.9412897
BibTeX
@inproceedings{NetworkPruning,
  author    = {Ali Alqahtani and Xianghua Xie and Ehab Essa and Mark Jones},
  booktitle = {2020 25th International Conference on Pattern Recognition (ICPR)},
  title     = {Neuron-based Network Pruning Based on Majority Voting},
  pages     = {3090-3097},
  date      = {2021-01-10},
  doi       = {10.1109/ICPR48806.2021.9412897},
}