Article

Neuron pruning in temporal domain for energy efficient SNN processor design

Journal

FRONTIERS IN NEUROSCIENCE
Volume 17

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2023.1285914

Keywords

spiking neural network; approximation; computation reduction; input-dependent neuron pruning; neuromorphic


This paper proposes an input-dependent computation reduction approach that prunes relatively unimportant neurons to reduce computational complexity while maintaining high accuracy. The proposed neuron pruning scheme achieves a significant energy reduction and speed-up with an acceptable accuracy loss.
Recently, the accuracy of spiking neural networks (SNNs) has been significantly improved by mapping convolutional neural networks (CNNs) and their parameters to SNNs. Deep convolutional SNNs, however, require a large amount of computation, which is the major bottleneck for energy-efficient SNN processor design. In this paper, we present an input-dependent computation reduction approach, where relatively unimportant neurons are identified and pruned without seriously sacrificing accuracy. Specifically, a neuron pruning scheme in the temporal domain is proposed that prunes less important neurons and skips their future operations based on layer-wise pruning thresholds of the membrane voltages. To find the pruning thresholds, two pruning-threshold search algorithms are presented that can efficiently trade off accuracy and computational complexity for a given computation reduction ratio. The proposed neuron pruning scheme has been implemented in a 65 nm CMOS process. The SNN processor achieves a 57% energy reduction and a 2.68x speed-up, with up to 0.82% accuracy loss and 7.3% area overhead on the CIFAR-10 dataset.
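
The temporal pruning idea summarized in the abstract can be illustrated with a short sketch: at each timestep, neurons whose membrane voltage stays below a layer-wise pruning threshold are marked as pruned, and all of their remaining synaptic and membrane operations are skipped. The code below is a minimal illustration only, assuming a simple integrate-and-fire model with soft reset; the function name, the `prune_th` and `warmup` parameters, and the exact pruning condition are hypothetical and not the paper's formulation.

```python
import numpy as np

def run_layer_with_temporal_pruning(weights, in_spikes,
                                    v_th=1.0, prune_th=0.2, warmup=2):
    """Integrate-and-fire layer with input-dependent temporal neuron pruning.

    weights   : (n_in, n_out) synaptic weights (e.g., converted from a CNN).
    in_spikes : (T, n_in) binary input spike trains over T timesteps.
    v_th      : firing threshold of the IF neurons.
    prune_th  : layer-wise pruning threshold on the membrane voltage
                (illustrative value; the paper searches such thresholds).
    warmup    : timesteps to wait before pruning is allowed (illustrative).
    """
    T, n_in = in_spikes.shape
    n_out = weights.shape[1]
    v = np.zeros(n_out)                  # membrane voltages
    active = np.ones(n_out, dtype=bool)  # neurons not yet pruned
    out_spikes = np.zeros((T, n_out), dtype=np.uint8)
    ops_saved = 0

    for t in range(T):
        # Accumulate input only for active neurons; pruned neurons skip
        # all of their future synaptic and membrane-update operations.
        v[active] += in_spikes[t] @ weights[:, active]
        ops_saved += int(np.count_nonzero(~active)) * n_in

        fired = active & (v >= v_th)
        out_spikes[t, fired] = 1
        v[fired] -= v_th                 # soft reset after firing

        # Input-dependent pruning: after a short warm-up, neurons whose
        # membrane voltage remains below the layer-wise threshold are
        # deemed unimportant and excluded from all remaining timesteps.
        if t >= warmup:
            active &= (v >= prune_th) | fired

    return out_spikes, ops_saved
```

Tracking `ops_saved` in the sketch makes it easy to measure a computation reduction ratio per layer, which is the kind of quantity the paper's threshold search algorithms trade off against accuracy.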
