Article

Neuron pruning in temporal domain for energy efficient SNN processor design

Journal

FRONTIERS IN NEUROSCIENCE
Volume 17, Issue -, Pages -

Publisher

FRONTIERS MEDIA SA
DOI: 10.3389/fnins.2023.1285914

Keywords

spiking neural network; approximation; computation reduction; input-dependent neuron pruning; neuromorphic

Abstract

This paper proposes an input-dependent computation reduction approach that prunes relatively unimportant neurons, reducing computational complexity while maintaining high accuracy. The proposed neuron pruning scheme achieves a significant energy reduction and speedup with acceptable accuracy loss.

Recently, the accuracy of spiking neural networks (SNNs) has been significantly improved by deploying convolutional neural network (CNN) architectures and their parameters in SNNs. Deep convolutional SNNs, however, require large amounts of computation, which is the major bottleneck in energy-efficient SNN processor design. In this paper, we present an input-dependent computation reduction approach in which relatively unimportant neurons are identified and pruned without seriously sacrificing accuracy. Specifically, we propose neuron pruning in the temporal domain, which prunes less important neurons and skips their future operations based on layer-wise pruning thresholds on membrane voltages. To find the pruning thresholds, two threshold search algorithms are presented that efficiently trade off accuracy and computational complexity for a given computation reduction ratio. The proposed neuron pruning scheme has been implemented in a 65 nm CMOS process. The SNN processor achieves a 57% energy reduction and a 2.68x speedup, with up to 0.82% accuracy loss and 7.3% area overhead on the CIFAR-10 dataset.
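The abstract describes the scheme only at a high level. The sketch below is a minimal illustration of one plausible reading: an integrate-and-fire layer in which, at a fixed checkpoint timestep, neurons whose membrane voltage sits below a layer-wise threshold are pruned, and all of their remaining-timestep work is skipped. Every name here (`run_layer_with_pruning`, `prune_timestep`, the soft-reset dynamics) is an assumption for illustration, not the paper's implementation.

```python
import numpy as np

def run_layer_with_pruning(inputs, weights, fire_threshold,
                           prune_timestep, prune_threshold, num_timesteps):
    """Simulate one integrate-and-fire layer with temporal-domain pruning.

    At timestep `prune_timestep`, neurons whose membrane voltage is still
    below the layer-wise `prune_threshold` are marked inactive, so their
    integrate/fire computations in all later timesteps are skipped.
    (Hypothetical sketch; the paper's exact rule may differ.)
    """
    num_neurons = weights.shape[0]
    v_mem = np.zeros(num_neurons)                 # membrane potentials
    active = np.ones(num_neurons, dtype=bool)     # pruned neurons -> False
    spikes_out = np.zeros((num_timesteps, num_neurons))

    for t in range(num_timesteps):
        # Integrate input spikes only for neurons that are still active.
        v_mem[active] += weights[active] @ inputs[t]

        # Fire and soft-reset by subtracting the firing threshold.
        fired = active & (v_mem >= fire_threshold)
        spikes_out[t, fired] = 1.0
        v_mem[fired] -= fire_threshold

        # Temporal pruning: neurons that accumulated too little voltage
        # by the checkpoint are unlikely to matter, so drop them.
        if t == prune_timestep:
            active &= (v_mem >= prune_threshold)

    return spikes_out

# Usage under the same assumptions: Poisson-like binary input spikes.
rng = np.random.default_rng(0)
T, n_in, n_out = 8, 64, 32
inputs = (rng.random((T, n_in)) < 0.2).astype(float)
weights = rng.normal(0.0, 0.3, (n_out, n_in))
spikes = run_layer_with_pruning(inputs, weights, 1.0, 3, 0.1, T)
```

The abstract also mentions two pruning threshold search algorithms without detailing them. As a stand-in, and assuming the measured computation reduction grows monotonically with the threshold, a per-layer binary search against a target reduction ratio could look like this (`eval_reduction` is a hypothetical helper that returns the skipped-operation ratio on a calibration set):

```python
def calibrate_threshold(eval_reduction, target_ratio,
                        lo=0.0, hi=1.0, iters=20):
    """Binary-search the smallest pruning threshold whose measured
    computation reduction meets `target_ratio` (illustrative only)."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if eval_reduction(mid) < target_ratio:
            lo = mid   # too few operations skipped: prune more aggressively
        else:
            hi = mid   # target met: try a smaller (gentler) threshold
    return hi
```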
