4.6 Article

Structured feature sparsity training for convolutional neural network compression

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jvcir.2020.102867

Keywords

Convolutional neural network; CNN compression; Structured sparsity; Pruning criterion

Funding

  1. National Key Research and Development Program of China [2016YFB1200401]

Abstract

Convolutional neural networks (CNNs) with large model sizes and heavy computing requirements are difficult to deploy on embedded systems such as smartphones or AI cameras. In this paper, we propose a novel structured pruning method, termed structured feature sparsity training (SFST), to speed up inference and reduce the memory usage of CNNs. Unlike existing pruning methods, which require multiple iterations of pruning and retraining to maintain stable performance, SFST only fine-tunes the pretrained model with an additional regularization term on the less important features and then prunes them in a single pass; no further pruning-and-retraining cycles are needed. SFST can be applied to a variety of modern CNN architectures, including VGGNet, ResNet, and MobileNetv2. Experimental results on the CIFAR, SVHN, ImageNet, and MSTAR benchmark datasets demonstrate the effectiveness of our scheme, which achieves superior performance over state-of-the-art methods.
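The core idea described above (fine-tuning with a regularizer applied only to the less important features, followed by a single pruning pass) can be sketched in a few lines. The snippet below is a minimal PyTorch illustration under assumed conventions, not the paper's exact formulation: it takes the magnitude of batch-normalization scale factors as the channel-importance criterion and applies an L1 penalty to the weakest channels; `sparsity_penalty`, `prune_ratio`, and `lam` are illustrative names.

```python
import torch
import torch.nn as nn


def sparsity_penalty(model: nn.Module, prune_ratio: float = 0.3,
                     lam: float = 1e-4) -> torch.Tensor:
    """Illustrative L1 penalty on the least-important BN scale factors.

    Channel importance is approximated by |gamma|; the bottom
    `prune_ratio` fraction of channels in each BatchNorm layer receives
    an extra L1 term, driving those scales toward zero so the
    corresponding feature maps can later be removed with little
    accuracy loss.
    """
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            gamma = m.weight.abs()
            k = int(prune_ratio * gamma.numel())
            if k > 0:
                # the k smallest |gamma| values mark the weakest channels
                weak, _ = torch.topk(gamma, k, largest=False)
                penalty = penalty + weak.sum()
    return lam * penalty


# Fine-tuning: the task loss plus the selective sparsity penalty.
# After this single fine-tuning pass, channels whose scale factors
# collapsed below a small threshold are pruned in one shot; there are
# no repeated prune/retrain cycles.
#
#   loss = criterion(model(x), y) + sparsity_penalty(model)
#   loss.backward()
```

Because the penalty touches only the channels already judged unimportant, the surviving channels train undisturbed, which is one plausible reading of why a single fine-tune-then-prune pass can suffice.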
