Article

DAIS: Automatic Channel Pruning via Differentiable Annealing Indicator Search

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/TNNLS.2022.3161284

Keywords

Annealing; Computational modeling; Computer architecture; Optimization; Costs; Task analysis; Search problems; Automatic channel pruning; annealing-relaxed channel indicator; differentiable indicator search; model compression


This article introduces a channel pruning method for neural networks based on differentiable annealing indicator search (DAIS), which can automatically search for an effective pruned model with given constraints on computation overhead. DAIS relaxes the binarized channel indicators to be continuous and jointly learns both indicators and model parameters through bi-level optimization. DAIS also proposes an annealing-based procedure and various regularizations to control the pruning sparsity and improve model performance.
The convolutional neural network (CNN) has achieved great success in computer vision tasks despite a large computation overhead that hinders efficient deployment. Channel pruning is commonly applied to reduce model redundancy while preserving the network structure, so that the pruned network can be easily deployed in practice. However, existing channel pruning methods rely on hand-crafted rules, which can lead to degraded model performance given the enormous pruning search space of large neural networks. In this article, we introduce differentiable annealing indicator search (DAIS), which leverages the strength of neural architecture search in channel pruning and automatically searches for an effective pruned model under given constraints on computation overhead. Specifically, DAIS relaxes the binarized channel indicators to be continuous and then jointly learns both the indicators and the model parameters via bi-level optimization. To bridge the non-negligible discrepancy between the continuous model and the target binarized model, DAIS proposes an annealing-based procedure to steer the indicator convergence toward binarized states. Moreover, DAIS designs various regularizations based on a priori structural knowledge to control the pruning sparsity and to improve model performance. Experimental results show that DAIS outperforms state-of-the-art pruning methods on CIFAR-10, CIFAR-100, and ImageNet.
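The core idea of an annealing-relaxed channel indicator can be illustrated with a minimal sketch. This is a hypothetical simplification, assuming a sigmoid relaxation with a temperature parameter; the exact relaxation and annealing schedule used in DAIS may differ. As the temperature is annealed toward zero, the continuous indicator for each channel is pushed toward the binarized states {0, 1}, shrinking the discrepancy between the relaxed model and the target pruned model.

```python
import math

def relaxed_indicator(alphas, temperature):
    """Map one learnable logit per channel to a continuous keep-probability
    in (0, 1) via sigmoid(alpha / T). Lower temperatures sharpen the sigmoid,
    driving the indicators toward binary keep/prune decisions.
    (Illustrative sketch only, not the exact DAIS formulation.)"""
    return [1.0 / (1.0 + math.exp(-a / temperature)) for a in alphas]

# Three channels with learnable logits: strongly negative (prune),
# near zero (undecided), strongly positive (keep).
alphas = [-2.0, 0.5, 3.0]

soft = relaxed_indicator(alphas, temperature=1.0)   # early training: smooth, differentiable
hard = relaxed_indicator(alphas, temperature=0.05)  # after annealing: near-binary

print(soft)  # intermediate values strictly inside (0, 1)
print(hard)  # values very close to 0 or 1
```

In a full pipeline, these indicators would scale each channel's output during training, and channels whose indicators converge near zero would be removed at deployment time.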

