Journal
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Volume 45, Issue 9, Pages 10974-10989
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2023.3268675
Keywords
Hessian-aided regularization; neural network pruning; single image super-resolution; structured sparsity
This article presents a method called Global Aligned Structured Sparsity Learning (GASSL) to tackle the problem of efficient image super-resolution (SR). The method includes two major components: Hessian-Aided Regularization (HAIR) and Aligned Structured Sparsity Learning (ASSL). GASSL outperforms other recent methods in terms of efficiency, as demonstrated by extensive results.
Efficient image super-resolution (SR) has witnessed rapid progress thanks to novel lightweight architectures and model compression techniques (e.g., neural architecture search and knowledge distillation). Nevertheless, these methods consume considerable resources and/or neglect to squeeze out network redundancy at the finer-grained level of individual convolution filters. Network pruning is a promising alternative to overcome these shortcomings. However, structured pruning is known to be tricky when applied to SR networks because the extensive residual blocks require the pruned indices of different layers to be identical. Besides, the principled determination of proper layerwise sparsities also remains challenging. In this article, we present Global Aligned Structured Sparsity Learning (GASSL) to resolve these problems. GASSL has two major components: Hessian-Aided Regularization (HAIR) and Aligned Structured Sparsity Learning (ASSL). HAIR is a regularization-based sparsity auto-selection algorithm that takes the Hessian into account implicitly. A proven proposition is introduced to justify its design. ASSL is for physically pruning SR networks. In particular, a new penalty term, Sparsity Structure Alignment (SSA), is proposed to align the pruned indices of different layers. With GASSL, we design two new efficient single image SR networks of different architecture genres, pushing the efficiency envelope of SR models one step forward. Extensive results demonstrate the merits of GASSL over other recent counterparts.
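The alignment problem the abstract describes can be made concrete with a small sketch. When two convolution layers feed the same elementwise residual addition, pruning channel 2 in one layer but channel 3 in the other breaks the addition; an alignment penalty discourages such disagreement. The code below is a minimal illustrative sketch of this idea, not the authors' SSA implementation: the top-k-by-L1-norm selection rule and the mismatch penalty are hypothetical stand-ins for the paper's learned scaling factors.

```python
import numpy as np

def select_keep_indices(filter_norms, keep_ratio):
    """Pick the top-k filters by L1 norm (hypothetical selection rule)."""
    k = max(1, int(len(filter_norms) * keep_ratio))
    return set(np.argsort(filter_norms)[-k:].tolist())

def alignment_penalty(norms_a, norms_b, keep_ratio):
    """SSA-like penalty (illustrative only).

    Sum the norms of filters whose kept/pruned status disagrees between
    two residual-coupled layers; driving this term to zero forces both
    layers to keep the same channel indices, so the residual addition
    stays valid after physical pruning.
    """
    keep_a = select_keep_indices(norms_a, keep_ratio)
    keep_b = select_keep_indices(norms_b, keep_ratio)
    mismatched = keep_a ^ keep_b  # indices kept by only one layer
    return float(sum(norms_a[i] + norms_b[i] for i in mismatched))

# Two layers feeding the same residual addition, 6 filters each.
norms_a = np.array([0.9, 0.1, 0.8, 0.05, 0.7, 0.2])
norms_b = np.array([0.85, 0.15, 0.1, 0.9, 0.75, 0.05])
print(alignment_penalty(norms_a, norms_b, keep_ratio=0.5))
```

With a keep ratio of 0.5, layer A keeps filters {0, 2, 4} while layer B keeps {0, 3, 4}, so the penalty charges filters 2 and 3; when both layers already agree on their kept indices, the penalty is zero.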