Article

Kernel-attended residual network for single image super-resolution

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 213

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2020.106663

Keywords

Single image super-resolution; Convolution neural network; Deep learning; Neural network; Attention mechanism; Learning-based method; Kernel attention

Funding

  1. NSFC, China [61701391, 61872286, 61772407]
  2. ShaanXi Province, China [2018JM6092]


The paper addresses single image super-resolution, a fundamental low-level computer vision task, and proposes a Kernel-Attended Residual Network (KARN) to overcome the shortcomings of existing methods. KARN strengthens feature fusion and feature representation and extracts richer information, achieving notably better performance than existing methods on benchmark datasets.
Single image super-resolution (SISR) is an important low-level computer vision task. With the development of deep convolutional neural networks (CNNs), recent CNN-based approaches have outperformed traditional methods in the SISR field. However, these methods may suffer from weak representational power and overly smoothed textures. To handle these problems, we propose a Kernel-Attended Residual Network (KARN). KARN improves both feature fusion and feature representation. Specifically, we present a multi-channel fusion block (MCFB) to restore rich textural feature information, and a kernel-attended block (KAB) to improve the representational power of our network with multiple kernels. In addition, we present a space-feature re-calibration block (SFRB) to re-calibrate features in the spatial dimension. Owing to the richer information it extracts, KARN achieves more notable performance than state-of-the-art methods when evaluated on benchmark datasets. (C) 2020 Elsevier B.V. All rights reserved.
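
Based only on the abstract above, the following is a minimal, hypothetical PyTorch sketch of how a kernel-attended block that combines multiple kernel sizes with channel-wise attention might be implemented. The class name, branch kernel sizes, and squeeze-and-excite style gate are assumptions for illustration; this does not reproduce the authors' exact KARN, MCFB, or SFRB designs.

# Hypothetical multi-kernel attention block (illustrative, not the paper's architecture).
import torch
import torch.nn as nn

class KernelAttentionBlock(nn.Module):
    """Parallel convolutions with different kernel sizes, fused by
    channel-wise attention weights and a residual connection."""

    def __init__(self, channels: int, kernel_sizes=(3, 5, 7), reduction: int = 4):
        super().__init__()
        # One convolution branch per kernel size; padding keeps spatial size fixed.
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in kernel_sizes
        )
        # Gate that scores each branch per channel from global statistics of the input.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels * len(kernel_sizes), 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.stack([b(x) for b in self.branches], dim=1)   # (B, K, C, H, W)
        b, k, c, h, w = feats.shape
        # Softmax over the K branches yields per-channel fusion weights.
        weights = self.gate(x).view(b, k, c, 1, 1).softmax(dim=1)
        fused = (feats * weights).sum(dim=1)                        # weighted branch fusion
        return x + fused                                            # residual connection

if __name__ == "__main__":
    block = KernelAttentionBlock(channels=64)
    out = block(torch.randn(1, 64, 48, 48))
    print(out.shape)  # torch.Size([1, 64, 48, 48])

In this sketch, the attention weights let the block emphasize whichever receptive-field size best matches the local texture, which is one plausible reading of the multi-kernel attention idea described in the abstract.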

