Proceedings Paper

KernelNet: A Blind Super-Resolution Kernel Estimation Network

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/CVPRW53098.2021.00056


Abstract
Recently developed deep neural network methods have achieved remarkable performance on the Super Resolution (SR) problem when applied to Low Resolution (LR) images that are obtained from High Resolution (HR) images through an ideal, predefined downsampling process, i.e., convolution with a known blurring kernel followed by subsampling (e.g., bicubic). However, when these algorithms are applied to real-world images whose downsampling pattern is unknown, unlike synthetically generated LR-HR image pairs, their performance drops drastically. The blind SR problem can be defined as real-world image SR in which the downsampling blurring kernel (the SR kernel) is unknown. Recent SR kernel estimation techniques such as KernelGAN have shown promising results in this direction; however, their limited recovery performance and high computational complexity make them unsuitable for real-time usage, such as applications in mobile cameras. This paper proposes a modular and interpretable neural network structure, KernelNet, for the blind SR kernel estimation problem. The proposed model outperforms the state-of-the-art SR kernel estimator, KernelGAN, by a significant margin in SR kernel reconstruction accuracy. Moreover, to the best of our knowledge, the proposed algorithm is the first that can estimate the SR kernel in real time, running O(1k) times faster than KernelGAN.
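The degradation model the abstract describes (convolution with an SR kernel followed by subsampling) can be sketched as follows. This is an illustration of the general model, not the paper's code; the box kernel stands in for the unknown SR kernel, and all names are my own.

```python
import numpy as np

def degrade(hr: np.ndarray, kernel: np.ndarray, scale: int) -> np.ndarray:
    """LR = (HR * k) downsampled by `scale`: blur with `kernel` (edge padding,
    'same' output size), then keep every `scale`-th pixel in each dimension.
    Cross-correlation is used; for symmetric kernels this equals convolution."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(hr, ((ph, ph), (pw, pw)), mode="edge")
    blurred = np.empty_like(hr, dtype=float)
    for i in range(hr.shape[0]):
        for j in range(hr.shape[1]):
            blurred[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return blurred[::scale, ::scale]  # subsampling step

# Example: an 8x8 "HR image", a 3x3 box kernel (normalized to sum to 1),
# and a x2 downscale, yielding a 4x4 LR image.
hr = np.arange(64, dtype=float).reshape(8, 8)
k = np.ones((3, 3)) / 9.0
lr = degrade(hr, k, scale=2)
print(lr.shape)  # (4, 4)
```

Blind SR kernel estimation is the inverse task: given only `lr`, recover `k` so that a non-blind SR method can invert the degradation.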

