Article

Depth super-resolution from explicit and implicit high-frequency features

Journal

COMPUTER VISION AND IMAGE UNDERSTANDING
Volume 237

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.cviu.2023.103841

Keywords

Guided depth super-resolution; CNN; Transformer; Multi-scale; High-frequency information


Guided depth super-resolution aims to recover a high-resolution depth map from a low-resolution depth map and an associated high-resolution RGB image. However, restoring precise, sharp edges near depth discontinuities and fine structures remains challenging for state-of-the-art methods. To alleviate this issue, we propose a novel multi-stage depth super-resolution network that progressively reconstructs high-resolution depth maps from explicit and implicit high-frequency information. We introduce an efficient transformer to obtain explicit high-frequency information: the transformer's shape bias and global context allow our model to focus on high-frequency details between objects, i.e., depth discontinuities, rather than on texture within objects. Furthermore, we project the input color images into the frequency domain to extract additional implicit high-frequency cues. Finally, to incorporate structural details, we develop a fusion strategy that combines depth features and high-frequency information in a multi-stage, multi-scale framework. Extensive experiments on the main benchmarks show that our approach establishes a new state of the art. Code will be publicly available at https://github.com/wudiqx106/DSR-EI.
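To make the "implicit high-frequency cues" idea concrete: projecting the guidance image into the frequency domain and suppressing its low-frequency band leaves mostly edge and fine-structure content. The sketch below is not the authors' network component; it is a minimal NumPy illustration of this general frequency-domain high-pass step, where the function name and the `cutoff_ratio` parameter are hypothetical.

```python
import numpy as np

def highpass_frequency_cues(image, cutoff_ratio=0.1):
    """Extract high-frequency cues from a (H, W) guidance channel by
    zeroing a low-frequency band in the Fourier domain.

    `cutoff_ratio` (hypothetical parameter) sets the radius of the
    suppressed low-frequency disc relative to the image size.
    """
    H, W = image.shape
    # 2-D FFT with the DC component shifted to the center.
    F = np.fft.fftshift(np.fft.fft2(image))
    # Circular mask covering the low-frequency band around DC.
    yy, xx = np.mgrid[0:H, 0:W]
    cy, cx = H // 2, W // 2
    radius = cutoff_ratio * min(H, W)
    lowpass = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    # Remove low frequencies; what remains encodes edges and detail.
    F[lowpass] = 0
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# A constant image carries only the DC component, so its cues vanish,
# while a step edge retains energy near the discontinuity.
flat = np.ones((32, 32))
edge = np.zeros((32, 32)); edge[:, 16:] = 1.0
flat_cues = highpass_frequency_cues(flat)
edge_cues = highpass_frequency_cues(edge)
```

In a learned pipeline such a high-pass map would be one input among several, fused with depth features rather than used directly.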

