Proceedings Paper

Coarse-to-Fine Sparse Transformer for Hyperspectral Image Reconstruction

Journal

COMPUTER VISION - ECCV 2022, PT XVII
Volume 13677, Pages 686-704

Publisher

SPRINGER INTERNATIONAL PUBLISHING AG
DOI: 10.1007/978-3-031-19790-1_41

Keywords

Compressive imaging; Transformer; Image restoration

Funding

  1. NSFC fund [61831014]
  2. Shenzhen Science and Technology Project [JSGG20210802153150005, CJGJZD20200617102601004]
  3. Westlake Foundation [2021B1501-2]

This paper proposes a novel Transformer-based method that embeds sparsity in deep learning for hyperspectral imaging reconstruction. By using a coarse-to-fine strategy, the method utilizes a spectra-aware screening mechanism and a spectra-aggregation hashing multi-head self-attention to achieve high-quality results in HSI reconstruction.
Many learning-based algorithms have been developed to solve the inverse problem of coded aperture snapshot spectral imaging (CASSI). However, CNN-based methods show limitations in capturing long-range dependencies, while previous Transformer-based methods densely sample tokens, some of which are uninformative, and compute multi-head self-attention (MSA) between tokens that are unrelated in content. In this paper, we propose a novel Transformer-based method, the coarse-to-fine sparse Transformer (CST), which is the first to embed HSI sparsity into deep learning for HSI reconstruction. In particular, CST uses our proposed spectra-aware screening mechanism (SASM) for coarse patch selection. The selected patches are then fed into our customized spectra-aggregation hashing multi-head self-attention (SAH-MSA) for fine pixel clustering and self-similarity capturing. Comprehensive experiments show that our CST significantly outperforms state-of-the-art methods at lower computational cost. https://github.com/caiyuanhao1998/MST
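The core idea behind SAH-MSA — restricting self-attention to tokens that hash into the same bucket, rather than attending densely — can be illustrated with a toy sketch. The snippet below is not the paper's actual SAH-MSA implementation; the function name, the random-projection hash, and the shared Q/K/V are simplifying assumptions made purely for illustration.

```python
import numpy as np

def hash_bucket_attention(tokens, n_buckets=4, seed=0):
    """Toy sparse self-attention: tokens are hashed into buckets via a
    random projection, and softmax attention is computed only among
    tokens sharing a bucket (illustrative only, not the paper's code)."""
    rng = np.random.default_rng(seed)
    n, d = tokens.shape
    # Random-projection hashing: bucket = index of the strongest projection.
    proj = rng.standard_normal((d, n_buckets))
    buckets = np.argmax(tokens @ proj, axis=1)
    out = np.zeros_like(tokens)
    for b in range(n_buckets):
        idx = np.where(buckets == b)[0]
        if idx.size == 0:
            continue
        q = k = v = tokens[idx]               # shared Q/K/V for brevity
        scores = q @ k.T / np.sqrt(d)         # scaled dot-product
        w = np.exp(scores - scores.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)     # softmax over bucket members
        out[idx] = w @ v                      # attend within the bucket only
    return out, buckets

x = np.random.default_rng(1).standard_normal((16, 8))
y, buckets = hash_bucket_attention(x)
print(y.shape)
```

Because each token attends only to the members of its own bucket, cost drops from quadratic in the token count to quadratic in the (much smaller) bucket size — the source of the "cheaper computational cost" claimed in the abstract.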

Authors

