Article

UMAG-Net: A New Unsupervised Multiattention-Guided Network for Hyperspectral and Multispectral Image Fusion

Publisher

IEEE (Institute of Electrical and Electronics Engineers Inc.)
DOI: 10.1109/JSTARS.2021.3097178

Keywords

Tensors; Image fusion; Hyperspectral imaging; Spatial resolution; Feature extraction; Image reconstruction; Dictionaries; Deep learning; hyperspectral images (HSIs); image fusion; multispectral images (MSIs)

Funding

  1. Natural Science Foundation of Hebei Province [F2020201025, F2019201151, F2018210148]
  2. Science Research Project of Hebei Province [BJ2020030, QN2017306]
  3. National Natural Science Foundation of China [61572063, 62172003]

An unsupervised multiattention-guided network named UMAG-Net is proposed for HSI-MSI fusion without training data, achieving image reconstruction with high spatial and spectral resolution.
To reconstruct images with both high spatial and high spectral resolution, one of the most common approaches is to fuse a low-resolution hyperspectral image (HSI) with a high-resolution (HR) multispectral image (MSI) of the same scene. Deep learning has been widely applied to HSI-MSI fusion, although such approaches are constrained by hardware limitations. To break these limits, we construct an unsupervised multiattention-guided network, UMAG-Net, that accomplishes HSI-MSI fusion without training data. UMAG-Net first extracts deep multiscale features from the MSI using a multiattention encoding network. Then, a loss function defined over a single HSI-MSI pair is used to iteratively update the parameters of UMAG-Net and learn prior knowledge of the fused image. Finally, a multiscale feature-guided network is constructed to generate an HR-HSI. Experimental results show the visual and quantitative superiority of the proposed method over competing methods.
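The loss described in the abstract ties the fused estimate to the observed HSI-MSI pair. A minimal NumPy sketch of such a data-fidelity loss is shown below; the function names, the average-pool spatial degradation, and the spectral-response matrix `srf` are illustrative assumptions, not the paper's actual operators or implementation:

```python
import numpy as np

def spatial_downsample(z, factor):
    # Average-pool each band by `factor` -- a simple stand-in for the
    # sensor blur + decimation that relates the HR-HSI to the observed HSI.
    c, h, w = z.shape
    return z.reshape(c, h // factor, factor, w // factor, factor).mean(axis=(2, 4))

def spectral_project(z, srf):
    # srf: (n_msi_bands, n_hsi_bands) spectral response matrix relating
    # the HR-HSI to the observed MSI.
    return np.einsum('mc,chw->mhw', srf, z)

def fusion_loss(z, hsi, msi, srf, factor):
    # Two data-fidelity terms: the fused estimate `z` must reproduce the
    # low-resolution HSI when spatially degraded, and the MSI when
    # spectrally projected.
    spatial_term = np.mean((spatial_downsample(z, factor) - hsi) ** 2)
    spectral_term = np.mean((spectral_project(z, srf) - msi) ** 2)
    return spatial_term + spectral_term
```

In an unsupervised, training-data-free setting, a loss of this form over the single observed HSI-MSI pair is what drives the iterative parameter updates: the network itself acts as the prior on the fused image.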
