Article

NestFuse: An Infrared and Visible Image Fusion Architecture Based on Nest Connection and Spatial/Channel Attention Models

Journal

IEEE Transactions on Instrumentation and Measurement
Volume 69, Issue 12, Pages 9645-9656

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/TIM.2020.3005230

Keywords

Feature extraction; Image fusion; Training; Task analysis; Decoding; Fuses; Data mining; Attention model; image fusion; infrared image; nest connection; nuclear-norm; visible image

Funding

  1. National Key Research and Development Program of China [2017YFC1601800]
  2. National Natural Science Foundation of China [61672265, U1836218]
  3. 111 Project of Ministry of Education of China [B12018]

Abstract

In this article, we propose a novel method for infrared and visible image fusion, in which we develop a nest connection-based network and spatial/channel attention models. The nest connection-based network preserves significant amounts of information from the input data from a multiscale perspective. The approach comprises three key parts: an encoder, a fusion strategy, and a decoder. In the proposed fusion strategy, spatial attention models and channel attention models describe the importance of each spatial position and of each channel in the deep features. First, the source images are fed into the encoder to extract multiscale deep features. The novel fusion strategy then fuses these features at each scale. Finally, the fused image is reconstructed by the nest connection-based decoder. Experiments on publicly available data sets show, in both subjective and objective evaluations, that the proposed approach achieves better fusion performance than other state-of-the-art methods. The code of our fusion method is available at https://github.com/hli1221/imagefusion-nestfuse.
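As a concrete illustration of the fusion step described above, the minimal PyTorch sketch below shows one plausible reading of the two attention schemes: channel-wise l1-norm saliency for spatial attention, and per-channel nuclear-norm saliency for channel attention (echoing the "nuclear-norm" keyword). The function names, the relative-weight normalization, and the final averaging of the two schemes are illustrative assumptions, not the authors' released implementation; see the GitHub repository linked in the abstract for the actual code.

```python
import torch


def _relative_weights(s_a, s_b, eps=1e-8):
    # Normalize the two saliency maps into relative weights that sum to 1.
    return s_a / (s_a + s_b + eps), s_b / (s_a + s_b + eps)


def spatial_attention_fusion(phi_ir, phi_vis):
    """Weight each spatial position by a channel-wise l1-norm saliency.

    phi_ir, phi_vis: deep features of shape (B, C, H, W) from one encoder scale.
    """
    s_ir = phi_ir.abs().sum(dim=1, keepdim=True)   # (B, 1, H, W) saliency
    s_vis = phi_vis.abs().sum(dim=1, keepdim=True)
    w_ir, w_vis = _relative_weights(s_ir, s_vis)
    return w_ir * phi_ir + w_vis * phi_vis


def channel_attention_fusion(phi_ir, phi_vis):
    """Weight each channel by the nuclear norm of its H x W slice."""
    def channel_saliency(phi):
        b, c, h, w = phi.shape
        # Nuclear norm (sum of singular values) of every channel, batched.
        s = torch.linalg.matrix_norm(phi.reshape(b * c, h, w), ord="nuc")
        return s.reshape(b, c, 1, 1)

    w_ir, w_vis = _relative_weights(channel_saliency(phi_ir),
                                    channel_saliency(phi_vis))
    return w_ir * phi_ir + w_vis * phi_vis


# Hypothetical usage at one encoder scale: combine both attention results
# before handing the fused features to the decoder at that scale.
if __name__ == "__main__":
    phi_ir = torch.randn(1, 64, 32, 32)
    phi_vis = torch.randn(1, 64, 32, 32)
    fused = 0.5 * (spatial_attention_fusion(phi_ir, phi_vis)
                   + channel_attention_fusion(phi_ir, phi_vis))
    print(fused.shape)  # torch.Size([1, 64, 32, 32])
```

In this reading, the same fusion is applied independently at every encoder scale, and the nest connection-based decoder then reconstructs the fused image from the per-scale fused features.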
