Article

Image fusion based on generative adversarial network consistent with perception

Journal

INFORMATION FUSION
Volume 72, Pages 110-125

Publisher

ELSEVIER
DOI: 10.1016/j.inffus.2021.02.019

Keywords

Image fusion; Generative adversarial networks; Dense block; Infrared image; Visible image

Funding

  1. National Natural Science Foundation of China [62020106012, U1836218, 61672265]
  2. 111 Project of Ministry of Education of China [B12018]


This paper proposes a new fusion method based on dense blocks and a GAN to improve the performance of deep learning fusion networks, using structural similarity and gradient loss functions for training. Experimental results show that fused images obtained with this method score well on standard evaluation metrics and have better visual quality.
Deep learning is a rapidly developing approach in the field of infrared and visible image fusion. In this context, the use of dense blocks in deep networks significantly improves the utilization of shallow-layer information, and the Generative Adversarial Network (GAN) framework further improves the fusion performance for the two source images. We propose a new method based on dense blocks and GANs, and we directly feed the input visible-light image into each layer of the entire network. We use structural similarity and gradient loss functions, which are more consistent with human perception, instead of the mean square error loss. After adversarial training between the generator and the discriminator, a trained end-to-end fusion network (the generator network) is finally obtained. Our experiments show that the fused images obtained by our approach achieve good scores on multiple evaluation metrics. Further, our fused images show better visual quality in multiple sets of comparisons and are more satisfying to human visual perception.
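The perception-consistent losses mentioned in the abstract (structural similarity plus a gradient term, replacing mean square error) can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions: the global (window-free) SSIM simplification, the function names, and the loss weights are illustrative, not the authors' exact formulation.

```python
import numpy as np

def ssim_global(x, y, c1=0.01**2, c2=0.03**2):
    # Simplified global SSIM computed over the whole image (no sliding
    # window), assuming inputs are scaled to [0, 1]. Returns 1.0 for
    # identical images.
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

def gradient_loss(fused, source):
    # Mean L1 distance between horizontal and vertical finite-difference
    # gradients of the fused image and a source image; zero when the
    # gradient fields match exactly.
    gx = np.abs(np.diff(fused, axis=1) - np.diff(source, axis=1)).mean()
    gy = np.abs(np.diff(fused, axis=0) - np.diff(source, axis=0)).mean()
    return gx + gy

def perceptual_fusion_loss(fused, ir, vis, w_ssim=1.0, w_grad=10.0):
    # Hypothetical combination: SSIM dissimilarity to both source images
    # plus a gradient term toward the visible image (which typically
    # carries more texture detail). Weights are illustrative only.
    l_ssim = (1 - ssim_global(fused, ir)) + (1 - ssim_global(fused, vis))
    return w_ssim * l_ssim + w_grad * gradient_loss(fused, vis)
```

In a GAN training loop, a term of this kind would be added to the generator's adversarial loss so that the generator is penalized both for fooling the discriminator poorly and for drifting from the structure and gradients of the source images.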
