Article

Unsupervised change detection method based on saliency analysis and convolutional neural network

Journal

JOURNAL OF APPLIED REMOTE SENSING
Volume 13, Issue 2, Article 024512

Publisher

SPIE-SOC PHOTO-OPTICAL INSTRUMENTATION ENGINEERS
DOI: 10.1117/1.JRS.13.024512

Keywords

change detection; deep feature representation; saliency detection; convolutional neural network; multiscale fusion

Funding

  1. National Natural Science Foundation of China [41801386, 41671454]
  2. Startup Project for Introducing Talent of NUIST [2018r029]
  3. Natural Science Foundation of Jiangsu Province [BK20180797]

Abstract

Owing to its strong capability for deep feature representation and classification of image data, deep learning is becoming increasingly popular for change detection (CD) in the remote-sensing community. An unsupervised CD method is proposed that combines deep feature representation, saliency detection, and a convolutional neural network (CNN). First, bitemporal images are fed into a pretrained CNN model for deep feature extraction and difference-image generation. Second, multiscale saliency detection is adopted to perform uncertainty analysis on the difference image, categorizing image pixels into three classes: changed, unchanged, and uncertain. Then, a flexible CNN model is constructed and trained using the changed and unchanged pixels, and the change type of the uncertain pixels is determined by the trained CNN. Finally, object-based refinement and multiscale fusion strategies are utilized to generate the final change map. The effectiveness and reliability of the proposed CD method are verified on three very high-resolution datasets, and the experimental results show that the proposed approach outperforms other state-of-the-art CD methods in terms of five quantitative metrics. (C) 2019 Society of Photo-Optical Instrumentation Engineers (SPIE)
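
The pipeline summarized above lends itself to a short illustrative sketch. The fragment below, assuming a PyTorch/torchvision environment (torchvision >= 0.13), shows only the front end of such a method in simplified form: deep feature extraction from coregistered bitemporal images with a pretrained VGG-16 backbone, difference-image generation, and a fixed-threshold three-class split into changed, unchanged, and uncertain pixels. The file names t1.png and t2.png, the choice of backbone layer, and the threshold values are illustrative assumptions, not the authors' implementation; the paper's multiscale saliency analysis, the CNN trained to resolve uncertain pixels, and the object-based refinement and multiscale fusion are not reproduced here.

```python
# Minimal sketch (not the authors' exact implementation): pretrained-CNN
# feature extraction, difference-image generation, and a three-class split.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

def deep_features(img_path, extractor, preprocess):
    """Extract a deep feature map from one image with a pretrained backbone."""
    img = Image.open(img_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)          # 1 x 3 x H x W
    with torch.no_grad():
        feat = extractor(x)                   # 1 x C x (H/4) x (W/4)
    return feat

# Pretrained VGG-16 truncated after conv3_3 as the feature extractor
# (an illustrative choice; any intermediate layer could serve).
backbone = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1).features[:16].eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

f1 = deep_features("t1.png", backbone, preprocess)   # hypothetical file names
f2 = deep_features("t2.png", backbone, preprocess)

# Difference image: per-pixel Euclidean distance between deep feature vectors,
# upsampled back to the input resolution and normalized to [0, 1].
di = torch.norm(f1 - f2, dim=1, keepdim=True)
di = F.interpolate(di, scale_factor=4, mode="bilinear", align_corners=False)
di = (di - di.min()) / (di.max() - di.min() + 1e-8)

# Three-class split: confidently changed, confidently unchanged, uncertain.
# Fixed thresholds stand in for the paper's saliency-based uncertainty analysis.
t_low, t_high = 0.3, 0.7
changed   = di > t_high
unchanged = di < t_low
uncertain = ~(changed | unchanged)   # would be resolved by a trained CNN
```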
