Journal
ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING
Volume 129, Pages 212-225
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.isprsjprs.2017.05.001
Keywords
Ternary change detection; Deep learning; Synthetic aperture radar; Representation learning; Sparse autoencoder; Convolutional neural networks
Funding
- National Natural Science Foundation of China [61422209]
- National Program for Support of Top-notch Young Professionals of China
- Specialized Research Fund for the Doctoral Program of Higher Education [20130203110011]
Abstract
Ternary change detection aims to detect changes and to group them into positive and negative changes. It is of great significance for the joint interpretation of spatial-temporal synthetic aperture radar (SAR) images. In this study, a sparse autoencoder, convolutional neural networks (CNN) and unsupervised clustering are combined to solve the ternary change detection problem without any supervision. First, a sparse autoencoder transforms the log-ratio difference image into a feature space suitable for extracting key changes while suppressing outliers and noise. The learned features are then clustered into three classes, which serve as pseudo labels for training a CNN model as a change feature classifier. Reliable training samples for the CNN are selected from the feature maps learned by the sparse autoencoder according to certain selection rules. Given the training samples and their pseudo labels, the CNN model is trained by back propagation with stochastic gradient descent. During training, the CNN is driven to learn the concept of change, and a more powerful model is established to distinguish different types of changes. Unlike traditional methods, the proposed framework integrates the merits of the sparse autoencoder and the CNN to learn more robust difference representations and the concept of change for ternary change detection. Experimental results on real datasets validate the effectiveness and superiority of the proposed framework. (C) 2017 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier B.V. All rights reserved.
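The front end of the pipeline described in the abstract can be sketched in a few lines: form the log-ratio difference image from two SAR acquisitions, then cluster the difference values into three classes (negative change, unchanged, positive change). This is a simplified illustration only — it clusters raw pixel values with a tiny hand-rolled k-means rather than the sparse-autoencoder features the paper actually uses, and the synthetic images, function names, and the deterministic min/median/max initialization are all assumptions, not the authors' implementation.

```python
import numpy as np

def log_ratio(img1, img2, eps=1e-6):
    # Log-ratio difference image: the standard SAR change indicator
    # (positive where intensity increased, negative where it decreased).
    return np.log((img2 + eps) / (img1 + eps))

def kmeans_1d(x, iters=50):
    # Tiny 3-class k-means on scalar values; in the paper this role is
    # played by clustering features learned by the sparse autoencoder.
    # Deterministic init at min / median / max keeps the three centers
    # spread out (an assumption made for this sketch, not from the paper).
    centers = np.array([x.min(), np.median(x), x.max()], dtype=float)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(3):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
        centers = np.sort(centers)  # class 0 = negative, 1 = unchanged, 2 = positive
    return labels, centers

# Synthetic example: two speckle-like "SAR" images with one region of
# intensity increase and one of intensity decrease.
rng = np.random.default_rng(1)
img1 = rng.gamma(4.0, 25.0, size=(32, 32))
img2 = img1.copy()
img2[:8, :8] *= 4.0    # positive change
img2[-8:, -8:] /= 4.0  # negative change

d = log_ratio(img1, img2).ravel()
labels, centers = kmeans_1d(d)
pseudo_labels = labels.reshape(32, 32)  # would supervise the CNN stage
```

In the full framework these cluster assignments act as pseudo labels, and reliably labeled patches are then selected to train the CNN classifier.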