Article

Spatio-Temporal-Spectral Collaborative Learning for Spatio-Temporal Fusion with Land Cover Changes

Journal

IEEE Transactions on Geoscience and Remote Sensing

Publisher

IEEE
DOI: 10.1109/TGRS.2022.3185459

Keywords

Feature extraction; Remote sensing; Spatial resolution; Sensors; MODIS; Adaptive systems; Earth; Adaptive weighting fusion; convolutional neural network (CNN); land cover changes; recurrent neural network (RNN); spatio-temporal fusion

Funding

  1. National Natural Science Foundation of China [42171326, 62071261, 41801252]
  2. Zhejiang Provincial Natural Science Foundation of China [LY22F010014]
  3. Postdoctoral Research Foundation of China [2020M672490]


This article proposes a spatio-temporal-spectral collaborative learning framework for the spatio-temporal fusion of multisource remote-sensing images. The framework integrates convolutional and recurrent neural networks to learn and fuse features from the multiscale spatial-spectral level to the spatio-temporal level. Experimental results demonstrate the competitive performance of the proposed method in fusing images with land cover changes.
Spatio-temporal fusion, which combines the complementary spatial and temporal advantages of multisource remote-sensing images to obtain time series of high-spatial-resolution images, is highly desirable for monitoring surface dynamics. Deep learning (DL)-based fusion methods have recently received extensive attention. However, existing DL-based spatio-temporal fusion methods are generally limited in fusing images with land cover changes. In this article, we propose a spatio-temporal-spectral collaborative learning framework for spatio-temporal fusion to alleviate this problem. Specifically, the proposed method integrates a convolutional neural network and a recurrent neural network into a unified framework consisting of three subnetworks: the multiscale Siamese convolutional neural network (MSCNN), the multilayer convolutional recurrent neural network, and the adaptive weighting fusion network (AWFNet). The MSCNN uses a flexible weight-sharing architecture to extract multiscale spatial-spectral features from multisource remote-sensing images. The multilayer convolutional recurrent neural network is built on convolutional long short-term memory units to comprehensively learn land cover changes from joint spatial, spectral, and temporal features. The AWFNet, trained with a spatio-temporal-spectral change (STSC) loss, is proposed to further improve interpretability and robustness. Experiments were performed on publicly available benchmark datasets featuring phenology changes and land cover type changes, respectively. The experimental results demonstrated the competitive performance of the proposed method compared with other state-of-the-art fusion methods.
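The adaptive weighting idea behind a fusion network like AWFNet can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's implementation: the function name, inputs, and the sigmoid weighting are all assumptions. Two candidate predictions of the fine-resolution image are blended per pixel, with the weight driven by an estimated change magnitude, so that pixels with stronger land cover change rely more on the temporally informed prediction.

```python
import numpy as np

def adaptive_weighting_fusion(pred_temporal, pred_spatial, change_score):
    """Blend two candidate fine-resolution predictions per pixel.

    All three arguments are hypothetical stand-ins for learned network
    outputs: a prediction driven by the temporal branch, a prediction
    driven by the spatial branch, and a per-pixel change magnitude.
    """
    # Sigmoid maps the change score to a weight in (0, 1): large change
    # scores push the weight toward the temporally informed prediction.
    w = 1.0 / (1.0 + np.exp(-change_score))
    return w * pred_temporal + (1.0 - w) * pred_spatial

# Usage: where change_score is strongly positive, the fused pixel
# follows pred_temporal; where it is strongly negative, pred_spatial.
a = np.full((2, 2), 10.0)                 # temporally informed prediction
b = np.full((2, 2), 2.0)                  # spatially informed prediction
score = np.array([[5.0, -5.0],
                  [0.0,  0.0]])           # estimated change magnitude
fused = adaptive_weighting_fusion(a, b, score)
```

In this sketch a zero change score yields an even blend of the two predictions; the actual AWFNet learns its weighting from data rather than applying a fixed sigmoid rule.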
