Article

StfMLP: Spatiotemporal Fusion Multilayer Perceptron for Remote-Sensing Images

Journal

IEEE Geoscience and Remote Sensing Letters

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LGRS.2022.3230720

Keywords

Data fusion; multilayer perceptron (MLP); remote-sensing (RS) images; spatiotemporal fusion multilayer perceptron (StfMLP); transductive learning


In this study, a deep-learning-based method called spatiotemporal fusion multilayer perceptron (StfMLP) is proposed to achieve more accurate remote-sensing image fusion from small-scale data. Experimental results on two public datasets demonstrate that the proposed method outperforms state-of-the-art methods.
Remote-sensing (RS) images with high spatial and temporal resolutions play a significant role in monitoring periodic landscape changes for Earth observation. To enrich RS images, spatiotemporal fusion (STF) is considered a promising approach. The key challenge in current STF-based methods is their requirement for large-scale data. In this work, we propose a deep-learning-based method called spatiotemporal fusion multilayer perceptron (StfMLP) to tackle this challenge. First, our method focuses on the given data in the manner of transductive learning. Second, we design a multilayer perceptron (MLP) model to capture the temporal dependency and consistency among the input images. Consequently, StfMLP achieves more accurate fusion while requiring only small-scale data. We conduct extensive experiments on two widely adopted public datasets, namely the Coleambally irrigation area (CIA) and the lower Gwydir catchment (LGC). The experimental results demonstrate that the proposed method consistently outperforms state-of-the-art methods. Code, trained model, and cropped images are available online (https://github.com/luhailaing-max/StfMLP-master).
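The general setup the abstract describes (transductively fitting an MLP to one given image set, so that a fine-resolution image at the prediction date is inferred from a fine image at a reference date plus coarse images at both dates) can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the authors' architecture: the per-pixel layout, the single hidden layer, the layer sizes, and the synthetic data are all assumptions for demonstration.

```python
import numpy as np

# Hypothetical sketch: a per-pixel MLP fuses (fine_t1, coarse_t1, coarse_t2)
# into a prediction of fine_t2, trained only on the given images
# (transductive setting). All names and shapes are illustrative.
rng = np.random.default_rng(0)
n_pixels, n_bands, hidden = 256, 6, 32

fine_t1 = rng.random((n_pixels, n_bands))    # fine image at reference date t1
coarse_t1 = rng.random((n_pixels, n_bands))  # coarse image at t1
coarse_t2 = rng.random((n_pixels, n_bands))  # coarse image at prediction date t2
# Synthetic target: fine_t1 shifted by the coarse temporal change.
fine_t2 = fine_t1 + (coarse_t2 - coarse_t1)

# Per-pixel input: concatenated bands of all three known images.
x = np.concatenate([fine_t1, coarse_t1, coarse_t2], axis=1)
w1 = rng.normal(0.0, 0.1, (x.shape[1], hidden)); b1 = np.zeros(hidden)
w2 = rng.normal(0.0, 0.1, (hidden, n_bands));    b2 = np.zeros(n_bands)

def forward(x):
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h, h @ w2 + b2             # linear output: predicted fine_t2 bands

lr = 0.05
for _ in range(500):                  # full-batch gradient descent on MSE
    h, pred = forward(x)
    err = pred - fine_t2
    gw2 = h.T @ err / n_pixels; gb2 = err.mean(0)
    dh = (err @ w2.T) * (h > 0)       # backprop through ReLU
    gw1 = x.T @ dh / n_pixels; gb1 = dh.mean(0)
    w2 -= lr * gw2; b2 -= lr * gb2
    w1 -= lr * gw1; b1 -= lr * gb1

_, pred = forward(x)
mse = float(((pred - fine_t2) ** 2).mean())
```

Because the fit is transductive, the network is optimized on exactly the image pair it will be used on, which is why such an approach can work without a large external training corpus.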

