Article

Fast and Accurate Spatiotemporal Fusion Based Upon Extreme Learning Machine

Journal

IEEE GEOSCIENCE AND REMOTE SENSING LETTERS
Volume 13, Issue 12, Pages 2039-2043

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LGRS.2016.2622726

Keywords

Extreme learning machine (ELM); feature representation; local structural information; mapping function; spatiotemporal image fusion

Funding

  1. National Natural Science Foundation of China [61301090]


Spatiotemporal fusion is important for providing high-spatial-resolution Earth observations with a dense time series, and learning-based fusion methods have recently attracted broad interest. These algorithms project image patches onto a feature space and enforce a simple mapping in that space to predict fine-resolution patches from the corresponding coarse ones. However, sophisticated projections such as sparse representation are computationally complex and difficult to implement on large patches, so the small patches they must use cannot capture enough local structural information in the coarse images. To address these issues, this letter proposes a novel spatiotemporal fusion method based on a powerful learning technique, the extreme learning machine (ELM). Unlike traditional approaches, we learn a mapping function directly on difference images, rather than a sophisticated feature representation followed by a simple mapping. Characterized by good generalization performance and high speed, the ELM achieves accurate and fast prediction of fine-resolution patches. The proposed algorithm is evaluated on five actual data sets of Landsat Enhanced Thematic Mapper Plus (ETM+) and Moderate Resolution Imaging Spectroradiometer (MODIS) acquisitions, and experimental results show that our method obtains better fusion results while running at much greater speed.
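The core idea in the abstract, i.e., learning a direct mapping from coarse difference-image patches to fine ones with an ELM, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the patch sizes, hidden-layer width, ridge parameter, and the synthetic stand-in data are all assumptions, and a standard regularized least-squares solve stands in for whatever pseudoinverse computation the paper uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: flattened 5x5 difference-image patches.
n_patches, patch_dim, n_hidden = 500, 25, 200

# Synthetic stand-in data: coarse difference patches X mapped to fine
# difference patches Y by an arbitrary nonlinear map (placeholder for
# real Landsat/MODIS imagery, which is not available here).
X = rng.standard_normal((n_patches, patch_dim))
Y = np.tanh(X @ rng.standard_normal((patch_dim, patch_dim)))
Y += 0.01 * rng.standard_normal((n_patches, patch_dim))

# ELM: input weights and biases are random and stay fixed; only the
# output weights are trained, via a closed-form least-squares solve.
W = rng.standard_normal((patch_dim, n_hidden))
b = rng.standard_normal(n_hidden)

def hidden(A):
    """Sigmoid activations of the random hidden layer."""
    return 1.0 / (1.0 + np.exp(-(A @ W + b)))

H = hidden(X)
lam = 1e-3  # small ridge term for numerical stability (an assumption)
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ Y)

def predict(A):
    """Predict fine difference patches from coarse ones."""
    return hidden(A) @ beta

rmse = np.sqrt(np.mean((predict(X) - Y) ** 2))
```

Because training reduces to one linear solve rather than iterative optimization, this construction reflects the speed advantage the abstract claims for ELM over sparse-representation-based fusion.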

