Article

Collaborative Attention-Based Heterogeneous Gated Fusion Network for Land Cover Classification

Journal

IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
Volume 59, Issue 5, Pages 3829-3845

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TGRS.2020.3015389

Keywords

Collaborative attention model; complementary information; convolutional neural networks (CNNs); gated fusion; land cover classification

Funding

  1. National Natural Science Foundation of China [61971426]


Abstract

This study examines the complementarity between optical and SAR features in land cover classification and proposes a novel collaborative attention-based fusion network that hierarchically fuses both types of features. The network combines multistage feature learning, a collaborative attention mechanism, and a gated fusion module to automatically learn the varying contributions of optical and SAR features, and it outperforms existing methods in land cover classification.
Existing land cover classification methods mostly rely on either optical or synthetic aperture radar (SAR) features alone, ignoring the mutual complementarity between the two sources. In this article, we compare the distribution histograms of deep semantic features extracted from the optical and SAR modalities within land cover categories, which intuitively demonstrates that there is large complementary potential between the optical and SAR features. We therefore propose a novel collaborative attention-based heterogeneous gated fusion network (CHGFNet), which hierarchically fuses optical and SAR features for land cover classification. More specifically, CHGFNet consists of three main components: a two-stream feature extractor, a multimodal collaborative attention module (MCAM), and a gated heterogeneous fusion module (GHFM). Given optical and SAR patch pairs, the two-stream feature extractor applies a multistage feature learning methodology to acquire discriminative optical and SAR features. Then, to exploit the inherent complementarity between the optical and SAR features, the MCAM is embedded into CHGFNet, providing an efficient stage that captures the correlation between the two modalities by jointly computing collaborative attention in the joint feature space. Finally, to automatically learn the varying contributions of the optical and SAR features when classifying different land categories, the GHFM fuses the two feature streams. Extensive comparative evaluations on three co-registered optical and SAR data sets demonstrate the advantages of CHGFNet over state-of-the-art methods for land cover classification.
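To make the two fusion ideas in the abstract concrete, the following is a minimal numpy sketch of cross-modal attention followed by gated fusion. It is not the authors' implementation: the feature shapes, the dot-product form of the affinity, and the gate weights `W_g` are all illustrative assumptions, since the abstract does not specify the exact formulation of the MCAM or GHFM.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n, d = 4, 8  # hypothetical: 4 spatial positions, 8-dim features

f_opt = rng.standard_normal((n, d))  # optical-stream features
f_sar = rng.standard_normal((n, d))  # SAR-stream features

# Collaborative attention (sketch): correlate the two modalities in a
# joint space, then enrich each stream with context from the other.
affinity = f_opt @ f_sar.T / np.sqrt(d)        # (n, n) cross-modal affinity
f_opt_att = softmax(affinity, axis=1) @ f_sar  # SAR context for optical
f_sar_att = softmax(affinity.T, axis=1) @ f_opt  # optical context for SAR

# Gated heterogeneous fusion (sketch): a learned gate decides, per
# position and channel, how much each modality contributes.
W_g = rng.standard_normal((2 * d, d)) * 0.1    # hypothetical gate weights
g = sigmoid(np.concatenate([f_opt_att, f_sar_att], axis=1) @ W_g)
fused = g * f_opt_att + (1.0 - g) * f_sar_att  # (n, d) fused features
```

In this form the gate `g` lies in (0, 1), so the fusion is a convex combination per channel; this is one common way a network can "automatically learn the varying contributions" of two modalities, which is the role the abstract assigns to the GHFM.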

