Article

A Multiscale Cross Interaction Attention Network for Hyperspectral Image Classification

Journal

REMOTE SENSING
Volume 15, Issue 2

Publisher

MDPI
DOI: 10.3390/rs15020428

Keywords

hyperspectral image classification; interaction attention; multiscale cross; convolutional neural network

This article proposes a multiscale cross interaction attention network (MCIANet) for hyperspectral image classification. The network highlights discriminative information in the HSI and suppresses redundant information through an interaction attention module (IAM), and extracts spectral-spatial features across different scales, convolutional layers, and branches through a multiscale cross feature extraction module (MCFEM). The results show that the presented method outperforms state-of-the-art methods.
Convolutional neural networks (CNNs) have demonstrated impressive performance and have been broadly applied in hyperspectral image (HSI) classification. However, two challenging problems remain: first, redundant information is detrimental to feature learning and damages classification performance; second, most existing classification methods focus only on single-scale feature extraction, which underutilizes the available information. To resolve these two issues, this article proposes a multiscale cross interaction attention network (MCIANet) for HSI classification. First, an interaction attention module (IAM) is designed to highlight discriminative information in the HSI and suppress redundant information. Then, a multiscale cross feature extraction module (MCFEM) is constructed to detect spectral-spatial features across different scales, convolutional layers, and branches, which increases the diversity of spectral-spatial features. Finally, global average pooling is introduced to compress the multiscale spectral-spatial features, and two fully connected layers with two dropout layers are used to produce the classification results. Extensive experiments on three benchmark datasets demonstrate the superiority of the presented method over state-of-the-art methods.
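
The overall pipeline described in the abstract (parallel multiscale convolutional branches, an attention step, global average pooling, and a fully connected classifier with dropout) can be illustrated with the minimal PyTorch sketch below. This is not the authors' implementation of MCIANet: the branch count, kernel sizes, channel widths, patch size, and the simple channel-attention stand-in for the IAM are all assumptions made purely for illustration.

```python
import torch
import torch.nn as nn


class SimpleChannelAttention(nn.Module):
    """Hypothetical stand-in for the interaction attention module (IAM):
    re-weights channels to emphasize discriminative features and suppress
    redundant ones. The real IAM design is described in the paper itself."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        # x: (B, C, H, W) -> per-channel weights from spatial averaging
        w = self.fc(x.mean(dim=(2, 3)))
        return x * w.unsqueeze(-1).unsqueeze(-1)


class MultiscaleSketch(nn.Module):
    """Illustrative multiscale extractor: parallel branches with different
    kernel sizes (assumed scales), concatenated, re-weighted by attention,
    then compressed by global average pooling and classified by a head with
    two fully connected layers and two dropout layers, as in the abstract."""

    def __init__(self, in_bands=30, num_classes=16, width=32):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_bands, width, k, padding=k // 2),
                nn.BatchNorm2d(width),
                nn.ReLU(inplace=True),
            )
            for k in (1, 3, 5)  # assumed kernel sizes for the scales
        ])
        self.attn = SimpleChannelAttention(3 * width)
        self.gap = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),
            nn.Linear(3 * width, 128),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        feats = self.attn(feats)
        return self.head(self.gap(feats))


if __name__ == "__main__":
    # A batch of two 30-band HSI patches of size 9x9 (assumed settings).
    patch = torch.randn(2, 30, 9, 9)
    logits = MultiscaleSketch()(patch)
    print(logits.shape)  # torch.Size([2, 16])
```

The sketch only conveys the data flow the abstract describes; the paper's actual cross-branch and cross-layer interactions within the MCFEM are more involved than simple concatenation.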
