Article

Deep Collaborative Attention Network for Hyperspectral Image Classification by Combining 2-D CNN and 3-D CNN

Publisher

IEEE - Institute of Electrical and Electronics Engineers Inc.
DOI: 10.1109/JSTARS.2020.3016739

Keywords

Feature extraction; Data mining; Correlation; Hyperspectral imaging; Solid modeling; Principal component analysis; Convolutional neural network (CNN); hyperspectral image classification; multilayer feature fusion; spatial attention mechanism

Funding

  1. National Natural Science Foundation of China [61601201]
  2. Natural Science Foundation of Jiangsu Province [BK20160188]

Abstract

Deep learning methods based on convolutional neural networks (CNNs) have demonstrated remarkable performance in hyperspectral image (HSI) classification. Most of these approaches rely on either a 2-D CNN or a 3-D CNN alone. It is evident from the literature that using only a 2-D CNN may miss channel relationship information, while using only a 3-D CNN may make the model very complex. Moreover, existing network models do not pay enough attention to extracting spectral-spatial correlation information. To address these issues, we propose a deep collaborative attention network for HSI classification that combines a 2-D CNN and a 3-D CNN (CACNN). Specifically, we first extract spectral-spatial features using a 2-D CNN and a 3-D CNN, respectively, and then use a NonLocalBlock to combine these two kinds of features. This block serves as a typical spatial attention mechanism and emphasizes salient features. Next, we propose a Conv_Block, similar to a lightweight dense block, to extract the correlation information contained in the feature maps. Finally, we adopt a deep multilayer feature fusion strategy, combining the features of different hierarchical layers to extract the strongly correlated spectral-spatial information among them. To test the performance of the CACNN approach, several experiments are performed on four well-known HSIs. The results are compared with state-of-the-art approaches, and our proposed method obtains satisfactory performance. The code of the CACNN method is available on Dr. J. Liu's GitHub: https://github.com/liuofficial.
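The abstract outlines the architecture without giving code, so below is a minimal sketch of the two-branch fusion-plus-attention idea in PyTorch. All layer sizes, kernel shapes, and module internals here are assumptions for illustration, not the authors' actual configuration; the Conv_Block and multilayer fusion stages are omitted for brevity. Refer to the official repository at https://github.com/liuofficial for the real implementation.

```python
# Hypothetical sketch of the CACNN idea: a 2-D CNN branch and a 3-D CNN
# branch whose features are fused through a non-local (spatial attention)
# block. Dimensions and layer choices are illustrative assumptions only.
import torch
import torch.nn as nn

class NonLocalBlock(nn.Module):
    """Spatial attention via a non-local (self-attention) operation."""
    def __init__(self, channels):
        super().__init__()
        self.theta = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.phi = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.g = nn.Conv2d(channels, channels // 2, kernel_size=1)
        self.out = nn.Conv2d(channels // 2, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)  # (b, hw, c/2)
        k = self.phi(x).flatten(2)                    # (b, c/2, hw)
        v = self.g(x).flatten(2).transpose(1, 2)      # (b, hw, c/2)
        attn = torch.softmax(q @ k, dim=-1)           # pairwise spatial affinities
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                        # residual connection

class CACNNSketch(nn.Module):
    """2-D and 3-D CNN branches fused through a non-local attention block."""
    def __init__(self, bands, n_classes):
        super().__init__()
        self.branch2d = nn.Sequential(
            nn.Conv2d(bands, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(),
        )
        self.branch3d = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.BatchNorm3d(8), nn.ReLU(),
        )
        self.reduce3d = nn.Conv2d(8 * bands, 64, kernel_size=1)
        self.attn = NonLocalBlock(128)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):  # x: (batch, bands, patch, patch)
        f2d = self.branch2d(x)
        f3d = self.branch3d(x.unsqueeze(1))  # add channel dim for 3-D conv
        b, c, d, h, w = f3d.shape
        f3d = self.reduce3d(f3d.reshape(b, c * d, h, w))
        fused = self.attn(torch.cat([f2d, f3d], dim=1))  # spatial attention on fused maps
        return self.head(fused)

# Example: classify 9x9 patches from a 200-band HSI into 16 classes.
model = CACNNSketch(bands=200, n_classes=16)
logits = model(torch.randn(4, 200, 9, 9))
print(logits.shape)  # torch.Size([4, 16])
```

The non-local block computes pairwise affinities between all spatial positions of the fused feature map, which is one common way to realize the spatial attention role the abstract assigns to the NonLocalBlock.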

