Article

Directionally separable dilated CNN with hierarchical attention feature fusion for hyperspectral image classification

Journal

INTERNATIONAL JOURNAL OF REMOTE SENSING
Volume 43, Issue 3, Pages 812-840

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/01431161.2021.2019849

Keywords

hyperspectral image classification; dilated convolution; hierarchical feature fusion

Funding

  1. National Natural Science Foundation of China [62071168]
  2. Natural Science Foundation of Jiangsu Province [BK20211201]
  3. Fundamental Research Funds for the Central Universities [B200202183]
  4. China Postdoctoral Science Foundation [2021M690885]


This paper proposes a lightweight convolutional neural network, DSD-HAFF, which improves hyperspectral image classification by combining global dense dilated CNN branches with a hierarchical attention feature fusion branch. The structure fully incorporates hierarchical features while significantly reducing the number of network parameters.
In recent years, convolutional neural networks (CNNs) have played a vital role in hyperspectral image classification and perform more competitively than many other methods. However, to pursue better performance, most existing CNN-based methods simply stack rather deep convolutional layers. Although this improves classification accuracy to a certain extent, it results in a large number of network parameters. In this paper, a lightweight directionally separable dilated CNN with hierarchical attention feature fusion (DSD-HAFF) is proposed to solve these problems. First, two global dense dilated CNN branches, each focusing on one spatial direction, are constructed to extract and reuse spatial information as fully as possible. Second, a hierarchical attention feature fusion branch consisting of several coordinate attention blocks (CABs) is constructed, with the hierarchical features from the two directionally separable dilated CNN branches serving as the inputs to the CABs. In this way, the structure not only fully incorporates hierarchical features but also significantly reduces the number of network parameters. Meanwhile, the hierarchical attention feature fusion branch incorporates features from high level to low level following a kernel-number pyramid strategy. Experimental results on three popular benchmark datasets demonstrate that DSD-HAFF achieves better performance with a much smaller number of network parameters than other state-of-the-art methods.
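The abstract describes two reusable building blocks: directionally separable dilated convolutions (1 x k and k x 1 kernels with growing dilation rates, in place of full k x k kernels, arranged with dense connections) and coordinate attention blocks for fusing hierarchical features. The PyTorch sketch below illustrates plausible forms of both. The class names, layer widths, dilation schedule, and reduction ratio are illustrative assumptions rather than the authors' implementation, and the attention block follows the published coordinate attention design (Hou et al., CVPR 2021), which the paper's CABs may refine.

import torch
import torch.nn as nn

class DirectionalDenseDilatedBranch(nn.Module):
    # One directionally separable branch: 1 x k (horizontal) or k x 1
    # (vertical) dilated convolutions with dense connections, so each
    # stage reuses all earlier feature maps. Widths/dilations are assumed.
    def __init__(self, channels, k=3, dilations=(1, 2, 4), horizontal=True):
        super().__init__()
        ks = (1, k) if horizontal else (k, 1)
        self.stages = nn.ModuleList()
        for i, d in enumerate(dilations):
            pad = (0, d * (k - 1) // 2) if horizontal else (d * (k - 1) // 2, 0)
            dil = (1, d) if horizontal else (d, 1)
            self.stages.append(nn.Sequential(
                nn.Conv2d(channels * (i + 1), channels, ks,
                          padding=pad, dilation=dil),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True)))

    def forward(self, x):
        feats = [x]
        for stage in self.stages:
            # dense connectivity: each stage sees all previous features
            feats.append(stage(torch.cat(feats, dim=1)))
        return feats[1:]  # hierarchical features for attention fusion

class CoordinateAttention(nn.Module):
    # Coordinate attention in the spirit of Hou et al. (CVPR 2021):
    # pool along each spatial axis, encode jointly, then gate the input
    # with per-row and per-column attention maps.
    def __init__(self, channels, reduction=8):
        super().__init__()
        mid = max(channels // reduction, 8)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # average over width
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # average over height
        self.conv1 = nn.Conv2d(channels, mid, 1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, 1)
        self.conv_w = nn.Conv2d(mid, channels, 1)

    def forward(self, x):
        n, c, h, w = x.shape
        xh = self.pool_h(x)                        # (n, c, h, 1)
        xw = self.pool_w(x).permute(0, 1, 3, 2)    # (n, c, w, 1)
        y = self.act(self.bn(self.conv1(torch.cat([xh, xw], dim=2))))
        yh, yw = torch.split(y, [h, w], dim=2)
        ah = torch.sigmoid(self.conv_h(yh))                      # (n, c, h, 1)
        aw = torch.sigmoid(self.conv_w(yw.permute(0, 1, 3, 2)))  # (n, c, 1, w)
        return x * ah * aw

if __name__ == "__main__":
    x = torch.randn(2, 32, 15, 15)  # e.g. patches with 32 reduced spectral bands
    h_feats = DirectionalDenseDilatedBranch(32, horizontal=True)(x)
    fused = CoordinateAttention(32)(h_feats[-1])
    print(fused.shape)  # torch.Size([2, 32, 15, 15])

Replacing each k x k kernel with a 1 x k or k x 1 kernel cuts the per-layer weight count roughly from k^2 * C_in * C_out to k * C_in * C_out, which is consistent with the parameter savings the abstract claims for the directionally separable design.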
