Article

Directionally separable dilated CNN with hierarchical attention feature fusion for hyperspectral image classification

Journal

INTERNATIONAL JOURNAL OF REMOTE SENSING
Volume 43, Issue 3, Pages 812-840

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/01431161.2021.2019849

Keywords

hyperspectral image classification; dilated convolution; hierarchical feature fusion

Funding

  1. National Natural Science Foundation of China [62071168]
  2. Natural Science Foundation of Jiangsu Province [BK20211201]
  3. Fundamental Research Funds for the Central Universities [B200202183]
  4. China Postdoctoral Science Foundation [2021M690885]


This paper proposes a lightweight convolutional neural network structure called DSD-HAFF, which improves hyperspectral image classification performance by constructing global dense dilated CNN branches and a hierarchical attention feature fusion branch. The structure can fully incorporate hierarchical features and significantly reduce network parameters.
In recent years, convolutional neural networks (CNNs) have played a vital role in hyperspectral image classification and performed more competitively than many other methods. However, in pursuit of better performance, most existing CNN-based methods simply stack deep convolutional layers. Although this improves classification accuracy to a certain extent, it introduces a large number of network parameters. In this paper, a lightweight directionally separable dilated CNN with hierarchical attention feature fusion (DSD-HAFF) is proposed to address these problems. First, two global dense dilated CNN branches, each focusing on one spatial direction, are constructed to extract and reuse spatial information as much as possible. Second, a hierarchical attention feature fusion branch consisting of several coordinate attention blocks (CABs) is constructed, with hierarchical features from the two directionally separable dilated CNN branches serving as the inputs of the CABs. In this way, the structure not only fully incorporates hierarchical features but also significantly reduces the number of network parameters. Meanwhile, the hierarchical attention feature fusion branch fuses features from high level to low level following a kernel-number pyramid strategy. Experimental results on three popular benchmark datasets demonstrate that DSD-HAFF achieves better performance with far fewer network parameters than other state-of-the-art methods.
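The core idea of the directionally separable branches can be illustrated with a minimal NumPy sketch: the same 1-D dilated kernel is applied once along the height direction and once along the width direction, so each branch sees an enlarged receptive field in only one spatial direction. This is a hypothetical simplification for illustration, not the authors' implementation (which uses trainable multi-channel convolutions with dense connections and attention fusion).

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """'Valid' 1-D dilated convolution (cross-correlation) of signal x
    with kernel w: output[i] = sum_j w[j] * x[i + j*dilation]."""
    k = len(w)
    span = (k - 1) * dilation + 1          # receptive field of the dilated kernel
    out_len = len(x) - span + 1
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ])

def directional_dilated_features(img, w, dilation):
    """Apply the same dilated 1-D kernel separately along the height
    and width directions of a 2-D image, mimicking the two
    directionally separable branches (illustrative only)."""
    # Height branch: convolve each column (image transposed so columns iterate).
    h_branch = np.stack([dilated_conv1d(col, w, dilation) for col in img.T], axis=1)
    # Width branch: convolve each row.
    w_branch = np.stack([dilated_conv1d(row, w, dilation) for row in img])
    return h_branch, w_branch
```

With a 5x5 input and a length-2 kernel at dilation 2, each branch shrinks only one spatial dimension (from 5 to 3), showing how the two branches cover the two directions independently before their hierarchical features are fused.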

