Article

DSS-TRM: deep spatial-spectral transformer for hyperspectral image classification

Journal

EUROPEAN JOURNAL OF REMOTE SENSING
Volume 55, Issue 1, Pages 103-114

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/22797254.2021.2023910

Keywords

Hyperspectral image classification; self-attention; spatial-spectral transformer; deep learning

Funding

  1. National Natural Science Foundation of China [41801388]

Summary

In this paper, a deep transformer with a self-attention mechanism is proposed for hyperspectral image classification. By exploring the spectral and spatial dimensions separately, a deep spatial-spectral transformer (DSS-TRM) is introduced to improve classification performance. Experimental results demonstrate that the proposed DSS-TRM outperforms traditional convolutional neural networks and attention-based methods.

Abstract

In recent years, the wide use of deep learning-based methods has greatly improved the classification performance of hyperspectral images (HSI). As an effective way to improve the performance of deep convolutional networks, the attention mechanism is also widely used for HSI classification tasks. However, the majority of existing attention mechanisms for HSI classification are built on convolution layers, and classification accuracy still has room for improvement. Motivated by the latest self-attention mechanism in natural language processing, a deep transformer is proposed for HSI classification in this paper. Specifically, deep transformers along the spectral dimension and the spatial dimension are explored, respectively. Then, a deep spatial-spectral transformer (DSS-TRM) is proposed to improve the classification performance of HSI. The contribution of this paper is to make full use of the self-attention mechanism, that is, to use transformer layers instead of convolution layers. More importantly, a DSS-TRM is proposed to realize end-to-end HSI classification. Extensive experiments are conducted on three HSI data sets. The experimental results demonstrate that the proposed DSS-TRM outperforms traditional convolutional neural networks and attention-based methods.
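To make the two-branch idea concrete, below is a minimal PyTorch sketch of a spatial-spectral transformer for HSI patch classification: one transformer encoder attends along the spectral dimension (each band is a token embedded from its spatial patch), a second attends along the spatial dimension (each pixel is a token embedded from its spectral vector), and the pooled features of both branches are fused for classification. The layer sizes, the mean-pooling fusion, and the class count are illustrative assumptions, not the exact DSS-TRM configuration reported in the paper.

    # Minimal sketch of a two-branch spatial-spectral transformer for HSI patches.
    # Layer sizes and the fusion strategy are illustrative assumptions, not the
    # authors' exact DSS-TRM architecture.
    import torch
    import torch.nn as nn


    class SpatialSpectralTransformer(nn.Module):
        def __init__(self, n_bands=200, patch=9, n_classes=16,
                     d_model=64, n_heads=4, n_layers=2):
            super().__init__()
            n_pixels = patch * patch

            # Spectral branch: each band is a token, embedded from its spatial patch.
            self.spectral_embed = nn.Linear(n_pixels, d_model)
            self.spectral_pos = nn.Parameter(torch.zeros(1, n_bands, d_model))
            self.spectral_encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True),
                num_layers=n_layers)

            # Spatial branch: each pixel is a token, embedded from its spectral vector.
            self.spatial_embed = nn.Linear(n_bands, d_model)
            self.spatial_pos = nn.Parameter(torch.zeros(1, n_pixels, d_model))
            self.spatial_encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True),
                num_layers=n_layers)

            # Fuse the pooled tokens of both branches and classify the centre pixel.
            self.classifier = nn.Linear(2 * d_model, n_classes)

        def forward(self, x):
            # x: (batch, bands, H, W) hyperspectral patch centred on the labelled pixel.
            spectral_tokens = x.flatten(2)                    # (batch, bands, H*W)
            spatial_tokens = spectral_tokens.transpose(1, 2)  # (batch, H*W, bands)

            zs = self.spectral_encoder(self.spectral_embed(spectral_tokens) + self.spectral_pos)
            zp = self.spatial_encoder(self.spatial_embed(spatial_tokens) + self.spatial_pos)

            feat = torch.cat([zs.mean(dim=1), zp.mean(dim=1)], dim=-1)
            return self.classifier(feat)


    if __name__ == "__main__":
        model = SpatialSpectralTransformer()
        logits = model(torch.randn(8, 200, 9, 9))  # 8 patches, 200 bands, 9x9 window
        print(logits.shape)                        # torch.Size([8, 16])

In this sketch the two encoders run independently and are fused only at the classification head; how (and at which depth) the spectral and spatial streams are combined in the published DSS-TRM is described in the paper itself.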
