Article

TrmGLU-Net: transformer-augmented global-local U-Net for hyperspectral image classification with limited training samples

Journal

EUROPEAN JOURNAL OF REMOTE SENSING
Volume 56, Issue 1, Pages -

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/22797254.2023.2227993

Keywords

Hyperspectral image classification; deep learning; convolutional neural network; self-attention; U-Net; superpixel segmentation


This paper proposes a transformer-augmented global-local U-Net (TrmGLU-Net) for hyperspectral image classification. The model captures long-distance dependencies in both the spatial and spectral dimensions and performs well with limited training samples.
In recent years, deep learning methods have been widely used for hyperspectral image classification. However, their performance degrades severely when training samples are scarce. Moreover, the current mainstream approaches based on convolutional neural networks excel at local feature extraction but are restricted by their limited receptive fields, so they cannot capture long-distance dependencies in either the spatial or the spectral dimension. To address these issues, this paper proposes a global-local U-Net augmented by transformers (TrmGLU-Net). First, whole hyperspectral images are fed into the model for end-to-end training to capture contextual information. Then, a transformer-augmented U-Net is designed with alternating transformer and convolutional layers to perceive both global and local information. Finally, a superpixel-based label expansion method is proposed to enlarge the set of labeled pixels and improve performance under small-sample conditions. Extensive experiments on four hyperspectral scenes demonstrate that TrmGLU-Net outperforms other advanced patch-level and image-level methods with limited training samples. The code will be released at https://github.com/sssssyf/TrmGLU-Net
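The superpixel-based label expansion step described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact procedure: it assumes a superpixel map has already been computed (e.g. by SLIC), and the function name `expand_labels` and the majority-vote rule inside each superpixel are our assumptions.

```python
import numpy as np

def expand_labels(segments, labels, unlabeled=0):
    """Propagate sparse training labels to every pixel of their superpixel.

    segments : (H, W) int array of superpixel ids (precomputed segmentation)
    labels   : (H, W) int array of ground-truth classes, `unlabeled` elsewhere
    Returns a new (H, W) label map where each superpixel containing at least
    one labeled pixel is filled with the majority class of those pixels.
    """
    expanded = labels.copy()
    for sp in np.unique(segments):
        mask = segments == sp
        sp_labels = labels[mask]
        sp_labels = sp_labels[sp_labels != unlabeled]
        if sp_labels.size:
            # Majority vote among the labeled pixels of this superpixel
            vals, counts = np.unique(sp_labels, return_counts=True)
            expanded[mask] = vals[np.argmax(counts)]
    return expanded

# Toy example: two vertical superpixels, one labeled pixel in each
segments = np.array([[0, 0, 1, 1]] * 4)
labels = np.zeros((4, 4), dtype=int)
labels[0, 0] = 2   # one sample of class 2 in superpixel 0
labels[0, 3] = 3   # one sample of class 3 in superpixel 1
expanded = expand_labels(segments, labels)
```

After expansion, every pixel of the left superpixel carries class 2 and every pixel of the right one carries class 3, turning two labeled pixels into sixteen training samples.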
