Article

Multi-scale hierarchical recurrent neural networks for hyperspectral image classification

Journal

NEUROCOMPUTING
Volume 294, Pages 82-93

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2018.03.012

Keywords

Hyperspectral image classification; Recurrent neural networks; Multi-scale

Funding

  1. Research Committee of the University of Macau [MYRG2015-00011-FST, MYRG2015-00012-FST]
  2. Science and Technology Development Fund of Macau SAR [041/2017/A1, 093/2014/A2]


This paper presents a novel hyperspectral image (HSI) classification framework that exploits multi-scale spectral-spatial features via hierarchical recurrent neural networks. Neighborhood information plays an important role in the image classification process. Convolutional neural networks (CNNs) have been shown to be effective in learning local features of HSI. However, CNNs do not consider the spatial dependency between non-adjacent image patches. Recurrent neural networks (RNNs) can effectively model relationships between non-adjacent image patches, but they can only be applied to one-dimensional (1D) sequences. In this paper, we propose multi-scale hierarchical recurrent neural networks (MHRNNs) to learn the spatial dependency of non-adjacent image patches in the two-dimensional (2D) spatial domain. First, to better represent objects at different scales, we generate multi-scale 3D image patches of the central pixel and its surrounding pixels. Then, a 3D CNN extracts local spectral-spatial features from each 3D image patch. Finally, multi-scale 1D sequences in eight directions are constructed in the 3D local feature domain, and MHRNNs are proposed to capture the spatial dependency of local spectral-spatial features at different scales. The proposed method not only considers the local spectral-spatial features of the HSI, but also captures the spatial dependency of non-adjacent image patches at different scales. Experiments are performed on three real HSI datasets. The results demonstrate the superiority of the proposed method over several state-of-the-art methods in both visual appearance and classification accuracy. (C) 2018 Elsevier B.V. All rights reserved.
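The two data-preparation steps described in the abstract — extracting multi-scale 3D patches around a pixel, and building 1D sequences in eight directions over the spatial grid — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the reflect padding, the border clamping, and the specific scale sizes are all illustrative assumptions, and the 3D CNN feature extractor and the MHRNN stages themselves are omitted.

```python
import numpy as np

def extract_multiscale_patches(hsi, row, col, scales=(3, 5, 7)):
    """Extract one 3D patch (scale x scale x bands) per spatial scale,
    centered on pixel (row, col). Borders are reflect-padded (assumption)."""
    pad = max(scales) // 2
    padded = np.pad(hsi, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    r, c = row + pad, col + pad
    patches = []
    for s in scales:
        h = s // 2
        patches.append(padded[r - h:r + h + 1, c - h:c + h + 1, :])
    return patches

def eight_direction_sequences(features, row, col, length):
    """Build 1D sequences of per-pixel feature vectors by walking outward
    from (row, col) in the eight compass directions (N, NE, E, ..., NW).
    Out-of-image steps are clamped to the border (assumption)."""
    H, W, _ = features.shape
    directions = [(-1, 0), (-1, 1), (0, 1), (1, 1),
                  (1, 0), (1, -1), (0, -1), (-1, -1)]
    sequences = []
    for dr, dc in directions:
        seq = [features[row, col]]
        for step in range(1, length):
            r = min(max(row + dr * step, 0), H - 1)
            c = min(max(col + dc * step, 0), W - 1)
            seq.append(features[r, c])
        sequences.append(np.stack(seq))  # shape: (length, feature_dim)
    return sequences
```

In the paper's pipeline, each patch returned by the first function would be fed to a 3D CNN, and the eight directional sequences (built over the resulting local-feature map, not the raw cube as here) would be consumed by the hierarchical RNNs at each scale.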
