Article

Low-Rank Nonlocal Representation for Remote Sensing Scene Classification

Journal

IEEE Geoscience and Remote Sensing Letters

Publisher

IEEE (Institute of Electrical and Electronics Engineers, Inc.)
DOI: 10.1109/LGRS.2021.3049251

Keywords

Standards; Image resolution; Convolution; Context modeling; Computational modeling; Computational complexity; Training; Deep convolutional neural networks (CNNs); low-rank representation; nonlocal mechanism (NM); remote sensing scene (RSS) classification; supervised learning

Funding

  1. Zhejiang Provincial Natural Science Foundation of China [LQ19D010010]
  2. Chuzhou University's Research and Development Fund [2020qd45, 2020qd46]

Abstract

This letter introduces low-rank nonlocal representation (LNR), a global context acquisition mechanism with low time and space complexity that reduces computational cost while improving performance. Experimental results demonstrate that LNR significantly reduces model parameters in remote sensing scene classification tasks.
The nonlocal mechanism (NM) has shown its effectiveness in many real-world applications. However, it is usually criticized for its costly complexity in time and space, which are both O((HW)^2), where H × W is the spatial dimension of the input feature maps in height and width. In this letter, we first show theoretically that the NM can be expressed as a low-rank representation. We then propose a practical low-complexity global context acquisition mechanism, termed the low-rank nonlocal representation (LNR), whose time and space complexity are both approximately O(HW). LNR is a general module that can be deployed at any level of an arbitrary convolutional neural network (CNN) for any visual recognition task. To demonstrate its superiority, experiments are carried out on four standard remote sensing scene classification benchmarks. Experimental results show that the proposed LNR significantly reduces computation cost while boosting performance. In particular, implementing LNR on the classical ResNet reduces the number of model parameters by up to 3.93 M.
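The complexity argument above can be illustrated with a minimal NumPy sketch. This is not the authors' exact LNR module: the `basis` projection below is a hypothetical stand-in for a learned low-rank mapping, used only to show why the largest intermediate shrinks from HW × HW to HW × k.

```python
import numpy as np

H, W, C = 32, 32, 64
N = H * W  # number of spatial positions
x = np.random.randn(N, C).astype(np.float32)  # flattened feature map

# Standard nonlocal block: materializes an N x N pairwise affinity
# matrix, so time and space both scale as O((HW)^2).
affinity = x @ x.T                                   # (N, N)
weights = np.exp(affinity - affinity.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)        # softmax over positions
y_full = weights @ x                                 # (N, C)

# Low-rank sketch: project features onto k << N basis vectors first,
# so the largest intermediate is N x k and cost is roughly O(HW * k).
k = 16
basis = np.random.randn(C, k).astype(np.float32)     # hypothetical learned projection
coeff = x @ basis                                    # (N, k) per-position coefficients
context = coeff.T @ x                                # (k, C) global context summary
y_low = coeff @ context                              # (N, C) reconstructed response

print(affinity.shape, coeff.shape)  # (1024, 1024) vs (1024, 16)
```

For H = W = 32, the full affinity matrix holds over a million entries, while the low-rank path never stores more than 1024 × 16, which is the essence of the O((HW)^2) → O(HW) reduction claimed for LNR.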
