Journal
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS
Volume 19, Issue -, Pages -
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LGRS.2021.3049251
Keywords
Standards; Image resolution; Convolution; Context modeling; Computational modeling; Computational complexity; Training; Deep convolutional neural networks (CNNs); low-rank representation; nonlocal mechanism (NM); remote sensing scene (RSS) classification; supervised learning
Funding
- Zhejiang Provincial Natural Science Foundation of China [LQ19D010010]
- Chuzhou University's Research and Development Fund [2020qd45, 2020qd46]
This letter introduces low-rank nonlocal representation (LNR), a global context acquisition mechanism with low time and space complexity. Experimental results on remote sensing scene classification show that LNR reduces computation cost and model parameters while improving performance.
The nonlocal mechanism (NM) has shown its effectiveness in many real-world applications. However, it is usually criticized for its costly time and space complexity, both O(H^2 W^2), where H × W is the spatial dimension of the input feature maps in height and width. In this letter, we first show theoretically that the NM can be expressed as a low-rank representation. We then propose a practical low-complexity global context acquisition mechanism, termed low-rank nonlocal representation (LNR), whose time and space complexity are both approximately O(HW). LNR is a general module that can be deployed at any convolutional neural network (CNN) hierarchy for any visual recognition task. To demonstrate its superiority, experiments are carried out on four standard remote sensing scene classification benchmarks. The results show that LNR significantly reduces computation cost while boosting performance. Implementing LNR on the classical ResNet, in particular, reduces model parameters by up to 3.93 M.
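The complexity claim in the abstract can be illustrated with a toy sketch. The code below is not the paper's LNR module; it is a generic low-rank approximation of nonlocal attention, where a hypothetical random pooling matrix compresses the HW positions into k landmark features, cutting the affinity computation from O(H^2 W^2) to roughly O(HW · k), i.e. O(HW) for a small constant k.

```python
import numpy as np

def nonlocal_full(x):
    """Standard nonlocal block on a flattened feature map x of shape (HW, C).

    Builds the full (HW, HW) affinity matrix, so time and space are
    O(H^2 W^2) in the number of spatial positions.
    """
    a = x @ x.T                                   # (HW, HW) pairwise affinities
    a = np.exp(a - a.max(axis=1, keepdims=True))  # stable softmax over positions
    a /= a.sum(axis=1, keepdims=True)
    return a @ x                                  # (HW, C) aggregated context

def nonlocal_low_rank(x, k=8, seed=0):
    """Low-rank variant: attend to k landmark features instead of HW positions.

    The (hypothetical) random pooling matrix p compresses the HW positions
    into k landmarks, so cost drops to O(HW * k) ~ O(HW) for constant k.
    """
    rng = np.random.default_rng(seed)
    hw, _ = x.shape
    p = rng.standard_normal((hw, k)) / np.sqrt(hw)  # pooling weights (illustrative)
    landmarks = p.T @ x                             # (k, C) compressed features
    a = x @ landmarks.T                             # (HW, k) affinities, not (HW, HW)
    a = np.exp(a - a.max(axis=1, keepdims=True))
    a /= a.sum(axis=1, keepdims=True)
    return a @ landmarks                            # (HW, C) approximate context
```

Both functions return a context tensor of the same shape as the input, but only the second avoids materializing the quadratic affinity matrix, which is the essential idea behind low-rank nonlocal schemes such as LNR.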