Journal
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS
Volume 19, Issue -, Pages -
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LGRS.2021.3049251
Keywords
Standards; Image resolution; Convolution; Context modeling; Computational modeling; Computational complexity; Training; Deep convolutional neural networks (CNNs); low-rank representation; nonlocal mechanism (NM); remote sensing scene (RSS) classification; supervised learning
Funding
- Zhejiang Provincial Natural Science Foundation of China [LQ19D010010]
- Chuzhou University's Research and Development Fund [2020qd45, 2020qd46]
This letter introduces the low-rank nonlocal representation (LNR), a global context acquisition mechanism with low time and space complexity that reduces computational cost while improving performance. Experimental results show that LNR can substantially reduce model parameters in remote sensing scene classification tasks.
The nonlocal mechanism (NM) has shown its effectiveness in many real-world applications. However, it is often criticized for its costly time and space complexity, both O((HW)^2), where H × W is the spatial size of the input feature maps in height and width. In this letter, we first show theoretically that the NM can be expressed as a low-rank representation. We then propose a practical low-complexity global context acquisition mechanism, termed the low-rank nonlocal representation (LNR), whose time and space complexity are both approximately O(HW). LNR is a general module that can be deployed at an arbitrary level of a convolutional neural network (CNN) hierarchy for any visual recognition task. To demonstrate its superiority, experiments are carried out on four standard remote sensing scene classification benchmarks. The results show that the proposed LNR significantly reduces computation cost while boosting performance. In particular, implementing LNR on the classical ResNet reduces model parameters by up to 3.93 M.
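The complexity gap described in the abstract can be illustrated with a minimal NumPy sketch. This is NOT the paper's exact LNR module; it is a hypothetical low-rank attention variant (random projection standing in for a learned one, rank `r` assumed small and fixed) that shows why avoiding the HW × HW affinity matrix brings the cost from O((HW)^2) down to roughly O(HW):

```python
import numpy as np

def nonlocal_block(x):
    # x: (HW, C) flattened feature map.
    # Standard nonlocal mechanism: the pairwise affinity matrix is
    # (HW x HW), so time and space cost are both O((HW)^2).
    affinity = x @ x.T                                  # (HW, HW)
    w = np.exp(affinity - affinity.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                   # row-wise softmax
    return w @ x                                        # (HW, C)

def low_rank_nonlocal(x, r=8, seed=0):
    # Hypothetical low-rank variant (illustrative only, not the paper's
    # LNR): project features onto r "bases" so no HW x HW matrix is ever
    # formed; for fixed small r the cost is O(HW * r), i.e. ~O(HW).
    rng = np.random.default_rng(seed)
    proj = rng.standard_normal((x.shape[1], r))         # stand-in for a
    scores = x @ proj                                   # learned projection
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)             # (HW, r)
    bases = attn.T @ x                                  # (r, C) global context
    return attn @ bases                                 # (HW, C)

HW, C = 64, 16
x = np.random.default_rng(1).standard_normal((HW, C))
full = nonlocal_block(x)       # materializes a (64, 64) affinity matrix
lite = low_rank_nonlocal(x)    # largest intermediate is only (64, 8)
```

Both paths return a globally aggregated feature map of the same shape; the difference is purely in the size of the intermediate attention tensors, which is where the quadratic cost of the standard NM arises.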
Authors