Article

Channel-Wise Correlation Calibrates Attention Module for Convolutional Neural Networks

Journal

JOURNAL OF SENSORS
Volume 2022, Issue -, Pages -

Publisher

HINDAWI LTD
DOI: 10.1155/2022/2000170

Keywords

-

Funding

  1. Science and Technology Project of State Grid Shanxi Electric Power Company [52051K21000B]


This study introduces a new channel attention module, the LC module (LCM), which calibrates the correlation between channel features by integrating layer-global information with channel dependence; experiments show it outperforms existing attention modules.
It is well known in image recognition that global features represent the whole and can generalize an entire object, while local features reflect the details; both are important for extracting more discriminative features. Recent research has shown that the performance of convolutional neural networks can be improved by introducing an attention module. In this paper, we propose a simple and effective channel attention module named the layer feature meets channel attention module (LC module, LCM), which combines layer-global information with channel dependence to calibrate the correlation between channel features and then adaptively recalibrates channel-wise feature responses. Compared with traditional channel attention methods, the LC module exploits the most significant information in the overall features, i.e., the parts that most need attention, to refine the relationships between channels. Through empirical studies on CIFAR-10, CIFAR-100, and mini-ImageNet, this work demonstrates its superiority over other attention modules in different DCNNs. Furthermore, we visualize the feature maps in two dimensions via class activation maps and intuitively analyze the effectiveness of the model.
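To make the recalibration idea concrete, below is a minimal sketch of a generic channel-attention block in the spirit described above, written in PyTorch. The exact LCM architecture is not given in this record, so the pooling-based layer-global descriptor, the bottleneck reduction ratio, and the name ChannelAttentionSketch are assumptions for illustration only.

import torch
import torch.nn as nn

class ChannelAttentionSketch(nn.Module):
    """Hypothetical channel-attention block: layer-global pooling + channel-dependence MLP."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Global average pooling collapses each H x W feature map to a single value,
        # giving a layer-global descriptor per channel (assumed design, not the paper's).
        self.pool = nn.AdaptiveAvgPool2d(1)
        # A bottleneck MLP models the dependence between channels and outputs
        # per-channel weights in [0, 1].
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)          # (B, C) layer-global descriptor
        w = self.fc(w).view(b, c, 1, 1)      # (B, C, 1, 1) channel weights
        return x * w                         # recalibrated channel-wise responses

# Example: recalibrate a batch of 64-channel feature maps.
feats = torch.randn(8, 64, 32, 32)
attn = ChannelAttentionSketch(channels=64)
out = attn(feats)                            # same shape as feats: (8, 64, 32, 32)

In a deeper network, such a block would typically be inserted after each convolutional stage so that every stage's channel responses are recalibrated before being passed on.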
