Article

Contrastive Learning via Local Activity

Journal

ELECTRONICS
Volume 12, Issue 1, Pages -

Publisher

MDPI
DOI: 10.3390/electronics12010147

Keywords

unsupervised; representation learning; non-backpropagation

Abstract

Contrastive learning (CL) helps deep networks discriminate between positive and negative pairs during learning. As a powerful unsupervised pretraining method, CL has greatly reduced the performance gap with supervised training. However, current CL approaches mainly rely on sophisticated augmentations, a large number of negative pairs, and chained gradient calculations, all of which are complex to use. To address these issues, we propose the local activity contrast (LAC) algorithm, an unsupervised method that uses two forward passes and a locally defined loss to learn meaningful representations. The learning target of each layer is to minimize the difference between the activation values of the two forward passes, effectively overcoming the aforementioned limitations of applying CL. We demonstrated that LAC can be a very useful pretraining method when reconstruction is used as the pretext task. Moreover, after pretraining with LAC, the networks exhibited competitive performance on various downstream tasks compared with other unsupervised learning methods.
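
The abstract gives enough of the mechanism to sketch in code: two forward passes per layer, a locally defined loss on the gap between their activations, and no gradient chaining across layers. Below is a minimal PyTorch sketch of that idea only; LocalLayer, lac_style_step, the noisy second view, and all hyperparameters are illustrative assumptions, and the paper's actual scheme for constructing the two passes (and for avoiding trivial collapse of activations) is not reproduced here.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the mechanism described in the abstract: each
# layer is trained with a layer-local loss on the difference between its
# activations in two forward passes. Gradients are not chained across
# layers; inputs to each layer are detached.

class LocalLayer(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.fc = nn.Linear(d_in, d_out)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.fc(x))

def lac_style_step(layers, optimizers, x_a, x_b):
    """One unsupervised step on two views x_a, x_b of the same input
    (e.g., an input and its reconstruction, per the pretext task)."""
    h_a, h_b = x_a, x_b
    for layer, opt in zip(layers, optimizers):
        a, b = layer(h_a), layer(h_b)
        loss = ((a - b) ** 2).mean()  # layer-local loss: activation gap
        opt.zero_grad()
        loss.backward()
        opt.step()
        # detach so no gradients flow back into earlier layers
        h_a, h_b = a.detach(), b.detach()

layers = [LocalLayer(784, 256), LocalLayer(256, 128)]
opts = [torch.optim.SGD(l.parameters(), lr=1e-2) for l in layers]
x = torch.randn(32, 784)
lac_style_step(layers, opts, x, x + 0.1 * torch.randn_like(x))
```

Because each layer's loss depends only on its own two activations, every layer can be updated independently, which is what removes the chained gradient calculations the abstract cites as a drawback of standard CL.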

