Article

CaCo: Both Positive and Negative Samples are Directly Learnable via Cooperative-Adversarial Contrastive Learning

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2023.3262608

Keywords

Contrastive learning; cooperative-adversarial learning; self-supervised learning; positive plus negative samples


As a representative self-supervised method, contrastive learning has achieved great success in unsupervised training of representations. It trains an encoder by distinguishing positive samples from negative ones given query anchors. These positive and negative samples play critical roles in defining the objective to learn the discriminative encoder, preventing it from learning trivial features. While existing methods heuristically choose these samples, we present a principled method where both positive and negative samples are directly learnable end-to-end with the encoder. We show that the positive and negative samples can be cooperatively and adversarially learned by minimizing and maximizing the contrastive loss, respectively. This yields cooperative positives and adversarial negatives with respect to the encoder, which are updated to continuously track the learned representation of the query anchors over mini-batches. The proposed method achieves 71.3% and 75.3% top-1 accuracy after 200 and 800 epochs, respectively, of pre-training a ResNet-50 backbone on ImageNet1K without tricks such as multi-crop or stronger augmentations. With multi-crop, it can be further boosted to 75.7%. The source code and pre-trained model are released at https://github.com/maple-research-lab/caco.
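The cooperative-adversarial idea described in the abstract can be illustrated with a toy sketch: given a query anchor, take one gradient-descent step on the InfoNCE loss with respect to the positive (cooperative) and one gradient-ascent step with respect to each negative (adversarial), so the negatives track the anchor and become harder. This is only a minimal illustration of the update directions with hand-derived gradients on raw embedding vectors; the function names, learning rate, and the absence of the encoder update and normalization are all simplifying assumptions, not the authors' implementation.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def info_nce(q, p, negs, tau=0.1):
    # Standard InfoNCE: negative log-softmax score of the positive.
    logits = [dot(q, p) / tau] + [dot(q, n) / tau for n in negs]
    m = max(logits)  # subtract max for numerical stability
    z = sum(math.exp(l - m) for l in logits)
    return -(logits[0] - m - math.log(z))

def caco_step(q, p, negs, tau=0.1, lr=0.5):
    """One cooperative-adversarial update of the samples (toy sketch)."""
    logits = [dot(q, p) / tau] + [dot(q, n) / tau for n in negs]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    w = [e / z for e in exps]  # softmax weights over positive + negatives
    # Cooperative positive: gradient DESCENT on the loss.
    # dL/dp = -(1/tau) * (1 - w_p) * q, so p moves toward the anchor q.
    new_p = [pi + lr * (1 - w[0]) / tau * qi for pi, qi in zip(p, q)]
    # Adversarial negatives: gradient ASCENT on the loss.
    # dL/dn_i = (1/tau) * w_i * q, so hard negatives also drift toward q.
    new_negs = [[ni + lr * w[i + 1] / tau * qi for ni, qi in zip(n, q)]
                for i, n in enumerate(negs)]
    return new_p, new_negs

q = [1.0, 0.0]
p = [0.5, 0.5]
negs = [[0.2, 0.9], [-0.3, 0.4]]
base = info_nce(q, p, negs)
new_p, new_negs = caco_step(q, p, negs)
```

Updating only the positive lowers the contrastive loss (cooperation), while updating only the negatives raises it (adversarial hardening), which mirrors the min/max roles stated in the abstract.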

