Journal
KNOWLEDGE-BASED SYSTEMS
Volume 260, Issue -, Pages -
Publisher: ELSEVIER
DOI: 10.1016/j.knosys.2022.110121
Keywords
Self-supervised learning; Contrastive learning; Hard negatives; Student-t distribution; Neighbor consistency constraint
Contrastive learning, as a self-supervised method, has achieved significant success by learning latent semantic class information. However, treating instances as classes hinders the learning of true latent semantic classes due to the presence of hard negatives. To address this, we propose a new contrastive learning framework called TNCC, which uses a loss based on Student-t distribution and incorporates a neighbor consistency constraint. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed framework.
Contrastive learning, as a self-supervised method, has achieved great success. Although it is an instance-level discriminative method, the model can eventually learn latent semantic class information. Its core idea is to pull different views of the same instance closer while pushing different instances apart. However, treating an instance as a class hinders the model from learning true latent semantic classes; the obstacle comes from instances (called hard negatives) that are similar to the anchor but do not belong to the same semantic class. In this paper, we propose a new contrastive learning framework based on the Student-t distribution with a neighbor consistency constraint (TNCC) to reduce the effect of hard negatives. Within this framework, we use a loss based on the Student-t distribution as the instance-level discriminative loss to keep hard negatives far away. Furthermore, we add a new neighbor consistency constraint to maintain consistency within semantic classes. Finally, we compare TNCC with recent state-of-the-art contrastive learning methods on five benchmark datasets to verify the effectiveness of the proposed framework.
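The abstract does not give TNCC's exact loss, but the idea of replacing the usual similarity kernel with a heavy-tailed Student-t kernel (as in t-SNE) can be sketched generically. The snippet below is a minimal illustration, not the paper's implementation: it plugs a Student-t kernel over squared Euclidean distance into an InfoNCE-style instance-discrimination loss, where the heavy tail damps the pull exerted by distant hard negatives. The function names, the degrees-of-freedom parameter `nu`, and the unnormalized form of the loss are all assumptions for illustration.

```python
import numpy as np

def student_t_similarity(a, b, nu=1.0):
    # Heavy-tailed Student-t kernel over squared Euclidean distance,
    # as used in t-SNE's low-dimensional space (assumed form, nu = d.o.f.).
    d2 = np.sum((a - b) ** 2, axis=-1)
    return (1.0 + d2 / nu) ** (-(nu + 1.0) / 2.0)

def t_contrastive_loss(anchor, positive, negatives, nu=1.0):
    # InfoNCE-style instance-discrimination loss with Student-t similarities.
    # anchor, positive: (d,) embeddings; negatives: (n, d) array.
    # Because the kernel's tail decays polynomially rather than
    # exponentially, far-away negatives contribute a flatter gradient,
    # which is one way to reduce the influence of hard negatives.
    pos = student_t_similarity(anchor, positive, nu)
    neg = student_t_similarity(anchor[None, :], negatives, nu)
    return -np.log(pos / (pos + neg.sum()))

rng = np.random.default_rng(0)
anchor = np.zeros(4)
negatives = rng.normal(loc=3.0, size=(8, 4))
# A closer positive yields a lower loss than a farther one.
loss_close = t_contrastive_loss(anchor, anchor + 0.1, negatives)
loss_far = t_contrastive_loss(anchor, anchor + 2.0, negatives)
```

A neighbor consistency constraint, as named in the abstract, would additionally encourage an anchor and its nearest neighbors in embedding space to receive consistent treatment; its exact form in TNCC is not specified here, so it is omitted from the sketch.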