Journal
KNOWLEDGE-BASED SYSTEMS
Volume 260
Publisher
ELSEVIER
DOI: 10.1016/j.knosys.2022.110121
Keywords
Self-supervised learning; Contrastive learning; Hard negatives; Student-t distribution; Neighbor consistency constraint
Contrastive learning, as a self-supervised method, has achieved significant success by learning latent semantic class information. However, treating each instance as its own class hinders the learning of true latent semantic classes due to the presence of hard negatives. To address this, we propose a new contrastive learning framework called TNCC, which uses a loss based on the Student-t distribution and incorporates a neighbor consistency constraint. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed framework.
Contrastive learning, as a self-supervised method, has achieved great success. Although it is an instance-level discriminative method, the model can eventually learn latent semantic class information. Its core idea is pulling different views of the same instance closer while pushing different instances apart. However, treating an instance as a class hinders the model from learning true latent semantic classes, which is caused by instances (called hard negatives) that are similar to the anchor but do not belong to the same semantic class. In this paper, we propose a new contrastive learning framework based on the Student-t distribution with a neighbor consistency constraint (TNCC) to reduce the effect of hard negatives. In this framework, we propose to use a loss based on the Student-t distribution as the instance-level discriminative loss to keep hard negatives far away. Furthermore, we add a new neighbor consistency constraint to maintain consistency within semantic classes. Finally, we compare TNCC with recent state-of-the-art contrastive learning methods on five benchmark datasets to verify the effectiveness of the proposed framework.
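To make the core idea concrete, the following is a minimal illustrative sketch (not the authors' exact TNCC loss) of an instance-level contrastive objective in which pairwise similarity is scored with a Student-t kernel rather than the exponential kernel of standard InfoNCE. The heavy tails of the Student-t kernel assign relatively flat, small similarities to distant points, which reduces the gradient contribution of hard negatives. The function name `t_contrastive_loss` and the degrees-of-freedom parameter `nu` are assumptions for illustration only.

```python
import numpy as np

def t_contrastive_loss(z_anchor, z_pos, z_negs, nu=1.0):
    """Illustrative instance-level contrastive loss with a Student-t kernel.

    z_anchor: (d,) embedding of the anchor view.
    z_pos:    (d,) embedding of the positive view (same instance).
    z_negs:   (k, d) embeddings of k negative instances.
    nu:       degrees of freedom of the Student-t kernel (assumed parameter).
    """
    def normalize(v):
        # Project embeddings onto the unit hypersphere, as is common
        # in contrastive learning.
        return v / np.linalg.norm(v, axis=-1, keepdims=True)

    a, p, n = normalize(z_anchor), normalize(z_pos), normalize(z_negs)

    def t_kernel(x, y):
        # Student-t kernel on squared Euclidean distance:
        # (1 + d^2 / nu) ** (-(nu + 1) / 2). Heavy tails mean that
        # moderately close ("hard") negatives are not weighted as
        # sharply as under an exponential (softmax) kernel.
        d2 = np.sum((x - y) ** 2, axis=-1)
        return (1.0 + d2 / nu) ** (-(nu + 1.0) / 2.0)

    pos = t_kernel(a, p)            # similarity to the positive view
    negs = t_kernel(a[None, :], n)  # similarities to the negatives

    # Normalized-probability form analogous to InfoNCE: maximize the
    # relative similarity of the positive pair.
    return -np.log(pos / (pos + negs.sum()))
```

As expected of a contrastive objective, the loss is small when the positive view stays near the anchor and grows when it drifts toward the negatives; the `nu` parameter controls how heavy the kernel's tails are.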