3.8 Proceedings Paper

Graph Contrastive Clustering

Publisher

IEEE
DOI: 10.1109/ICCV48922.2021.00909

Keywords

-

Funding

  1. Key-Area Research and Development (R&D) Program of Guangdong Province [2019B121204008]
  2. National Natural Science Foundation (NSF) of China [62006140, 61625301, 61731018]
  3. Key R&D Program of Shandong Province (Major scientific and technological innovation projects) [2020CXGC010111]
  4. NSF of Shandong Province [ZR2020QF106]
  5. PKU-Baidu Fund [2020BD006]
  6. Major Scientific Research Project of Zhejiang Lab [2019KB0AC01, 2019KB0AB02]
  7. Beijing Academy of Artificial Intelligence


A novel graph contrastive learning framework, applied to the clustering task as the Graph Contrastive Clustering (GCC) method, improves clustering performance by lifting instance-level consistency to cluster-level consistency. The method uses a graph Laplacian based contrastive loss and a novel graph-based contrastive learning strategy to learn more discriminative features and more compact clustering assignments. Experiments show the superiority of the proposed approach over state-of-the-art methods on six commonly used datasets.
Recently, some contrastive learning methods have been proposed to simultaneously learn representations and clustering assignments, achieving significant improvements. However, these methods do not take the category information and clustering objective into consideration, so the learned representations are not optimal for clustering and the performance may be limited. To address this issue, we first propose a novel graph contrastive learning framework and then apply it to the clustering task, resulting in the Graph Contrastive Clustering (GCC) method. Different from basic contrastive clustering, which only assumes that an image and its augmentation should share similar representations and clustering assignments, we lift instance-level consistency to cluster-level consistency with the assumption that samples in one cluster and their augmentations should all be similar. Specifically, on the one hand, we propose a graph Laplacian based contrastive loss to learn more discriminative and clustering-friendly features. On the other hand, we propose a novel graph-based contrastive learning strategy to learn more compact clustering assignments. Both incorporate the latent category information to reduce the intra-cluster variance and increase the inter-cluster variance. Experiments on six commonly used datasets demonstrate the superiority of the proposed approach over state-of-the-art methods.
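To make the idea of graph-based positives concrete, the sketch below is a minimal, hypothetical PyTorch rendering (not the authors' released code) of a contrastive loss in which a kNN similarity graph over the batch supplies extra positive pairs beyond the usual instance-augmentation pair, approximating the cluster-level consistency assumption. The function names knn_graph and graph_contrastive_loss, the kNN construction, and all hyperparameters are illustrative assumptions.

    # Minimal sketch (assumed, not the authors' implementation) of a graph-based
    # contrastive loss: neighbours in a kNN graph over the batch are treated as
    # additional positives, so consistency is enforced at the cluster level rather
    # than only between an image and its own augmentation.
    import torch
    import torch.nn.functional as F

    def knn_graph(features, k=5):
        """Build a symmetric binary adjacency over the batch from cosine similarity (assumed kNN graph)."""
        z = F.normalize(features, dim=1)
        sim = z @ z.t()                        # (N, N) cosine similarities
        sim.fill_diagonal_(-float("inf"))      # exclude self-links from neighbour search
        idx = sim.topk(k, dim=1).indices       # k nearest neighbours per sample
        adj = torch.zeros_like(sim).scatter_(1, idx, 1.0)
        return ((adj + adj.t()) > 0).float()   # symmetrize

    def graph_contrastive_loss(z1, z2, adj, temperature=0.5):
        """Contrast each sample against the batch; graph neighbours count as extra positives."""
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        logits = (z1 @ z2.t()) / temperature                          # cross-view similarities
        log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
        pos_mask = adj + torch.eye(adj.size(0), device=adj.device)    # neighbours + own augmentation
        loss = -(pos_mask * log_prob).sum(1) / pos_mask.sum(1)        # mean log-likelihood of positives
        return loss.mean()

    # Usage: z1, z2 are embeddings of a batch and its augmented views.
    z1, z2 = torch.randn(32, 128), torch.randn(32, 128)
    loss = graph_contrastive_loss(z1, z2, knn_graph(z1.detach(), k=5))

Building the positive mask from a graph over detached features is one plausible way to inject latent category information; the graph could equally be derived from current clustering assignments.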
