Proceedings Paper

Self-Supervised Convolutional Subspace Clustering Network

Publisher

IEEE Computer Society
DOI: 10.1109/CVPR.2019.00562


Funding

  1. National Natural Science Foundation of China (NSFC) [61876022]
  2. Key Laboratory of Machine Perception (MOE), Peking University
  3. NSFC [61701032, 61806184, 61625301, 61731018]
  4. Shenzhen Fundamental Research Fund [ZDSYS201707251409055, 2017ZT07X152]
  5. 973 Program of China [2015CB352502]
  6. Qualcomm
  7. Microsoft Research Asia

Abstract

Subspace clustering methods based on data self-expression have become very popular for learning from data that lie in a union of low-dimensional linear subspaces. However, the applicability of subspace clustering has been limited because practical visual data in raw form do not necessarily lie in such linear subspaces. On the other hand, while Convolutional Neural Networks (ConvNets) have been demonstrated to be a powerful tool for extracting discriminative features from visual data, training such a ConvNet usually requires a large amount of labeled data, which are unavailable in subspace clustering applications. To achieve simultaneous feature learning and subspace clustering, we propose an end-to-end trainable framework, called the Self-Supervised Convolutional Subspace Clustering Network (S²ConvSCN), that combines a ConvNet module (for feature learning), a self-expression module (for subspace clustering), and a spectral clustering module (for self-supervision) into a joint optimization framework. In particular, we introduce a dual self-supervision that exploits the output of spectral clustering to supervise the training of the feature learning module (via a classification loss) and the self-expression module (via a spectral clustering loss). Our experiments on four benchmark datasets show the effectiveness of the dual self-supervision and demonstrate the superior performance of our proposed approach.
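The self-expression idea underlying the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's actual network: S²ConvSCN learns the self-expression coefficients jointly with ConvNet features by end-to-end training, whereas the sketch below computes a closed-form ridge-regularized solution of min_C ||Z − ZC||²_F + λ||C||²_F on fixed features, zeroes the diagonal as a simplification of the diag(C) = 0 constraint, and builds the symmetric affinity |C| + |C|ᵀ that would be passed to spectral clustering.

```python
import numpy as np

def self_expression(Z, lam=1e-2):
    """Closed-form ridge solution to min_C ||Z - Z C||_F^2 + lam ||C||_F^2.

    Z : (d, n) feature matrix whose columns are samples.
    Returns an (n, n) coefficient matrix C with zeroed diagonal
    (a simplification of the constraint diag(C) = 0).
    """
    n = Z.shape[1]
    G = Z.T @ Z                                   # Gram matrix of the samples
    C = np.linalg.solve(G + lam * np.eye(n), G)   # ridge-regularized coefficients
    np.fill_diagonal(C, 0.0)                      # a sample must not represent itself
    return C

def affinity(C):
    """Symmetric affinity matrix fed to spectral clustering."""
    return np.abs(C) + np.abs(C).T
```

For data drawn from independent linear subspaces, the affinity concentrates within each subspace, which is why spectral clustering on |C| + |C|ᵀ recovers the segmentation; the paper's contribution is to keep refining both the features Z and the coefficients C using the clustering output itself as supervision.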


