Proceedings Paper

Learning a Self-Expressive Network for Subspace Clustering

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/CVPR46437.2021.01221

Funding

  1. National Natural Science Foundation of China [61876022]
  2. Northrop Grumman Mission Systems Research in Applications for Learning Machines
  3. NSF [1704458, 2031985, 1934979]
  4. Tsinghua-Berkeley Shenzhen Institute Research Fund
  5. NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1934979]
  6. NSF Directorate for Mathematical & Physical Sciences, Division of Mathematical Sciences [2031985]

Abstract

In this paper, a novel subspace clustering framework, SENet, is proposed that learns self-expressive coefficients, handles out-of-sample data, and performs subspace clustering on large-scale datasets. Extensive experiments demonstrate the effectiveness of SENet on various benchmark datasets.
State-of-the-art subspace clustering methods are based on the self-expressive model, which represents each data point as a linear combination of other data points. However, such methods are designed for a finite sample dataset and lack the ability to generalize to out-of-sample data. Moreover, since the number of self-expressive coefficients grows quadratically with the number of data points, their ability to handle large-scale datasets is often limited. In this paper, we propose a novel framework for subspace clustering, termed Self-Expressive Network (SENet), which employs a properly designed neural network to learn a self-expressive representation of the data. We show that our SENet can not only learn the self-expressive coefficients with desired properties on the training data, but also handle out-of-sample data. In addition, we show that SENet can be leveraged to perform subspace clustering on large-scale datasets. Extensive experiments conducted on synthetic data and real-world benchmark data validate the effectiveness of the proposed method. In particular, SENet yields highly competitive performance on MNIST, Fashion MNIST and Extended MNIST, and state-of-the-art performance on CIFAR-10.
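
The abstract describes the core idea: instead of storing an n x n coefficient matrix, a neural network maps pairs of data points to self-expressive coefficients c_ij, so that each point x_i is approximately reconstructed as a linear combination of the other points; because the coefficients come from a network rather than a fixed matrix, they can also be evaluated on out-of-sample points. The following is a minimal PyTorch sketch of this idea, assuming a simple two-branch (query/key) MLP parameterization and an l1-style regularizer; the class names, architecture, loss weights, and training loop are illustrative assumptions, not the authors' exact design.

```python
import torch
import torch.nn as nn

class SelfExpressiveNet(nn.Module):
    """Minimal sketch of a self-expressive network (illustrative, not the paper's exact architecture).

    Each coefficient c_ij is predicted from a pair of points (x_i, x_j) by two small
    MLP branches ("query" and "key"), so coefficients can also be computed for
    points not seen during training (out-of-sample data).
    """

    def __init__(self, dim, hidden=1024, out=1024):
        super().__init__()
        self.query = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, out))
        self.key = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, out))

    def coefficients(self, x):
        q = self.query(x)                    # (n, out)
        k = self.key(x)                      # (n, out)
        c = q @ k.t()                        # (n, n) pairwise coefficients c_ij
        c = c - torch.diag(torch.diag(c))    # zero diagonal: no point represents itself
        return c

def self_expressive_loss(net, x, lam=0.9):
    """Reconstruction (x_i ~ sum_j c_ij x_j) plus an l1-style penalty on C (illustrative weights)."""
    c = net.coefficients(x)
    recon = ((x - c @ x) ** 2).sum(dim=1).mean()
    reg = c.abs().sum(dim=1).mean()
    return lam * recon + (1.0 - lam) * reg

# Toy usage on random data.
if __name__ == "__main__":
    x = torch.randn(256, 32)
    net = SelfExpressiveNet(dim=32)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(100):
        opt.zero_grad()
        loss = self_expressive_loss(net, x)
        loss.backward()
        opt.step()
    affinity = net.coefficients(x).abs()     # can serve as an affinity for spectral clustering
    print(affinity.shape)                    # torch.Size([256, 256])
```

As in self-expressive subspace clustering generally, the learned coefficient matrix (symmetrized, e.g. |C| + |C|^T) would then be passed to spectral clustering to obtain the final subspace segmentation.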
