Article

One-Step Adaptive Spectral Clustering Networks

Journal

IEEE Signal Processing Letters
Volume 29, Pages 2263-2267

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/LSP.2022.3217441

Keywords

Adaptation models; linear programming; affinity matrix learning; deep spectral clustering; spectral rotation

Funding

  1. National Natural Science Foundation of China [62006131, 62071260]
  2. Natural Science Foundation of Zhejiang Province [LZ22F020001, LQ21F020009, LQ18F020001]
  3. K. C. Wong Magna Fund of Ningbo University

Abstract

In this letter, the authors propose a one-step adaptive spectral clustering network that unifies affinity matrix learning, spectral embedding learning, and indicator learning in a single framework. Experiments on multiple real datasets demonstrate improved clustering performance.
Deep spectral clustering is a popular and efficient approach in unsupervised learning. However, existing deep spectral clustering methods are organized into three separate steps: affinity matrix learning, spectral embedding learning, and K-means clustering on the spectral embedding. Although each step can be optimized on its own, this separation makes it difficult to obtain robust clustering results. In this letter, we propose a one-step adaptive spectral clustering network to overcome this shortcoming. The network embeds affinity matrix learning, spectral embedding learning, and indicator learning into a unified framework. The affinity matrix is adaptively adjusted by the spectral embedding in a deep subspace. We introduce spectral rotation to discretize the spectral embedding, so that the embedding and the cluster indicator are learned simultaneously, which improves clustering quality. Each part of the model is iteratively updated based on the others to optimize the clustering results. Experimental results on four real datasets show the effectiveness of our method under the ACC and NMI clustering metrics. In particular, our method achieves an NMI of 0.932 and an ACC of 0.973 on the MNIST dataset, a clear improvement over the best baseline.
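
To make the three steps the letter unifies concrete, below is a minimal NumPy sketch of the classical pipeline: a Gaussian-kernel affinity matrix, spectral embedding from the normalized affinity, and spectral rotation (in the style of Yu and Shi's multiclass discretization) in place of K-means, so that a discrete indicator and a rotation of the embedding are found jointly. This is an illustrative sketch, not the authors' network; the function names and parameters (`sigma`, `n_iter`) are assumptions, not from the paper.

```python
import numpy as np

def gaussian_affinity(X, sigma=1.0):
    """Dense Gaussian-kernel affinity matrix W with zero diagonal."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    W = np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    return W

def spectral_embedding(W, k):
    """Row-normalized top-k eigenvectors of the symmetrically
    normalized affinity D^{-1/2} W D^{-1/2}."""
    d = W.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d + 1e-12))
    L_sym = D_inv_sqrt @ W @ D_inv_sqrt
    _, vecs = np.linalg.eigh(L_sym)       # eigenvalues ascending
    F = vecs[:, -k:]                      # n x k continuous embedding
    return F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-12)

def spectral_rotation(F, n_iter=50):
    """Alternate between a discrete indicator Y and an orthogonal
    rotation R so that Y ~ F R, instead of running K-means on F."""
    n, k = F.shape
    R = np.eye(k)
    for _ in range(n_iter):
        # Given R, assign each row to its largest column -> discrete Y.
        labels = (F @ R).argmax(axis=1)
        Y = np.zeros((n, k))
        Y[np.arange(n), labels] = 1.0
        # Given Y, the optimal rotation is a Procrustes step via SVD.
        U, _, Vt = np.linalg.svd(F.T @ Y)
        R = U @ Vt
    return labels

# Usage: cluster toy 2-D data into 3 groups.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.1, size=(50, 2))
               for c in ([0, 0], [2, 2], [0, 2])])
F = spectral_embedding(gaussian_affinity(X, sigma=0.5), k=3)
print(spectral_rotation(F)[:10])
```

In the separated pipeline each of these stages is solved once and frozen; the letter's one-step network instead lets the affinity matrix, the embedding, and the indicator update one another iteratively within a single model.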
