Article

One-Step Adaptive Spectral Clustering Networks

Journal

IEEE SIGNAL PROCESSING LETTERS
Volume 29, Pages 2263-2267

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/LSP.2022.3217441

Keywords

Adaptation models; Linear programming; Affinity matrix learning; deep spectral clustering; spectral rotation

Funding

  1. National Natural Science Foundation of China [62006131, 62071260]
  2. Natural Science Foundation of Zhejiang Province [LZ22F020001, LQ21F020009, LQ18F020001]
  3. K. C. Wong Magna Fund of Ningbo University


In this letter, the authors propose a one-step adaptive spectral clustering network that combines affinity matrix learning, spectral embedding learning, and indicator learning into a unified framework. Experimental results demonstrate the effectiveness of the proposed method in achieving improved clustering performance on multiple real datasets.
Deep spectral clustering is a popular and efficient approach in unsupervised learning. However, existing deep spectral clustering methods are organized into three separate steps: affinity matrix learning, spectral embedding learning, and K-means clustering on the spectral embedding. Although each step can be optimized individually, it remains difficult to obtain robust clustering results in this setting. In this letter, we propose a one-step adaptive spectral clustering network to overcome these shortcomings. The network embeds the three parts of affinity matrix learning, spectral embedding learning, and indicator learning into a unified framework. The affinity matrix is adaptively adjusted by the spectral embedding in a deep subspace. We introduce spectral rotation to discretize the spectral embedding, which allows the spectral embedding and the indicator to be learned simultaneously and improves clustering quality. Each part of the model can be iteratively updated based on the other parts to optimize the clustering results. Experimental results on four real datasets show the effectiveness of our method on the ACC and NMI clustering evaluation metrics. In particular, our method achieves an NMI of 0.932 and an ACC of 0.973 on the MNIST dataset, a solid performance boost compared to the best baseline.
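For context, the classical three-step pipeline that the abstract critiques (fixed affinity matrix, spectral embedding via the normalized Laplacian, then K-means on the embedding) can be sketched in a few lines of NumPy. This is a minimal illustration of the conventional baseline, not the letter's one-step network; the Gaussian affinity, the bandwidth `sigma`, and the farthest-point K-means initialization are assumptions made here for a self-contained example.

```python
import numpy as np

def spectral_clustering(X, k, sigma=1.0, n_iter=100):
    """Classical three-step spectral clustering (illustrative baseline only)."""
    # Step 1: fixed Gaussian affinity matrix from pairwise squared distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Step 2: spectral embedding = smallest-k eigenvectors of the
    # normalized Laplacian L_sym = I - D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_is = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(X)) - d_is[:, None] * W * d_is[None, :]
    _, vecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    F = vecs[:, :k]
    F = F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-12)

    # Step 3: K-means on the rows of the embedding,
    # with deterministic farthest-point initialization
    centers = F[[0]]
    for _ in range(1, k):
        dist = ((F[:, None, :] - centers[None]) ** 2).sum(-1).min(axis=1)
        centers = np.vstack([centers, F[dist.argmax()]])
    for _ in range(n_iter):
        labels = ((F[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = F[labels == j].mean(axis=0)
    return labels
```

The letter's point is that the three blocks above are decoupled: the affinity `W` never benefits from the learned embedding, and K-means is detached from the spectral objective, whereas the proposed network couples all three and replaces the K-means step with spectral rotation.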

