Article

Sparse and low-rank regularized deep subspace clustering

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 204, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2020.106199

Keywords

Subspace clustering; Self-expressive matrix; Low-rank; Deep neural network

Funding

  1. Natural Science Foundation of Zhejiang Province, China [LQ20F030015]

Abstract

Subspace clustering aims to discover the intrinsic structure of data in an unsupervised fashion. In most existing approaches, an affinity matrix is learned from the original data or hand-crafted features under certain constraints on the self-expressive matrix (SEM), and a spectral clustering algorithm is then applied. Following the success of deep learning, it has become popular to learn deep features and the self-representation jointly for subspace clustering. However, the deep features and the SEM in previous deep methods lack precise constraints, which makes them sub-optimal with respect to the linear subspace model. To address this, we propose sparse and low-rank regularized deep subspace clustering (SLR-DSC), an end-to-end framework that imposes a sparsity constraint on the deep features and a low-rank constraint on the SEM. The sparse deep features and the low-rank regularized SEM, implemented via fully-connected layers, are encouraged to produce a more informative affinity matrix. To solve the nuclear norm minimization problem within back-propagation, a sub-gradient computation strategy is adopted so that the chain rule still applies. Experiments on benchmark data sets demonstrate that our method significantly outperforms competitive unsupervised subspace clustering approaches. (C) 2020 Elsevier B.V. All rights reserved.
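
The abstract does not spell out how the nuclear-norm sub-gradient is wired into back-propagation, so the following is only a minimal sketch, assuming a PyTorch-style autograd setting; the class name NuclearNorm, the matrix size, and the weight lambda_lr are illustrative and not taken from the paper. It relies on the standard fact that, for a thin SVD C = U S V^T, the matrix U V^T is a sub-gradient of the nuclear norm ||C||_* at C.

import torch

class NuclearNorm(torch.autograd.Function):
    """Nuclear norm ||C||_* with a hand-written sub-gradient,
    so the regularizer can participate in the chain rule."""

    @staticmethod
    def forward(ctx, C):
        # Thin SVD: C = U diag(S) Vh
        U, S, Vh = torch.linalg.svd(C, full_matrices=False)
        ctx.save_for_backward(U, Vh)
        return S.sum()  # nuclear norm = sum of singular values

    @staticmethod
    def backward(ctx, grad_output):
        U, Vh = ctx.saved_tensors
        # U @ Vh is a standard sub-gradient of ||C||_* at C.
        return grad_output * (U @ Vh)

# Illustrative usage: penalize a self-expressive matrix C (hypothetical names).
C = torch.randn(64, 64, requires_grad=True)
lambda_lr = 0.1  # assumed low-rank regularization weight
loss = lambda_lr * NuclearNorm.apply(C)
loss.backward()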

