Journal
NEURAL NETWORKS
Volume 35, Issue -, Pages 21-30
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2012.06.007
Keywords
Hierarchical clustering; Spectral clustering; Kernel methods; Out-of-sample extensions
Funding
- Research Council KUL [GOA/11/05, GOA/10/09, CoE EF/05/006, IOF-SCORES4CHEM]
- Flemish Government: FWO [G0226.06, G0321.06, G.0302.07, G.0320.08, G.0558.08, G.0557.08, G.0588.09, G.0377.12, G.0377.09]
- Belgian Federal Science Policy Office [IUAP P6/04]
Abstract
Kernel spectral clustering fits in a constrained optimization framework where the primal problem is expressed in terms of high-dimensional feature maps and the dual problem is expressed in terms of kernel evaluations. An eigenvalue problem is solved at the training stage, and projections onto the eigenvectors constitute the clustering model. The formulation allows out-of-sample extensions, which are useful for model selection in a learning setting. In this work, we propose a methodology to reveal the hierarchical structure present in the data. During the model selection stage, several clustering model parameters leading to good clusterings can be found. These results are then combined to display the underlying cluster hierarchies, where the optimal depth of the tree is automatically determined. Simulations with toy data and real-life problems show the benefits of the proposed approach. (C) 2012 Elsevier Ltd. All rights reserved.
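To illustrate the pipeline the abstract describes (solve an eigenvalue problem on kernel evaluations at training time, then score new points by projecting their kernel evaluations onto the trained eigenvectors), here is a minimal NumPy sketch. This is not the paper's exact formulation: the RBF kernel, the random-walk normalization D^{-1}K, and the function names are illustrative assumptions, and the full method involves a primal-dual constrained optimization and hierarchical model combination not shown here.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise RBF kernel evaluations K(a, b) = exp(-||a - b||^2 / (2 sigma^2)).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def train(X, k, sigma=1.0):
    # Training stage: eigenvalue problem on the (normalized) kernel matrix.
    K = rbf_kernel(X, X, sigma)
    d_inv = 1.0 / K.sum(axis=1)            # inverse degrees
    M = d_inv[:, None] * K                 # D^{-1} K (row-stochastic)
    w, V = np.linalg.eig(M)
    order = np.argsort(-w.real)
    # Keep the k-1 leading nontrivial eigenvectors (the first is constant,
    # since M has row sums equal to 1); they define the clustering model.
    alpha = V[:, order[1:k]].real
    return (X, alpha, sigma)

def project(model, X_new):
    # Out-of-sample extension: score new points purely through kernel
    # evaluations against the training set, projected onto the eigenvectors.
    X_tr, alpha, sigma = model
    return rbf_kernel(X_new, X_tr, sigma) @ alpha
```

For two well-separated clusters (k = 2), the sign of the single projection column assigns a cluster to any point, including points not seen at training time; this is what makes validation-set model selection possible.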