Journal
NEURAL COMPUTING & APPLICATIONS
Volume 21, Issue 7, Pages 1791-1799
Publisher
SPRINGER
DOI: 10.1007/s00521-012-0827-3
Keywords
Feature selection; K-means clustering; Maximum margin criterion; Regularization
Funding
- National Natural Science Foundation of China [60975038, 60105003]
Abstract
In this article, we present an unsupervised maximum margin feature selection algorithm via sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L(2,1)-norm regularization is applied to the transformation matrix to enable joint feature selection across all data samples. The method reduces to a convex optimization problem, which we solve with an iterative algorithm that converges to the optimal solution. A convergence analysis of the algorithm is also provided. Experimental results demonstrate the efficiency of our algorithm.
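To make the L(2,1)-norm regularization concrete: the norm sums the Euclidean norms of the rows of the transformation matrix, so penalizing it drives entire rows to zero and discards the corresponding features for all samples at once. The sketch below (not the paper's full algorithm) illustrates the norm itself and the diagonal reweighting matrix commonly used in iterative L(2,1)-minimization schemes; all names are illustrative.

```python
import numpy as np

def l21_norm(W):
    """L(2,1)-norm: the sum of the Euclidean norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def reweight_matrix(W, eps=1e-8):
    """Diagonal matrix D with d_ii = 1 / (2 * ||w_i||_2), a standard
    device in iterative L(2,1)-minimization (eps guards zero rows).
    Illustrative helper, not taken from the paper."""
    row_norms = np.linalg.norm(W, axis=1)
    return np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))

# Rows of W correspond to features; a zero row means that feature
# is discarded jointly across all data samples.
W = np.array([[3.0, 4.0],   # row norm 5
              [0.0, 0.0],   # row norm 0 -> feature dropped
              [1.0, 0.0]])  # row norm 1
print(l21_norm(W))  # 6.0
```

Because the penalty acts on whole rows rather than individual entries, it yields row-sparsity, which is what makes the selection consistent across every sample rather than per-sample.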