Article

Unsupervised maximum margin feature selection via L2,1-norm minimization

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 21, Issue 7, Pages 1791-1799

Publisher

SPRINGER
DOI: 10.1007/s00521-012-0827-3

Keywords

Feature selection; K-means clustering; Maximum margin criterion; Regularization

Funding

  1. National Natural Science Foundation of China [60975038, 60105003]

Abstract

In this article, we present an unsupervised maximum margin feature selection algorithm via sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L2,1-norm regularization is applied to the transformation matrix to enable feature selection across all data samples. Our method is equivalent to solving a convex optimization problem, and we solve it with an iterative algorithm that converges to an optimal solution. A convergence analysis of the algorithm is also provided. Experimental results demonstrate the efficiency of our algorithm.
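The L2,1 norm mentioned in the abstract sums the Euclidean norms of a matrix's rows; penalizing it drives entire rows of the transformation matrix toward zero, so the corresponding features can be discarded. A minimal sketch of this idea (not the authors' implementation; the matrix `W` and selection count `k` here are illustrative):

```python
import numpy as np

def l21_norm(W):
    """L2,1 norm: sum of the L2 norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def select_features(W, k):
    """Rank features by the L2 norm of their corresponding rows of W
    and return the indices of the k highest-scoring features."""
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy transformation matrix: row 1 is all zeros, so feature 1
# contributes nothing and is never selected.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0]])
print(l21_norm(W))            # 5 + 0 + 1 = 6.0
print(select_features(W, 2))  # features 0 and 2
```

In the paper's setting, `W` would be learned jointly with the K-means clustering objective under this regularizer rather than fixed in advance.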
