Article

Unsupervised feature selection by self-paced learning regularization

Journal

PATTERN RECOGNITION LETTERS
Volume 132, Pages 4-11

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2018.06.029

Keywords

Feature selection; Self-paced learning; Robust statistic

Funding

  1. China Key Research Program [2016YFB1000905]
  2. National Natural Science Foundation of China [661573270, 61672177]
  3. Project of Guangxi Science and Technology [GuiKeAD17195062]
  4. Guangxi Natural Science Foundation [2015GXNSFCB139011]
  5. Innovation Project of Guangxi Graduate Education [YCSW2018093]
  6. Guangxi Collaborative Innovation Center of Multi-Source Information Integration and Intelligent Processing
  7. Guangxi Bagui Teams for Innovation and Research
  8. Guangxi High Institutions Program of Introducing 100 High-Level Overseas Talents
  9. Research Fund of Guangxi Key Lab of Multi-source Information Mining Security [18-A-01-01]

Abstract

Previous feature selection methods treat all samples equally when selecting important features. In practice, however, samples differ in quality: outliers should receive small or even zero weights, while important samples should receive large weights. In this paper, we add a self-paced regularization term to a sparse feature selection model to reduce the impact of outliers on feature selection. Specifically, the proposed method automatically selects a subset of the most important samples to build an initial feature selection model, and then improves the model's generalization ability by gradually involving further important samples until a robust and generalized feature selection model is established or all samples have been used. Experimental results on eight real datasets show that the proposed method outperforms the comparison methods. (c) 2018 Elsevier B.V. All rights reserved.
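The alternation the abstract describes — fit the model on the currently "easy" samples, then re-admit samples whose loss falls below a growing pace parameter — can be sketched as follows. This is a minimal illustration of self-paced learning with the hard (binary-weight) regularizer around a simple ridge model; the paper's actual formulation is an unsupervised l2,1-regularized feature selection model, and the data, parameter names (`lam`, `mu`, `alpha`), and thresholds here are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (hypothetical, not the paper's benchmarks): only the first
# 3 of 10 features are informative; the first 5 samples are gross outliers.
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [3.0, -2.0, 1.5]
y = X @ w_true + 0.1 * rng.normal(size=n)
y[:5] += 25.0  # inject outlier responses

def self_paced_fit(X, y, lam=4.0, mu=1.5, n_iter=8, alpha=1e-2):
    """Hard self-paced learning around a ridge model (simplified sketch)."""
    n, d = X.shape
    v = np.ones(n)                 # per-sample weights in {0, 1}
    w = np.zeros(d)
    for _ in range(n_iter):
        # (a) fit the model on the currently admitted ("easy") samples:
        #     solve (X' V X + alpha I) w = X' V y
        Xv = X * v[:, None]
        w = np.linalg.solve(Xv.T @ X + alpha * np.eye(d), Xv.T @ y)
        # (b) hard self-paced regularizer: keep samples whose loss < lam,
        #     so outliers with large loss get zero weight
        loss = (y - X @ w) ** 2
        v = (loss < lam).astype(float)
        lam *= mu                  # grow the pace: admit harder samples next
    return w, v

w, v = self_paced_fit(X, y)
print("outlier sample weights:", v[:5])            # expected near 0
print("top-ranked features:", np.argsort(-np.abs(w))[:3])
```

The pace parameter `lam` starts small, so only low-loss samples survive the first selection; multiplying it by `mu > 1` each round gradually admits harder samples, while the gross outliers never fall below the threshold and keep zero weight — the behavior the abstract attributes to the proposed regularization.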
