Article

Unsupervised feature selection by self-paced learning regularization

Journal

PATTERN RECOGNITION LETTERS
Volume 132, Pages 4-11

Publisher

ELSEVIER
DOI: 10.1016/j.patrec.2018.06.029

Keywords

Feature selection; Self-paced learning; Robust statistic

Funding

  1. China Key Research Program [2016YFB1000905]
  2. National Natural Science Foundation of China [61573270, 61672177]
  3. Project of Guangxi Science and Technology [GuiKeAD17195062]
  4. Guangxi Natural Science Foundation [2015GXNSFCB139011]
  5. Innovation Project of Guangxi Graduate Education [YCSW2018093]
  6. Guangxi Collaborative Innovation Center of Multi-Source Information Integration and Intelligent Processing
  7. Guangxi Bagui Teams for Innovation and Research
  8. Guangxi High Institutions Program of Introducing 100 High-Level Overseas Talents
  9. Research Fund of Guangxi Key Lab of Multi-source Information Mining Security [18-A-01-01]


Previous feature selection methods treat all samples equally when selecting important features. In practice, however, samples are diverse: outliers should receive small or even zero weights, while important samples should receive large weights. In this paper, we add a self-paced regularization to the sparse feature selection model to reduce the impact of outliers on feature selection. Specifically, the proposed method automatically selects a sample subset containing the most important samples to build an initial feature selection model, then improves the model's generalization ability by gradually involving other important samples until a robust and generalized feature selection model has been established or all samples have been used. Experimental results on eight real datasets show that the proposed method outperforms the comparison methods. (c) 2018 Elsevier B.V. All rights reserved.
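The self-paced scheme described in the abstract can be sketched in a few lines: alternately fit a model on the currently "easy" samples (those with loss below a pace parameter), then relax the pace so harder samples are gradually admitted, leaving outliers excluded. The sketch below is a minimal illustration only; it substitutes plain ridge regression for the paper's sparse (l2,1-regularized) feature selection model, and the parameter names (`lam`, `mu`, `ridge`) are hypothetical, not taken from the paper.

```python
import numpy as np

def self_paced_fit(X, y, lam=0.5, mu=1.5, ridge=1e-3, n_rounds=8):
    """Toy self-paced learning loop (hard sample weights).

    Alternates between (a) fitting a ridge model on the samples currently
    marked "easy" and (b) re-weighting: a sample is included once its
    per-sample loss drops below the pace parameter `lam`, which grows by
    factor `mu` each round so harder samples are admitted over time.
    """
    n, d = X.shape
    # Initial fit on all samples (stand-in for the paper's sparse model).
    w = np.linalg.solve(X.T @ X + ridge * np.eye(d), X.T @ y)
    v = np.ones(n)
    for _ in range(n_rounds):
        loss = (X @ w - y) ** 2           # per-sample squared loss
        v = (loss < lam).astype(float)    # hard SPL weights: easy samples get 1
        if v.sum() > 0:
            Xv = X * v[:, None]           # zero out the hard samples
            w = np.linalg.solve(Xv.T @ X + ridge * np.eye(d), Xv.T @ y)
        lam *= mu                         # relax the pace for the next round
    return w, v
```

On synthetic data with a few gross outliers, the outliers' losses stay far above the growing pace threshold, so their weights remain zero and the final model is fit only on the clean samples; this mirrors the abstract's claim that outliers get small or zero weights while reliable samples drive the model.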

