Article

Semi-Supervised Feature Selection via Sparse Rescaled Linear Square Regression

Journal

IEEE Transactions on Knowledge and Data Engineering

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TKDE.2018.2879797

Keywords

Feature extraction; Computational complexity; Laplace equations; Knowledge discovery; Data engineering; Iterative methods; Adaptation models; Feature selection; semi-supervised feature selection; sparse feature selection; least square regression

Funding

  1. NSFC [61773268, U1636202, 61836005]
  2. Tencent Rhinoceros Birds - Scientific Research Foundation for Young Teachers of Shenzhen University


With the rapid increase in data size, there is a growing demand for methods that select features by exploiting both labeled and unlabeled data. In this paper, we propose a novel semi-supervised embedded feature selection method. The new method extends the least square regression model by rescaling the regression coefficients with a set of scale factors, which are used to evaluate the importance of features. An iterative algorithm is proposed to optimize the new model. It is proved that solving the new model is equivalent to solving a sparse model with a flexible and adaptable l_{2,p}-norm regularization. Moreover, the optimal solution of the scale factors provides a theoretical explanation for why the row norms {||w^1||_2, ..., ||w^d||_2} can be used to evaluate the importance of features. Experimental results on eight benchmark data sets show the superior performance of the proposed method.
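The abstract's feature-scoring idea can be illustrated with a short sketch. This is not the authors' iterative optimization algorithm; it only shows the final scoring step they justify: fit a (here ridge-regularized, as an assumption) least-squares regression matrix W with one row per feature, then rank features by the l2 norm of their rows, {||w^1||_2, ..., ||w^d||_2}. All variable names and sizes below are illustrative.

```python
import numpy as np

# Illustrative data: n samples, d features, c regression targets.
rng = np.random.default_rng(0)
n, d, c, k = 100, 20, 3, 5          # k = number of features to keep
X = rng.standard_normal((n, d))
Y = rng.standard_normal((n, c))

# Ridge-regularized least-squares solution W = (X^T X + lam*I)^{-1} X^T Y.
# (The paper's model rescales W with learned scale factors; a plain
# ridge fit stands in for it here.)
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)  # shape (d, c)

# Score feature i by the l2 norm of the i-th row of W, then keep the
# k highest-scoring features.
scores = np.linalg.norm(W, axis=1)           # {||w^1||_2, ..., ||w^d||_2}
selected = np.argsort(scores)[::-1][:k]
print(selected)
```

Features whose rows of W shrink toward zero contribute little to predicting Y, which is why the row norms serve as importance scores under the sparse l_{2,p}-norm view described in the abstract.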


Reviews

Primary Rating

4.7