Article

Unsupervised feature selection based on joint spectral learning and general sparse regression

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 32, Issue 11, Pages 6581-6589

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-019-04117-9

Keywords

Spectral selection; General sparse regression; Unsupervised feature selection

Funding

  1. National Key R&D Program of China [2017YFC0820604]
  2. Anhui Provincial Natural Science Foundation [1808085QF188]
  3. National Natural Science Foundation of China [61702156, 61772171, 61876056]


Unsupervised feature selection is an important machine learning task, since manually annotated data are expensive to obtain and therefore scarce. However, because of noise and outliers in the data samples, selecting features without the discriminant information embedded in annotations is quite challenging. To relieve these limitations, we investigate embedding spectral learning into a general sparse regression framework for unsupervised feature selection. The proposed general spectral sparse regression (GSSR) method jointly handles outlier features by learning joint sparsity and noisy features by preserving the local structure of the data. Specifically, GSSR proceeds in two stages. First, classic sparse dictionary learning is used to build bases for the original data. The original data are then projected onto the basis space by learning a new representation via GSSR. In GSSR, a robust l2,r-norm (0 < r <= 2) reconstruction term and an l2,p-norm (0 < p <= 1) sparse regularization term are used simultaneously in place of the traditional Frobenius-norm least-squares loss. Furthermore, the local topological structure of the new representation is preserved by spectral learning through a Laplacian term. The overall objective function of GSSR is optimized, and the optimization is proved to converge. Experimental results on several public datasets demonstrate the validity of our algorithm, which outperforms state-of-the-art feature selection methods in terms of classification performance.
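The two ingredients named in the abstract are easy to state concretely: the l2,p regularizer sums row-wise l2 norms raised to a power p (p = 1 recovers the familiar l2,1 norm, and smaller p enforces stronger row sparsity), while the Laplacian term tr(Y^T L Y) penalizes representations that assign dissimilar rows to similar samples. The sketch below is illustrative only; function names and the toy matrices are ours, not from the paper, and the paper's full GSSR optimization is not reproduced here.

```python
import numpy as np

def l2p_norm(W, p):
    """Sum of row-wise l2 norms raised to the power p: sum_i ||w_i||_2^p.
    p = 1 gives the standard l2,1 norm; 0 < p < 1 promotes sparser rows."""
    row_norms = np.linalg.norm(W, axis=1)
    return np.sum(row_norms ** p)

def laplacian_smoothness(Y, S):
    """tr(Y^T L Y) with graph Laplacian L = D - S, where S is a symmetric
    similarity matrix; the value is small when similar samples (large S_ij)
    receive similar rows of the representation Y."""
    D = np.diag(S.sum(axis=1))
    L = D - S
    return np.trace(Y.T @ L @ Y)

# Toy weight matrix: rows with l2 norms 5, 0, and 1.
W = np.array([[3.0, 4.0], [0.0, 0.0], [1.0, 0.0]])
print(l2p_norm(W, 1.0))   # l2,1 norm: 5 + 0 + 1 = 6.0
print(l2p_norm(W, 0.5))   # sqrt(5) + 0 + 1, a smaller penalty on large rows

# Two samples connected with similarity 1: identical representations
# incur zero Laplacian penalty, opposite ones a positive penalty.
S = np.array([[0.0, 1.0], [1.0, 0.0]])
print(laplacian_smoothness(np.array([[1.0], [1.0]]), S))   # 0.0
print(laplacian_smoothness(np.array([[1.0], [-1.0]]), S))  # 4.0
```

In a GSSR-style objective these two pieces would appear as the regularizer and the spectral term alongside the l2,r reconstruction loss, with the trade-off controlled by scalar weights.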
