Article

Sparse least square support vector machine via coupled compressive pruning

Journal

NEUROCOMPUTING
Volume 131, Pages 77-86

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2013.10.038

Keywords

Least square support vector machine; Compressive pruning; Sparse representation; Random measurement

Funding

  1. National Basic Research Program of China (973 Program) [2013CB329402]
  2. National Science Foundation of China [61072108, 61072106, 61271290, 51207002, 61173090, 61272282]
  3. National Research Foundation for the Doctoral Program of Higher Education of China [9140A24070412DZ0101, 20120203110005, 20110203110006]
  4. Program for New Century Excellent Talents in University [NCET-10-0668]
  5. Fund for Foreign Scholars in University Research and Teaching Programs (the 111 Project) [B07048]
  6. Program for Cheung Kong Scholars and Innovative Research Team in University [IRT1170]
  7. New-Star of Science and Technology supported by Shaanxi Province [2013KJXX-63]


Among support vector machines, the Least Square Support Vector Machine (LSSVM) is computationally attractive because it reduces a set of inequality constraints to a system of linear equations. Several pruning algorithms have been developed to obtain a reduced set of support vectors and thereby improve the generalization performance of LSSVM. However, most of these algorithms select the support vectors iteratively, which incurs high computational complexity. In this paper, inspired by the recently developed compressive sampling theory, a one-step compressive pruning strategy is proposed to construct a sparse LSSVM without a notable loss of accuracy. It is a fast, universal, and information-preserving pruning approach that avoids the intensive computation of iterative retraining. Experiments on pattern recognition and function approximation compare the proposed method with existing pruning approaches; the results show the feasibility of the proposed method and its superiority over its counterparts. (C) 2013 Elsevier B.V. All rights reserved.
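The abstract's core idea, that LSSVM is trained by solving one linear system and can then be sparsified in a single pruning step rather than by iterative reselection, can be illustrated with a minimal sketch. This is not the paper's compressive-sampling algorithm: the one-shot selection below simply keeps the largest-magnitude dual coefficients and retrains once, and every name (`lssvm_train`, `one_step_prune`) and parameter choice here is an illustrative assumption.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared distances
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    """Solve the standard LSSVM dual linear system:
        [ 0   1^T       ] [b]     [0]
        [ 1   K + I/C   ] [alpha] = [y]
    One linear solve replaces the QP of a classical SVM."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def one_step_prune(X, y, keep, C=10.0, gamma=1.0):
    """One-shot pruning sketch (NOT the compressive strategy of the
    paper): train once on all data, keep the `keep` samples with the
    largest |alpha|, retrain once on that subset -- no iterative loop."""
    _, alpha = lssvm_train(X, y, C, gamma)
    idx = np.argsort(-np.abs(alpha))[:keep]
    b, a = lssvm_train(X[idx], y[idx], C, gamma)
    return idx, b, a

def lssvm_predict(Xq, Xsv, alpha, b, gamma=1.0):
    # Decision/regression function over the retained support vectors
    return rbf_kernel(Xq, Xsv, gamma) @ alpha + b
```

The contrast with iterative pruning is the point: classical schemes retrain after every small batch of removals, while a one-step scheme (as advocated in the abstract) trains twice in total, once on the full set and once on the retained subset.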
