Article

Feature selection for support vector machines with RBF kernel

Journal

ARTIFICIAL INTELLIGENCE REVIEW
Volume 36, Issue 2, Pages 99-115

Publisher

SPRINGER
DOI: 10.1007/s10462-011-9205-2

Keywords

Feature selection; RBF kernel; Information gain; SVM-RFE; Recursive Feature Elimination

Funding

  1. National Natural Science Foundation of China [60873196]
  2. Chinese Universities Scientific Fund [QN2009092]

Abstract

Linear-kernel Support Vector Machine Recursive Feature Elimination (SVM-RFE) is known as an excellent feature selection algorithm. A nonlinear SVM, however, is a black-box classifier: the mapping function Phi is not known explicitly, so the weight vector w cannot be computed directly. In this paper, we propose a feature selection algorithm for Support Vector Machines with the RBF kernel based on Recursive Feature Elimination (SVM-RBF-RFE). It expands the nonlinear RBF kernel into its Maclaurin series and computes the weight vector w from the series according to the contribution each feature makes to the classification hyperplane. Using w_i^2 as the ranking criterion, SVM-RBF-RFE starts with all features and eliminates the feature with the smallest squared weight at each step until all features are ranked. We use SVM and KNN classifiers to evaluate the nested subsets of features selected by SVM-RBF-RFE. Experimental results on 3 UCI and 3 microarray datasets show that SVM-RBF-RFE generally performs better than information gain and SVM-RFE.
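
The elimination loop described in the abstract is the standard RFE procedure with w_i^2 as the ranking criterion. The sketch below illustrates only that loop: for brevity it reads w directly from a linear SVM, whereas the paper derives w from the Maclaurin expansion of the RBF kernel, exp(-gamma*||x - z||^2) = sum_n (-gamma*||x - z||^2)^n / n!, which is not reproduced here. The function name svm_rfe_ranking and its parameters are illustrative and assume binary labels; they are not taken from the paper.

    import numpy as np
    from sklearn.svm import SVC

    def svm_rfe_ranking(X, y, C=1.0):
        """Rank features by recursive elimination: at each step, drop the feature
        with the smallest squared weight w_i^2 (binary classification assumed)."""
        remaining = list(range(X.shape[1]))   # indices of features still in play
        elimination_order = []                # features, least important first
        while remaining:
            clf = SVC(kernel="linear", C=C)
            clf.fit(X[:, remaining], y)
            w = clf.coef_.ravel()             # weight vector of the separating hyperplane
            worst = int(np.argmin(w ** 2))    # feature contributing least to the hyperplane
            elimination_order.append(remaining.pop(worst))
        return elimination_order[::-1]        # reversed: most important feature first

Evaluating nested subsets, as the paper does with SVM and KNN classifiers, then amounts to training a classifier on the top-k features of this ranking for increasing k.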
