Article

Experimentally optimal nu in support vector regression for different noise models and parameter settings

Journal

NEURAL NETWORKS
Volume 17, Issue 1, Pages 127-141

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/S0893-6080(03)00209-0

Keywords

support vector machines; nu-support vector machines; support vector regression; support vector machine parameters; optimal nu; Gaussian kernel; model selection; risk minimization


In Support Vector (SV) regression, a parameter nu controls the number of Support Vectors and the number of points that come to lie outside of the so-called epsilon-insensitive tube. For various noise models and SV parameter settings, we experimentally determine the values of nu that lead to the lowest generalization error. We find good agreement with the values that had previously been predicted by a theoretical argument based on the asymptotic efficiency of a simplified model of SV regression. As a side effect of the experiments, valuable information about the generalization behavior of the remaining SVM parameters and their dependencies is gained. The experimental findings are valid even for complex 'real-world' data sets. Based on our results on the role of the nu-SVM parameters, we discuss various model selection methods. (C) 2003 Published by Elsevier Ltd.
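As a rough illustration of the parameter role the abstract describes (this sketch is not from the paper; it uses scikit-learn's NuSVR with a Gaussian/RBF kernel and synthetic noisy data chosen here for demonstration), nu can be seen to control the fraction of training points that become support vectors:

# Illustrative sketch only: how nu relates to the fraction of support vectors
# in nu-SVR with a Gaussian (RBF) kernel. Data and parameter values are
# assumptions for demonstration, not the paper's experimental setup.
import numpy as np
from sklearn.svm import NuSVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X).ravel() + rng.normal(scale=0.1, size=200)  # noisy sinc target

for nu in (0.1, 0.3, 0.6):
    model = NuSVR(nu=nu, C=1.0, kernel="rbf", gamma=1.0).fit(X, y)
    frac_sv = len(model.support_) / len(X)
    print(f"nu={nu:.1f}  fraction of support vectors={frac_sv:.2f}")

# Asymptotically, nu is a lower bound on the fraction of support vectors and
# an upper bound on the fraction of points lying outside the epsilon-tube.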
