Article

Experimentally optimal nu in support vector regression for different noise models and parameter settings

Journal

NEURAL NETWORKS
Volume 17, Issue 1, Pages 127-141

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/S0893-6080(03)00209-0

Keywords

support vector machines; nu-support vector machines; support vector regression; support vector machine parameters; optimal nu; Gaussian kernel; model selection; risk minimization

Abstract

In Support Vector (SV) regression, a parameter nu controls the number of Support Vectors and the number of points that come to lie outside of the so-called epsilon-insensitive tube. For various noise models and SV parameter settings, we experimentally determine the values of nu that lead to the lowest generalization error. We find good agreement with the values that had previously been predicted by a theoretical argument based on the asymptotic efficiency of a simplified model of SV regression. As a side effect of the experiments, valuable information about the generalization behavior of the remaining SVM parameters and their dependencies is gained. The experimental findings are valid even for complex 'real-world' data sets. Based on our results on the role of the nu-SVM parameters, we discuss various model selection methods. (C) 2003 Published by Elsevier Ltd.
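The abstract notes that nu simultaneously controls the number of Support Vectors and the number of points falling outside the epsilon-insensitive tube. As a minimal, hypothetical sketch (not the authors' experimental code), the following assumes scikit-learn's NuSVR with a Gaussian (RBF) kernel on a toy sinc target with additive Gaussian noise, one kind of noise model of the sort studied in the paper; it reports the fraction of support vectors and a test error for several values of nu.

```python
# Sketch: how nu bounds the fraction of support vectors in nu-SVR,
# assuming scikit-learn's NuSVR (illustrative only, not the paper's setup).
import numpy as np
from sklearn.svm import NuSVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Toy regression task: sinc target with additive Gaussian noise.
X_train = rng.uniform(-3, 3, size=(200, 1))
y_train = np.sinc(X_train).ravel() + rng.normal(scale=0.1, size=200)
X_test = rng.uniform(-3, 3, size=(1000, 1))
y_test = np.sinc(X_test).ravel()  # noise-free targets approximate generalization error

for nu in (0.1, 0.3, 0.5, 0.7, 0.9):
    # C and gamma are illustrative values, not tuned settings from the paper.
    model = NuSVR(nu=nu, C=10.0, kernel="rbf", gamma=1.0).fit(X_train, y_train)
    frac_sv = len(model.support_) / len(X_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"nu={nu:.1f}  fraction of SVs={frac_sv:.2f}  test MSE={mse:.4f}")
```

In such a sweep, the fraction of support vectors tracks nu from below, and the test error is typically minimized at an intermediate nu that depends on the noise level, which is the dependence the paper investigates experimentally.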
