Article

Bayesian methods for support vector machines: Evidence and predictive class probabilities

Journal

MACHINE LEARNING
Volume 46, Issue 1-3, Pages 21-52

Publisher

SPRINGER
DOI: 10.1023/A:1012489924661

Keywords

Support vector machines; Gaussian processes; Bayesian inference; evidence; hyperparameter tuning; probabilistic predictions


I describe a framework for interpreting Support Vector Machines (SVMs) as maximum a posteriori (MAP) solutions to inference problems with Gaussian Process priors. This probabilistic interpretation can provide intuitive guidelines for choosing a 'good' SVM kernel. Beyond this, it allows Bayesian methods to be used for tackling two of the outstanding challenges in SVM classification: how to tune hyperparameters (the misclassification penalty C, and any parameters specifying the kernel), and how to obtain predictive class probabilities rather than the conventional deterministic class label predictions. Hyperparameters can be set by maximizing the evidence; I explain how the latter can be defined and properly normalized. Both analytical approximations and numerical methods (Monte Carlo chaining) for estimating the evidence are discussed. I also compare different methods of estimating class probabilities, ranging from simple evaluation at the MAP or at the posterior average to full averaging over the posterior. A simple toy application illustrates the various concepts and techniques.
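The abstract's starting point, the SVM training problem as a MAP estimate under a Gaussian process prior, can be illustrated numerically. The sketch below is my own toy construction, not the paper's code: it minimizes the penalized hinge objective C·Σᵢ hinge(yᵢfᵢ) + ½ fᵀK⁻¹f over latent function values f, parametrized as f = Kα to avoid inverting K, using plain subgradient descent on synthetic 1-D data. All function names, data, and parameter values (C, the RBF length scale, the learning rate) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=1.0):
    """Squared-exponential (RBF) kernel matrix, a standard GP prior covariance."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * sq_dists / length_scale ** 2)

def fit_svm_map(X, y, C=5.0, length_scale=1.0, lr=0.01, steps=2000):
    """Subgradient descent on the SVM-as-MAP objective
         C * sum_i hinge(y_i f_i) + 0.5 * f^T K^{-1} f,
    with f = K @ alpha so that no explicit inverse of K is needed."""
    K = rbf_kernel(X, X, length_scale)
    alpha = np.zeros(len(X))
    for _ in range(steps):
        f = K @ alpha
        violated = (y * f) < 1.0  # points inside or on the wrong side of the margin
        # Subgradient w.r.t. alpha of the objective above:
        alpha -= lr * (K @ (alpha - C * y * violated))
    return alpha, K

# Illustrative 1-D toy data: two well-separated clusters, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.3, (10, 1)),
               rng.normal(+2.0, 0.3, (10, 1))])
y = np.concatenate([-np.ones(10), np.ones(10)])

alpha, K = fit_svm_map(X, y)
f_train = K @ alpha
train_acc = np.mean(np.sign(f_train) == y)
print(f"training accuracy: {train_acc:.2f}")
```

Note that the MAP solution alone yields only deterministic labels sign(f), and treats C and the length scale as given. The paper's contribution is precisely what this sketch omits: setting those hyperparameters by maximizing a properly normalized evidence, and averaging over the posterior (rather than plugging in the MAP point) to obtain predictive class probabilities.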

