Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 15, Issue 1, Pages 29-44
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2003.820830
Keywords
automatic relevance determination; Bayesian inference; Gaussian processes; model selection; nonquadratic loss function; support vector regression
Funding
- NATIONAL INSTITUTE OF GENERAL MEDICAL SCIENCES [P01GM063208] Funding Source: NIH RePORTER
- NIGMS NIH HHS [1 P01 GM63208] Funding Source: Medline
In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach retains the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation, while also gaining the advantages of Bayesian methods: model adaptation and error bars for its predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
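The soft insensitive loss named in the abstract is commonly described as interpolating between the epsilon-insensitive loss (flat near zero, linear in the tails) and a Huber-style quadratic transition in between. The sketch below is a minimal NumPy illustration of a loss with that shape, assuming the standard two-parameter form with an insensitivity width `eps` and a smoothing fraction `beta` (parameter names and values here are illustrative, not taken from the paper):

```python
import numpy as np

def soft_insensitive_loss(delta, eps=0.1, beta=0.3):
    """Piecewise loss on the residual delta:
    - zero inside the inner band |delta| < (1 - beta) * eps,
    - quadratic in the transition band up to (1 + beta) * eps,
    - linear (|delta| - eps) beyond it.
    0 < beta <= 1 controls the width of the quadratic transition."""
    a = np.abs(np.asarray(delta, dtype=float))
    lo = (1.0 - beta) * eps   # inner edge of the transition band
    hi = (1.0 + beta) * eps   # outer edge of the transition band
    quadratic = (a - lo) ** 2 / (4.0 * beta * eps)
    return np.where(a < lo, 0.0, np.where(a <= hi, quadratic, a - eps))
```

The quadratic piece is chosen so the loss and its first derivative are continuous at both band edges; that smoothness is what makes the corresponding likelihood convenient for the Bayesian treatment described in the abstract.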