4.1 Article

Bayesian support vector regression using a unified loss function

Journal

IEEE Transactions on Neural Networks
Volume 15, Issue 1, Pages 29-44

Publisher

IEEE - Institute of Electrical and Electronics Engineers, Inc.
DOI: 10.1109/TNN.2003.820830

Keywords

automatic relevance determination; Bayesian inference; Gaussian processes; model selection; nonquadratic loss function; support vector regression

Funding

  1. NATIONAL INSTITUTE OF GENERAL MEDICAL SCIENCES [P01GM063208] Funding Source: NIH RePORTER
  2. NIGMS NIH HHS [1 P01 GM63208] Funding Source: Medline

Abstract

In this paper, we use a unified loss function, called the soft insensitive loss function, for Bayesian support vector regression. We follow standard Gaussian processes for regression to set up the Bayesian framework, in which the unified loss function is used in the likelihood evaluation. Under this framework, the maximum a posteriori estimate of the function values corresponds to the solution of an extended support vector regression problem. The overall approach retains the merits of support vector regression, such as convex quadratic programming and sparsity in the solution representation, while also offering the advantages of Bayesian methods, namely model adaptation and error bars on its predictions. Experimental results on simulated and real-world data sets indicate that the approach works well even on large data sets.
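As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below shows a soft-insensitive-style loss, which is flat inside a reduced insensitive zone, quadratic in a transition band, and linear outside, together with the corresponding MAP objective combining a Gaussian process prior term with a loss-based data term. The exact parameterization and scaling used in the paper may differ; the names soft_insensitive_loss, map_objective, C, eps, and beta are illustrative assumptions.

    import numpy as np

    def soft_insensitive_loss(delta, eps=0.1, beta=0.3):
        """Soft-insensitive-style loss on residuals delta.
        eps is the insensitivity width; beta in (0, 1] controls the
        width of the smooth quadratic transition band."""
        a = np.abs(delta)
        lo, hi = (1 - beta) * eps, (1 + beta) * eps
        return np.where(
            a <= lo,
            0.0,                                   # flat insensitive zone
            np.where(
                a <= hi,
                (a - lo) ** 2 / (4 * beta * eps),  # smooth quadratic transition
                a - eps,                           # linear tail, as in standard SVR
            ),
        )

    def map_objective(f, y, K_inv, C=1.0, eps=0.1, beta=0.3):
        """Negative log-posterior (up to constants) for latent function
        values f under a zero-mean GP prior with inverse kernel matrix
        K_inv and a likelihood built from the soft-insensitive loss."""
        prior_term = 0.5 * f @ K_inv @ f
        data_term = C * np.sum(soft_insensitive_loss(y - f, eps, beta))
        return prior_term + data_term

Minimizing such an objective over f is what makes the MAP estimate resemble an extended support vector regression problem: the linear tails of the loss yield sparsity, while the GP prior supplies the regularization and predictive error bars.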

