Article

The Variational Gaussian Approximation Revisited

Journal

NEURAL COMPUTATION
Volume 21, Issue 3, Pages 786-792

Publisher

MIT PRESS
DOI: 10.1162/neco.2008.08-07-592

Funding

  1. Engineering and Physical Sciences Research Council (EPSRC) [EP/C005740/1]

Abstract

The variational approximation of posterior distributions by multivariate gaussians has been much less popular in the machine learning community compared to the corresponding approximation by factorizing distributions. This is for a good reason: the gaussian approximation is in general plagued by an O(N^2) number of variational parameters to be optimized, N being the number of random variables. In this letter, we discuss the relationship between the Laplace and the variational approximation, and we show that for models with gaussian priors and factorizing likelihoods, the number of variational parameters is actually O(N). The approach is applied to gaussian process regression with nongaussian likelihoods.
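The abstract's central claim — that with a gaussian prior N(0, K) and a factorizing likelihood, the optimal variational gaussian has only O(N) free parameters, because its covariance takes the restricted form Sigma = (K^-1 + diag(lambda))^-1 with lambda diagonal — can be illustrated numerically. The NumPy sketch below is not the authors' code: it fits a variational gaussian to a toy gaussian-process classification posterior with a logistic likelihood, optimizing the 2N parameters (mean m and diagonal lambda) by finite-difference gradient ascent on the ELBO. The toy data, kernel, and all variable names are assumptions made for illustration.

```python
import numpy as np

# Toy GP classification: gaussian prior f ~ N(0, K), factorizing
# logistic likelihood p(y_i | f_i) = sigmoid(y_i * f_i).
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])
N = len(x)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2) + 1e-6 * np.eye(N)
Kinv = np.linalg.inv(K)

# Gauss-Hermite nodes/weights for the 1-D expected log-likelihood integrals.
gh_x, gh_w = np.polynomial.hermite.hermgauss(20)

def elbo(params):
    """ELBO with the covariance parameterized as (K^-1 + diag(lam))^-1,
    so the free parameters are just m (N) and log lam (N): O(N) in total."""
    m, lam = params[:N], np.exp(params[N:])
    Sigma = np.linalg.inv(Kinv + np.diag(lam))
    # KL(q || prior) between two gaussians, in closed form.
    kl = 0.5 * (np.trace(Kinv @ Sigma) + m @ Kinv @ m - N
                + np.linalg.slogdet(K)[1] - np.linalg.slogdet(Sigma)[1])
    # Expected log-likelihood: one 1-D quadrature per data point,
    # since the likelihood factorizes over i.
    s = np.sqrt(np.diag(Sigma))
    ell = 0.0
    for i in range(N):
        f = m[i] + np.sqrt(2.0) * s[i] * gh_x          # f_i ~ N(m_i, Sigma_ii)
        ell += gh_w @ (-np.logaddexp(0.0, -y[i] * f)) / np.sqrt(np.pi)
    return ell - kl

# Crude finite-difference gradient ascent (illustration only; the letter
# derives exact gradients and fixed-point updates instead).
params = np.zeros(2 * N)               # m = 0, lam = 1
eps, lr = 1e-5, 0.1
history = [elbo(params)]
for _ in range(200):
    g = np.zeros_like(params)
    for j in range(len(params)):
        d = np.zeros_like(params)
        d[j] = eps
        g[j] = (elbo(params + d) - elbo(params - d)) / (2 * eps)
    params += lr * g
    history.append(elbo(params))

print(f"free variational parameters: {len(params)} (= 2N, i.e. O(N))")
print(f"ELBO: {history[0]:.3f} -> {history[-1]:.3f}")
```

Note that an unrestricted gaussian would need N(N+1)/2 covariance entries; the restricted form above is what reduces the count to O(N), and the sketch optimizes exactly those 2N numbers.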
