Article

Monotonic Alpha-divergence Minimisation for Variational Inference

Journal

Journal of Machine Learning Research

Publisher

MICROTOME PUBL

Keywords

Variational Inference; Kullback-Leibler; Alpha-Divergence; Mixture Models; Bayesian Inference

Abstract

In this paper, we introduce a novel family of iterative algorithms which carry out alpha-divergence minimisation in a Variational Inference context. They do so by ensuring a systematic decrease at each step in the alpha-divergence between the variational and the posterior distributions. In its most general form, the variational distribution is a mixture model and our framework allows us to simultaneously optimise the weights and component parameters of this mixture model. Our approach permits us to build on various methods previously proposed for alpha-divergence minimisation, such as Gradient or Power Descent schemes, and we also shed new light on an integrated Expectation Maximization algorithm. Lastly, we provide empirical evidence that our methodology yields improved results on several multimodal target distributions and on a real data example.
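To make the setting concrete, below is a minimal, self-contained sketch of a power-descent-style multiplicative update on the mixture weights, one flavour of the alpha-divergence schemes the abstract alludes to. Everything in it is an assumption made for illustration: the 1-D bimodal target, the fixed unit-variance Gaussian components, and the settings alpha = 0.5 and eta = 0.3 are toy choices, and the sketch is not the paper's exact algorithm (which, in its general form, also updates the component parameters alongside the weights).

```python
# Illustrative sketch (assumed settings, not the paper's exact algorithm):
# a power-descent-style multiplicative update of mixture weights aimed at
# reducing the alpha-divergence D_alpha(q_lambda || p) for a toy 1-D target.
import numpy as np

rng = np.random.default_rng(0)
alpha, eta, n_iters, n_mc = 0.5, 0.3, 20, 50_000  # assumed hyperparameters

def normal_pdf(y, mu, sigma=1.0):
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Toy bimodal target density: an equal mixture of N(-3, 1) and N(3, 1).
def p(y):
    return 0.5 * normal_pdf(y, -3.0) + 0.5 * normal_pdf(y, 3.0)

# Fixed unit-variance Gaussian components of the variational mixture q_lambda.
means = np.array([-4.0, -1.0, 1.0, 4.0])

def q_mix(y, lam):
    return sum(l * normal_pdf(y, m) for l, m in zip(lam, means))

def alpha_divergence(lam):
    # Monte Carlo estimate of D_alpha(q_lambda || p)
    # = (E_{q_lambda}[(q_lambda/p)^(alpha-1)] - 1) / (alpha * (alpha - 1)).
    idx = rng.choice(len(means), size=n_mc, p=lam)
    y = rng.normal(means[idx], 1.0)
    ratio = q_mix(y, lam) / p(y)
    return (np.mean(ratio ** (alpha - 1.0)) - 1.0) / (alpha * (alpha - 1.0))

lam = np.full(len(means), 1.0 / len(means))  # uniform initial weights
print("initial alpha-divergence:", round(alpha_divergence(lam), 4))

for t in range(n_iters):
    # For each component j, estimate E_{y ~ q_j}[(q_lambda(y)/p(y))^(alpha-1)],
    # then take a multiplicative step of size eta on the weights.
    stats = np.empty(len(means))
    for j, m in enumerate(means):
        y = rng.normal(m, 1.0, size=n_mc)
        stats[j] = np.mean((q_mix(y, lam) / p(y)) ** (alpha - 1.0))
    lam = lam * stats ** (eta / (1.0 - alpha))
    lam /= lam.sum()  # renormalise onto the probability simplex

print("final weights:", np.round(lam, 3))
print("final alpha-divergence:", round(alpha_divergence(lam), 4))
```

Running this sketch shifts weight toward the components closest to the two target modes and drives the Monte Carlo divergence estimate down across iterations; monotone-decrease guarantees of the kind the abstract claims would apply to the exact (non-Monte-Carlo) updates under suitable conditions on alpha and the step size, not to this noisy toy version.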
