Article

Probabilistic programming with stochastic variational message passing

Journal

INTERNATIONAL JOURNAL OF APPROXIMATE REASONING
Volume 148, Pages 235-252

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ijar.2022.06.006

Keywords

Factor graphs; Message passing; Natural gradient descent; Probabilistic programming; Variational inference

Funding

  1. GN Hearing
  2. Dutch Ministry of Economic Affairs


Stochastic approximation methods for variational inference have recently gained popularity in the probabilistic programming community since these methods are amenable to automation and allow online, scalable, and universal approximate Bayesian inference. Unfortunately, common Probabilistic Programming Languages (PPLs) with stochastic approximation engines lack the efficiency of message passing-based inference algorithms with deterministic update rules such as Belief Propagation (BP) and Variational Message Passing (VMP). Still, Stochastic Variational Inference (SVI) and Conjugate-Computation Variational Inference (CVI) provide principled methods to integrate fast deterministic inference techniques with broadly applicable stochastic approximate inference. Unfortunately, implementation of SVI and CVI necessitates manually derived variational update rules, which do not yet exist in most PPLs. In this paper, we cast SVI and CVI explicitly in a message passing-based inference context. We provide an implementation for SVI and CVI in ForneyLab, an automated message passing-based probabilistic programming package in the open source Julia language. Through a number of experiments, we demonstrate how SVI and CVI extend the automated inference capabilities of message passing-based probabilistic programming. (C) 2022 The Author(s). Published by Elsevier Inc.
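To make the flavor of the SVI scheme referenced in the abstract concrete, the following is a minimal, hedged sketch of the classic SVI natural-gradient update for a conjugate model (here: inferring a Gaussian mean with known noise variance). It is an illustration of the general technique, not ForneyLab code or the paper's implementation; all names and the step-size schedule are choices made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N observations from N(mu_true, sigma^2), sigma known.
mu_true, sigma, N = 2.0, 1.0, 1000
data = rng.normal(mu_true, sigma, N)

# Prior mu ~ N(m0, s0^2), expressed in natural parameters (eta1, eta2).
m0, s0 = 0.0, 10.0
prior_nat = np.array([m0 / s0**2, -0.5 / s0**2])

# Initialise the variational natural parameters at the prior.
lam = prior_nat.copy()

batch_size = 10
for t in range(1, 201):
    rho = (t + 1.0) ** -0.7  # Robbins-Monro decaying step size
    batch = rng.choice(data, batch_size, replace=False)
    # Intermediate estimate: prior plus minibatch sufficient statistics,
    # rescaled as if the minibatch represented the full dataset.
    stats = np.array([batch.sum() / sigma**2,
                      -0.5 * batch_size / sigma**2])
    lam_hat = prior_nat + (N / batch_size) * stats
    # Natural-gradient step is a convex combination in natural-parameter space.
    lam = (1.0 - rho) * lam + rho * lam_hat

# Recover the mean/variance parameterisation of q(mu).
post_var = -0.5 / lam[1]
post_mean = lam[0] * post_var
```

Because the model is conjugate, the noisy natural-gradient step reduces to this convex combination of natural parameters, which is the efficiency that conjugate-computation methods exploit; here `post_mean` converges near `mu_true` with posterior variance close to `sigma**2 / N`.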

