Journal
SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 44, Issue 4, Pages A1884-A1910
Publisher
SIAM PUBLICATIONS
DOI: 10.1137/21M1449051
Keywords
multilevel Monte Carlo; quasi-Monte Carlo; variational Bayes; intractable likelihood; nested simulation
Funding
- National Natural Science Foundation of China [720711119, 12071154]
- Guangdong Basic and Applied Basic Research Foundation [2021A1515010275]
- Guangzhou Science and Technology Program [202102020407]
This paper proposes a new variational Bayes algorithm based on multilevel Monte Carlo and randomized quasi-Monte Carlo sampling for inference problems with intractable likelihood functions. Compared with existing algorithms, it achieves better performance and a faster convergence rate.
Variational Bayes (VB) is a popular tool for Bayesian inference in statistical modeling. Recently, some VB algorithms were proposed to handle intractable likelihoods, with applications such as approximate Bayesian computation. In this paper, we propose several unbiased estimators based on multilevel Monte Carlo (MLMC) for the gradient of the Kullback-Leibler divergence between the posterior distribution and the variational distribution when the likelihood is intractable but can be estimated unbiasedly. The new VB algorithm differs from the VB algorithms in the literature, which usually render biased gradient estimators. Moreover, we incorporate randomized quasi-Monte Carlo (RQMC) sampling within the MLMC-based gradient estimators, which is known to provide a favorable rate of convergence in numerical integration. Theoretical guarantees for RQMC are provided in this new setting. Numerical experiments show that using RQMC in MLMC greatly speeds up the VB algorithm and finds a better parameter value than some existing competitors do.
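To make the MLMC-with-RQMC idea concrete, the following is a minimal toy sketch (not the authors' algorithm; all function names and the toy target are ours). It estimates theta = log E[X] for a log-normal X, a quantity that, like a log-likelihood built from an unbiased likelihood estimator, cannot be estimated unbiasedly from a fixed sample size. A randomized-level (single-term) MLMC estimator removes the bias in expectation, and each level uses scrambled Sobol' points as the RQMC sample:

```python
import numpy as np
from scipy.stats import norm, qmc

# Toy target: theta = log E[X] with X = exp(Z), Z ~ N(0,1), so theta = 0.5.
# log of a sample mean is biased for log E[X]; randomizing the MLMC level
# turns the telescoping sum of bias corrections into an unbiased estimator
# (up to a negligible truncation at max_level).

def rqmc_lognormal(n, rng):
    """n scrambled-Sobol' draws of X = exp(Z); n should be a power of 2."""
    u = qmc.Sobol(d=1, scramble=True, seed=rng).random(n).ravel()
    u = np.clip(u, 1e-12, 1 - 1e-12)  # guard the inverse CDF at 0/1
    return np.exp(norm.ppf(u))

def level_difference(l, rng):
    """Antithetic MLMC correction Y_l - Y_{l-1}, with Y_{-1} := 0.

    Level l uses 2**l RQMC points; the coarse term averages the two
    half-sample estimates, which sharply reduces the variance of the
    difference for smooth functionals like log."""
    if l == 0:
        return np.log(rqmc_lognormal(1, rng).mean())
    x = rqmc_lognormal(2 ** l, rng)
    fine = np.log(x.mean())
    half = 2 ** (l - 1)
    coarse = 0.5 * (np.log(x[:half].mean()) + np.log(x[half:].mean()))
    return fine - coarse

def single_term_estimate(rng, max_level=12):
    """Draw a random level L with geometric weights and return the
    importance-weighted correction, an (essentially) unbiased estimate."""
    p = 2.0 ** (-1.5 * np.arange(max_level + 1))
    p /= p.sum()
    L = rng.choice(max_level + 1, p=p)
    return level_difference(L, rng) / p[L]

rng = np.random.default_rng(7)
est = np.mean([single_term_estimate(rng) for _ in range(2000)])
```

Averaging many independent single-term estimates recovers theta = 0.5 without the systematic bias of a fixed-sample plug-in estimate; in the paper this construction is applied to gradients of the KL divergence rather than to a scalar toy target.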
Authors