Article

Optimization-Based Markov Chain Monte Carlo Methods for Nonlinear Hierarchical Statistical Inverse Problems

Journal

Publisher

SIAM Publications
DOI: 10.1137/20M1318365

Keywords

inverse problems; hierarchical Bayes; Markov chain Monte Carlo; pseudomarginalization; Poisson likelihood; positron emission tomography

Funding

  1. Gordon Preston Fellowship by Monash University
  2. Australian Research Council [CE140100049]


This work introduces scalable optimization-based Markov chain Monte Carlo (MCMC) methods for solving hierarchical Bayesian inverse problems with nonlinear parameter-to-observable maps and a broad class of hyperparameters. The algorithms build on the recently developed scalable randomize-then-optimize (RTO) method, integrating RTO into Metropolis-within-Gibbs updates or pseudomarginal MCMC for efficient sampling in hierarchical Bayesian inversion. The combination of RTO and pseudomarginal MCMC yields sampling performance that is robust to the model parameter dimension, as demonstrated in numerical examples of PDE-constrained inverse problems and positron emission tomography.
In many hierarchical inverse problems, not only do we want to estimate high- or infinite-dimensional model parameters in the parameter-to-observable maps, but we also have to estimate hyperparameters that represent critical assumptions in the statistical and mathematical modeling processes. As a joint effect of high dimensionality, nonlinear dependence, and nonconcave structures in the joint posterior distribution over model parameters and hyperparameters, solving inverse problems in the hierarchical Bayesian setting poses a significant computational challenge. In this work, we develop scalable optimization-based Markov chain Monte Carlo (MCMC) methods for solving hierarchical Bayesian inverse problems with nonlinear parameter-to-observable maps and a broader class of hyperparameters. Our algorithmic development is based on the recently developed scalable randomize-then-optimize (RTO) method [J. M. Bardsley et al., SIAM J. Sci. Comput., 42 (2016), pp. A1317-A1347] for exploring the high- or infinite-dimensional parameter space. We first extend the RTO machinery to the Poisson likelihood and discuss the implementation of RTO in the hierarchical setting. Then, by using RTO either as a proposal distribution in a Metropolis-within-Gibbs update or as a biasing distribution in the pseudomarginal MCMC [C. Andrieu and G. O. Roberts, Ann. Statist., 37 (2009), pp. 697-725], we present efficient sampling tools for hierarchical Bayesian inversion. In particular, the integration of RTO and the pseudomarginal MCMC has sampling performance robust to the model parameter dimension. Numerical examples in PDE-constrained inverse problems and positron emission tomography are used to demonstrate the performance of our methods.
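The pseudomarginal idea described in the abstract can be illustrated on a toy one-dimensional problem: the intractable marginal likelihood over the model parameter is replaced by an unbiased importance-sampling estimate, and the Metropolis-Hastings chain over the hyperparameter stores and reuses the estimate attached to its current state. Everything in the sketch below is an illustrative assumption, not the paper's construction: a scalar Gaussian model, a Gamma hyperprior, and the prior used as the biasing distribution in place of RTO.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hierarchical model (hypothetical): data y ~ N(x, 1),
# model parameter x ~ N(0, 1/theta), precision hyperparameter theta.
y = 1.5

def log_hyperprior(theta):
    # Gamma(2, 1) prior on theta (an assumption for this sketch)
    return np.log(theta) - theta if theta > 0 else -np.inf

def estimate_log_marginal(theta, n_samples=64):
    """Unbiased importance-sampling estimate of p(y | theta).

    The biasing distribution here is simply the prior N(0, 1/theta);
    in the paper this role is played by the RTO distribution.
    """
    x = rng.normal(0.0, 1.0 / np.sqrt(theta), size=n_samples)
    log_lik = -0.5 * (y - x) ** 2 - 0.5 * np.log(2 * np.pi)
    m = log_lik.max()  # stabilized log-mean-exp of the weights
    return m + np.log(np.mean(np.exp(log_lik - m)))

# Pseudomarginal Metropolis-Hastings over theta: the key point is that
# the noisy estimate log_p_hat for the current state is kept fixed until
# a proposal is accepted, which preserves the exact marginal posterior.
theta, log_p_hat = 1.0, estimate_log_marginal(1.0)
samples = []
for _ in range(2000):
    theta_new = theta + 0.3 * rng.normal()
    if theta_new > 0:
        log_p_new = estimate_log_marginal(theta_new)
        log_alpha = (log_p_new + log_hyperprior(theta_new)
                     - log_p_hat - log_hyperprior(theta))
        if np.log(rng.uniform()) < log_alpha:
            theta, log_p_hat = theta_new, log_p_new  # accept; reuse estimate
    samples.append(theta)

print(np.mean(samples[500:]))
```

The same skeleton applies in the paper's setting, except that the model parameter lives in a high- or infinite-dimensional space and the biasing distribution is the RTO distribution, whose importance weights are computable; that substitution is what makes the sampler's performance robust to the parameter dimension.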

Authors


