4.6 Article

Prior normalization for certified likelihood-informed subspace detection of Bayesian inverse problems

Related references

Note: only a subset of the references is listed.
Article Mathematics, Applied

Certified dimension reduction in nonlinear Bayesian inverse problems

Olivier Zahm et al.

Summary: This paper proposes a dimension reduction technique for Bayesian inverse problems with nonlinear forward operators, non-Gaussian priors, and non-Gaussian observation noise. The likelihood function is approximated by a ridge function, built by minimizing an upper bound on the Kullback-Leibler divergence between the posterior distribution and its approximation. The paper also provides an analysis that controls the error in the posterior approximation when this construction is carried out with samples.

MATHEMATICS OF COMPUTATION (2022)
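As a rough, hypothetical illustration of the subspace-detection idea underlying this and several of the entries below (not the paper's certified construction): informed directions are commonly taken as dominant eigenvectors of a diagnostic matrix H = E[∇log L(x) ∇log L(x)ᵀ], estimated by Monte Carlo over the prior. All names and the toy forward map are illustrative.

```python
import numpy as np

def informed_subspace(grad_log_lik, prior_samples, rank):
    """Estimate a rank-`rank` likelihood-informed subspace.

    grad_log_lik: callable returning the gradient of the log-likelihood
    at a parameter vector; prior_samples: (n, d) array of prior draws.
    """
    grads = np.array([grad_log_lik(x) for x in prior_samples])  # (n, d)
    H = grads.T @ grads / len(prior_samples)   # Monte Carlo estimate of H
    eigvals, eigvecs = np.linalg.eigh(H)       # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :rank], eigvals[::-1][:rank]

# Toy linear-Gaussian likelihood whose gradient varies in only two of five
# coordinates, so the diagnostic matrix is (numerically) rank 2.
rng = np.random.default_rng(0)
G = np.zeros((2, 5)); G[0, 0] = 1.0; G[1, 1] = 1.0  # forward map sees 2 coords
y = np.array([0.5, -0.3])                            # synthetic observations
grad = lambda x: G.T @ (y - G @ x)                   # Gaussian log-lik gradient
U, lam = informed_subspace(grad, rng.standard_normal((500, 5)), rank=2)
```

The returned columns of `U` span the two observed coordinates; the remaining three directions carry no likelihood information and can be left at their prior.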

Article Computer Science, Theory & Methods

Cauchy Markov random field priors for Bayesian inversion

Jarkko Suuronen et al.

Summary: This paper reviews and compares recently developed Cauchy difference priors, introduces new variants, and provides models and methods for one- and two-dimensional deconvolution problems. The use of Cauchy Markov random field priors can lead to posterior distributions that are non-Gaussian, high-dimensional, multimodal, and heavy-tailed.

STATISTICS AND COMPUTING (2022)
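A minimal sketch of the one-dimensional Cauchy difference prior idea, under the common parameterization in which increments x[i+1] - x[i] are i.i.d. Cauchy(0, γ); this favors piecewise-constant reconstructions while its heavy tails allow occasional large jumps. The function name and the scale γ are illustrative, not taken from the paper.

```python
import numpy as np

def cauchy_difference_logprior(x, gamma=0.1):
    """Unnormalized log-density of a Cauchy difference prior on the vector x.

    Each increment contributes -log(1 + ((x[i+1]-x[i]) / gamma)^2), i.e. a
    penalty that grows only logarithmically in the jump size.
    """
    d = np.diff(x)
    return -np.sum(np.log(1.0 + (d / gamma) ** 2))

# A single large jump is penalized far less than under a Gaussian prior,
# where the penalty would grow quadratically with the jump size.
smooth = cauchy_difference_logprior(np.linspace(0.0, 1.0, 50))
jumpy = cauchy_difference_logprior(np.concatenate([np.zeros(25), np.ones(25)]))
```

This logarithmic penalty is what makes the resulting posteriors edge-preserving but also potentially multimodal and heavy-tailed, as the summary above notes.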

Article Statistics & Probability

A unified performance analysis of likelihood-informed subspace methods

Tiangang Cui et al.

Summary: The likelihood-informed subspace (LIS) method reduces the dimensionality of high-dimensional probability distributions for Bayesian inference. This study establishes a unified framework for analyzing the accuracy of such dimension reduction techniques and their integration with sampling methods, demonstrating the effectiveness and applicability of the LIS method across a range of scenarios.

BERNOULLI (2022)

Article Mathematics, Applied

Nonlinear dimension reduction for surrogate modeling using gradient information

Daniele Bigoni et al.

Summary: This article introduces a method for nonlinear dimension reduction of high-dimensional functions: the function is approximated by the composition of a nonlinear feature map with a low-dimensional profile function. The feature map is constructed by aligning its Jacobian with the function's gradients, and the profile function is obtained by solving a gradient-enhanced least-squares problem. Experiments show that a nonlinear feature map can approximate the function more accurately than a linear one.

INFORMATION AND INFERENCE-A JOURNAL OF THE IMA (2022)

Article Mathematics, Applied

Data-free likelihood-informed dimension reduction of Bayesian inverse problems

Tiangang Cui et al.

Summary: Identifying a low-dimensional informed parameter subspace is a viable way to address the dimensionality challenge in sample-based solutions to large-scale Bayesian inverse problems. The introduced gradient-based method detects this low-dimensional structure offline, before the data are observed, while retaining control of the posterior approximation error. Sampling strategies are presented for drawing samples from the exact posterior distribution using the informed subspace.

INVERSE PROBLEMS (2021)
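As a hypothetical sketch of the "data-free" flavor described above (not the paper's exact construction): instead of gradients of the log-likelihood, which require observed data, one can average a Gauss-Newton-type Fisher information J(x)ᵀ Γ⁻¹ J(x) over the prior, since the forward-map Jacobian J and the noise covariance Γ are available before any data arrive. All names and the toy forward map are illustrative.

```python
import numpy as np

def data_free_diagnostic(jacobian, prior_samples, noise_cov):
    """Monte Carlo estimate of H = E_prior[J(x)^T Gamma^{-1} J(x)]."""
    Gamma_inv = np.linalg.inv(noise_cov)
    d = prior_samples.shape[1]
    H = np.zeros((d, d))
    for x in prior_samples:
        J = jacobian(x)                 # (m, d) Jacobian of the forward map
        H += J.T @ Gamma_inv @ J
    return H / len(prior_samples)

# Toy linear forward map: the diagnostic reduces exactly to G^T Gamma^{-1} G,
# independent of the prior samples, and no observed data enter the computation.
rng = np.random.default_rng(1)
G = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])  # observes 2 of 3 coordinates
H = data_free_diagnostic(lambda x: G, rng.standard_normal((100, 3)), np.eye(2))
```

The dominant eigenvectors of this `H` then play the same role as the likelihood-informed directions, but the whole computation can be done offline.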

Article Engineering, Multidisciplinary

Randomized residual-based error estimators for the proper generalized decomposition approximation of parametrized problems

Kathrin Smetana et al.

INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING (2020)

Article Mathematics, Applied

Cauchy difference priors for edge-preserving Bayesian inversion

Markku Markkanen et al.

JOURNAL OF INVERSE AND ILL-POSED PROBLEMS (2019)

Article Mathematics, Applied

Simple nonlinear models with rigorous extreme events and heavy tails

Andrew J. Majda et al.

NONLINEARITY (2019)

Article Multidisciplinary Sciences

Sampling can be faster than optimization

Yi-An Ma et al.

PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA (2019)

Article Computer Science, Interdisciplinary Applications

Scalable posterior approximations for large-scale Bayesian inverse problems via likelihood-informed parameter and state reduction

Tiangang Cui et al.

JOURNAL OF COMPUTATIONAL PHYSICS (2016)

Article Computer Science, Interdisciplinary Applications

Dimension-independent likelihood-informed MCMC

Tiangang Cui et al.

JOURNAL OF COMPUTATIONAL PHYSICS (2016)

Article Mathematics, Applied

Intermittency in turbulent diffusion models with a mean gradient

Andrew J. Majda et al.

NONLINEARITY (2015)

Article Mathematics, Applied

Likelihood-informed dimension reduction for nonlinear inverse problems

T. Cui et al.

INVERSE PROBLEMS (2014)

Article Mathematics, Applied

Besov priors for Bayesian inverse problems

Masoumeh Dashti et al.

INVERSE PROBLEMS AND IMAGING (2012)

Article Mathematics, Applied

A stochastic Newton MCMC method for large-scale statistical inverse problems with application to seismic inversion

James Martin et al.

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2012)

Review Statistics & Probability

Riemann manifold Langevin and Hamiltonian Monte Carlo methods

Mark Girolami et al.

JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY (2011)

Article Mathematics, Applied

Trace optimization and eigenproblems in dimension reduction methods

E. Kokiopoulou et al.

NUMERICAL LINEAR ALGEBRA WITH APPLICATIONS (2011)

Article Mathematics, Applied

Discretization-invariant Bayesian inversion and Besov space priors

Matti Lassas et al.

INVERSE PROBLEMS AND IMAGING (2009)

Article Engineering, Mechanical

An innovating analysis of the Nataf transformation from the copula viewpoint

Regis Lebrun et al.

PROBABILISTIC ENGINEERING MECHANICS (2009)

Article Statistics & Probability

An adaptive version for the Metropolis Adjusted Langevin algorithm with a truncated drift

Yves F. Atchade

METHODOLOGY AND COMPUTING IN APPLIED PROBABILITY (2006)

Article Statistics & Probability

Optimal scaling for various Metropolis-Hastings algorithms

G. O. Roberts et al.

STATISTICAL SCIENCE (2001)

Article Statistics & Probability

An adaptive Metropolis algorithm

H. Haario et al.

BERNOULLI (2001)