Article

MAXIMUM CONDITIONAL ENTROPY HAMILTONIAN MONTE CARLO SAMPLER

Journal

SIAM JOURNAL ON SCIENTIFIC COMPUTING
Volume 43, Issue 5, Pages A3607-A3626

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/20M1341192

Keywords

Hamiltonian Monte Carlo; Kolmogorov--Sinai entropy; Markov chain Monte Carlo

Funding

  1. NSFC [11771289]


The paper introduces a design criterion based on the Kolmogorov--Sinai entropy (KSE) for optimizing the algorithm parameters of the HMC sampler, especially when the mass matrix is adapted. Analytical derivations of the optimal algorithm parameters for near-Gaussian distributions are provided, as well as a theoretical justification for adapting the mass matrix in the HMC sampler. An adaptive HMC algorithm is proposed, and its performance is demonstrated with numerical examples.
The performance of a Hamiltonian Monte Carlo (HMC) sampler depends critically on some algorithm parameters, such as the total integration time and the numerical integration stepsize. The parameter tuning is particularly challenging when the mass matrix of the HMC sampler is adapted. We propose in this work a Kolmogorov--Sinai entropy (KSE)--based design criterion to optimize these algorithm parameters, which can avoid some potential issues in the often-used jumping-distance--based measures. For near-Gaussian distributions, we are able to derive the optimal algorithm parameters with respect to the KSE criterion analytically. As a by-product, the KSE criterion also provides a theoretical justification for the need to adapt the mass matrix in the HMC sampler. Based on the results, we propose an adaptive HMC algorithm, and we then demonstrate the performance of the proposed algorithm with numerical examples.
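To make concrete which quantities the abstract refers to, the following is a minimal, generic HMC transition written in Python; it is a sketch of standard HMC, not the paper's KSE-based adaptive algorithm. The stepsize eps, the number of leapfrog steps n_steps (so the total integration time is eps * n_steps), and the mass matrix M are the algorithm parameters the KSE criterion is designed to tune. All function and variable names below are illustrative assumptions.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, eps, n_steps, M, rng):
    """One generic HMC transition (leapfrog integration + Metropolis correction).

    Tunable quantities highlighted in the abstract:
      eps     : leapfrog stepsize
      n_steps : number of leapfrog steps, so total integration time T = eps * n_steps
      M       : mass matrix of the auxiliary momentum (assumed symmetric positive definite)
    """
    M_inv = np.linalg.inv(M)
    # Sample auxiliary momentum p ~ N(0, M).
    p = rng.multivariate_normal(np.zeros(q.shape[0]), M)

    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new = p_new + 0.5 * eps * grad_log_prob(q_new)
    for _ in range(n_steps - 1):
        q_new = q_new + eps * (M_inv @ p_new)
        p_new = p_new + eps * grad_log_prob(q_new)
    q_new = q_new + eps * (M_inv @ p_new)
    p_new = p_new + 0.5 * eps * grad_log_prob(q_new)

    # Metropolis accept/reject with H(q, p) = -log pi(q) + 0.5 p^T M^{-1} p.
    def hamiltonian(q_, p_):
        return -log_prob(q_) + 0.5 * p_ @ (M_inv @ p_)

    log_accept = hamiltonian(q, p) - hamiltonian(q_new, p_new)
    if np.log(rng.uniform()) < log_accept:
        return q_new, True
    return q, False


if __name__ == "__main__":
    # Toy usage on a standard 2D Gaussian target.
    rng = np.random.default_rng(0)
    log_prob = lambda q: -0.5 * q @ q
    grad_log_prob = lambda q: -q
    q = np.zeros(2)
    for _ in range(100):
        q, _ = hmc_step(q, log_prob, grad_log_prob,
                        eps=0.1, n_steps=10, M=np.eye(2), rng=rng)
```

The paper's contribution is precisely a principled, KSE-based rule for choosing eps, n_steps, and M (and for adapting M on the fly), in place of ad hoc tuning of these three inputs.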

