Article

MCMC diagnostics for higher dimensions using Kullback Leibler divergence

Journal

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/00949655.2017.1335313

Keywords

Adaptive kernel density estimation; convergence diagnostics; Kullback Leibler divergence; MCMC; Monte Carlo


In the existing literature on MCMC diagnostics, we have identified two areas for improvement. First, the density-based diagnostic tools currently available are not equipped to assess the joint convergence of multiple variables. Second, if the target distribution is multi-modal and the MCMC sampler gets stuck in one of its modes, the current diagnostic tools may falsely detect convergence. Tool 1, proposed in this article, uses adaptive kernel density estimation, the symmetric Kullback Leibler divergence and a hypothesis-testing framework to assess the joint convergence of multiple variables. In cases where Tool 1 detects divergence of multiple chains started at distinct initial values, we propose a visualization tool that can help investigate the reasons behind their divergence. Tool 2, proposed in this article, makes novel use of the target distribution (known up to an unknown normalizing constant) to detect divergence when an MCMC sampler gets stuck in one of the modes of a multi-modal target distribution. The usefulness of the proposed tools is illustrated using a multi-modal distribution, a mixture of bivariate normal distributions and a Bayesian logit model example.
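The core quantity behind Tool 1 can be sketched as follows: estimate the joint density of each chain with a kernel density estimator and compare the estimates via a Monte Carlo approximation of the symmetric Kullback Leibler divergence. This is only an illustrative sketch, not the authors' procedure: it uses SciPy's fixed-bandwidth `gaussian_kde` as a stand-in for the adaptive kernel density estimation in the paper, and it omits the hypothesis-testing step.

```python
import numpy as np
from scipy.stats import gaussian_kde

def symmetric_kl(chain1, chain2):
    """Monte Carlo estimate of the symmetric KL divergence between the
    kernel density estimates of two MCMC chains.

    chain1, chain2 : arrays of shape (d, n) with variables in rows,
    matching scipy.stats.gaussian_kde's expected layout.
    """
    # Fixed-bandwidth KDE as a stand-in for the paper's adaptive KDE.
    p = gaussian_kde(chain1)
    q = gaussian_kde(chain2)
    # D(p||q) is estimated with chain1's own draws, D(q||p) with chain2's.
    d_pq = np.mean(np.log(p(chain1)) - np.log(q(chain1)))
    d_qp = np.mean(np.log(q(chain2)) - np.log(p(chain2)))
    return d_pq + d_qp

rng = np.random.default_rng(0)
# Two "chains" from the same bivariate normal: divergence near zero.
same = symmetric_kl(rng.normal(size=(2, 2000)), rng.normal(size=(2, 2000)))
# Chains exploring well-separated modes: divergence is large.
far = symmetric_kl(rng.normal(size=(2, 2000)),
                   rng.normal(loc=3.0, size=(2, 2000)))
```

Under this reading, `same` stays close to zero while `far` is large, which is the signal the diagnostic thresholds in a testing framework: chains drawn from the same stationary distribution yield small symmetric divergence, while chains stuck in different modes do not.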
