Article

Kullback-Leibler divergence for Bayesian nonparametric model checking

Journal

JOURNAL OF THE KOREAN STATISTICAL SOCIETY
Volume 50, Issue 1, Pages 272-289

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s42952-020-00072-7

Keywords

Bayesian Non-parametric; Dirichlet process; Kullback-Leibler divergence; Model checking; Relative belief ratio


Bayesian nonparametric statistics is an area of considerable research interest. While there has recently been extensive work on developing Bayesian nonparametric procedures for model checking, using the Dirichlet process in its simplest form together with the Kullback-Leibler divergence has remained an open problem. This is mainly due to the discreteness of the Dirichlet process: the Kullback-Leibler divergence between any discrete distribution and any continuous distribution is infinite. The approach proposed in this paper, which combines the Dirichlet process, the Kullback-Leibler divergence, and the relative belief ratio, is the first concrete solution to this issue. The approach is simple to apply and does not require a closed form of the relative belief ratio. A Monte Carlo study and real data examples show that it exhibits excellent performance.
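The difficulty the abstract points to can be made concrete with a short sketch. The snippet below is illustrative only and is not the authors' algorithm: it draws a truncated stick-breaking approximation to a Dirichlet process and computes a Kullback-Leibler divergence against a hypothesized N(0, 1) model after binning both onto a common grid, since the KL divergence between a discrete draw and a continuous model is infinite without some such discretization. The function names, the truncation level, the bin edges, and the concentration parameter are all assumptions made for the example.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

def normal_cdf(x):
    # Standard normal CDF evaluated elementwise (avoids a SciPy dependency).
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in np.atleast_1d(x)])

def sample_dirichlet_process(base_sampler, alpha, n_atoms=500):
    # Truncated stick-breaking draw from a Dirichlet process; the truncation
    # level n_atoms is an illustrative choice, not taken from the paper.
    betas = rng.beta(1.0, alpha, size=n_atoms)
    weights = betas * np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    atoms = base_sampler(n_atoms)
    return atoms, weights / weights.sum()

def binned_kl(atoms, weights, model_cdf, edges):
    # Discretize both the DP draw and the hypothesized model onto common bins;
    # the raw KL between a discrete and a continuous law is infinite, so the
    # comparison is only meaningful after this discretization.
    p, _ = np.histogram(atoms, bins=edges, weights=weights)
    q = np.diff(model_cdf(edges))
    p, q = p / p.sum(), q / q.sum()
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Check N(0, 1) as the hypothesized model against a DP centered at it.
edges = np.linspace(-4.0, 4.0, 21)
atoms, weights = sample_dirichlet_process(lambda n: rng.normal(size=n), alpha=5.0)
d = binned_kl(atoms, weights, normal_cdf, edges)
```

In the paper's framework, a divergence of this kind would then be compared under the prior and the posterior through the relative belief ratio to decide whether the hypothesized model is supported by the data; that step is omitted here.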

