Article

A path sampling identity for computing the Kullback-Leibler and J divergences

Journal

COMPUTATIONAL STATISTICS & DATA ANALYSIS
Volume 54, Issue 7, Pages 1719-1731

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.csda.2010.01.018

Keywords

Auxiliary density; Geometric path; J divergence; Kullback-Leibler divergence; Model selection; Normalizing constant; Path sampling

Funding

  1. Natural Sciences and Engineering Research Council of Canada (NSERC)

Abstract

Estimating normalizing constants is a common and often difficult problem in statistics, and path sampling (PS) is among the most powerful methods that have been put forward to this end. Using an identity that arises in the formulation of PS, we derive expressions for the Kullback-Leibler (KL) and J divergences between two distributions from possibly different parametric families. These expressions naturally stem from PS when the geometric path is used to link the two extreme densities. We examine the use of the KL and J divergence measures in PS in a variety of model selection examples. In this context, one challenging aspect of PS is that of selecting an appropriate auxiliary density that will yield a high-quality estimate of the marginal likelihood without incurring excessive computational effort. The J divergence is shown to be helpful for choosing auxiliary densities that minimize the error of the PS estimators. These results increase appreciably the usefulness of PS. (C) 2010 Elsevier B.V. All rights reserved.
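To make the identity described in the abstract concrete, the following is a minimal sketch, not the authors' implementation; the Gaussian endpoint densities, grid size, and sample counts are illustrative assumptions. For normalized endpoints p0 and p1 linked by the geometric path q_t = q0^(1-t) q1^t, write lambda(t) = E_{p_t}[log q1(x) - log q0(x)]. Standard path sampling gives log(z1/z0) = integral of lambda(t) over [0,1], and the same integrand evaluated at the endpoints yields the divergences: KL(p1||p0) = lambda(1) - log(z1/z0), KL(p0||p1) = log(z1/z0) - lambda(0), and hence J(p0,p1) = lambda(1) - lambda(0). Gaussian endpoints are used only because the path distributions can then be sampled exactly and the exact KL is available for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical endpoint densities: two univariate Gaussians, chosen only so
# that the geometric path and the exact KL have closed forms.
mu0, s0 = 0.0, 1.0
mu1, s1 = 2.0, 1.5

def log_q0(x):
    return -0.5 * np.log(2 * np.pi * s0**2) - 0.5 * (x - mu0)**2 / s0**2

def log_q1(x):
    return -0.5 * np.log(2 * np.pi * s1**2) - 0.5 * (x - mu1)**2 / s1**2

def sample_path(t, n):
    # The geometric path q0^(1-t) * q1^t between two Gaussians is again
    # Gaussian, with precision-weighted parameters.
    prec = (1 - t) / s0**2 + t / s1**2
    mean = ((1 - t) * mu0 / s0**2 + t * mu1 / s1**2) / prec
    return rng.normal(mean, 1.0 / np.sqrt(prec), size=n)

# lambda(t) = E_{p_t}[log q1(x) - log q0(x)], estimated by Monte Carlo
# on a grid of t values.
ts = np.linspace(0.0, 1.0, 21)
lam = np.empty_like(ts)
for i, t in enumerate(ts):
    x = sample_path(t, 100_000)
    lam[i] = np.mean(log_q1(x) - log_q0(x))

log_ratio = np.trapz(lam, ts)    # PS estimate of log(z1/z0); ~0 here, since
                                 # both endpoint densities are normalized
kl_p1_p0 = lam[-1] - log_ratio   # KL(p1 || p0)
kl_p0_p1 = log_ratio - lam[0]    # KL(p0 || p1)
j_div = lam[-1] - lam[0]         # J(p0, p1) = KL(p1||p0) + KL(p0||p1)

# Closed-form KL(p1 || p0) between Gaussians, for comparison.
kl_exact = np.log(s0 / s1) + (s1**2 + (mu1 - mu0)**2) / (2 * s0**2) - 0.5
print(f"PS KL(p1||p0) = {kl_p1_p0:.4f}  (exact {kl_exact:.4f})")
print(f"PS J(p0,p1)   = {j_div:.4f}")
```

With 21 grid points and 10^5 draws per point, the PS estimates should agree with the closed-form KL to roughly two decimal places. In the model selection settings the paper considers, the path distributions would instead be sampled by MCMC, and the J divergence computed this way can guide the choice of auxiliary density.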
