Article

Nonparametric Estimation of Kullback-Leibler Divergence

Journal

NEURAL COMPUTATION
Volume 26, Issue 11, Pages 2570-2593

Publisher

MIT PRESS
DOI: 10.1162/NECO_a_00646

Keywords

-

Abstract

In this letter, we introduce an estimator of Kullback-Leibler divergence based on two independent samples. We show that on any finite alphabet, this estimator has an exponentially decaying bias and that it is consistent and asymptotically normal. To explain the importance of this estimator, we provide a thorough analysis of the more standard plug-in estimator. We show that it is consistent and asymptotically normal, but with an infinite bias. Moreover, if we modify the plug-in estimator to remove the rare events that cause the bias to become infinite, the bias still decays at a rate no faster than O(1/n). Further, we extend our results to estimating the symmetrized Kullback-Leibler divergence. We conclude by providing simulation results, which show that the asymptotic properties of these estimators hold even for relatively small sample sizes.
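As a concrete point of reference for the plug-in estimator analyzed in the abstract, the sketch below computes the empirical (plug-in) estimate of D(P||Q) from two independent samples over a finite alphabet. This is only an illustration of the standard plug-in approach, not the bias-corrected estimator the letter proposes; the function name, alphabet, and sample sizes are illustrative assumptions.

import numpy as np

def plugin_kl(x_sample, y_sample, alphabet):
    # Empirical frequencies stand in for the true probabilities p and q.
    x_sample = np.asarray(x_sample)
    y_sample = np.asarray(y_sample)
    p_hat = [np.mean(x_sample == a) for a in alphabet]
    q_hat = [np.mean(y_sample == a) for a in alphabet]
    kl = 0.0
    for p, q in zip(p_hat, q_hat):
        if p > 0.0:
            if q == 0.0:
                # A symbol seen in the first sample but never in the second
                # makes the plug-in estimate infinite; such rare events are
                # the source of the infinite bias discussed in the abstract.
                return float("inf")
            kl += p * np.log(p / q)
    return kl

# Illustrative usage on a small alphabet (distributions are arbitrary).
rng = np.random.default_rng(0)
alphabet = [0, 1, 2]
x = rng.choice(alphabet, size=2000, p=[0.5, 0.3, 0.2])
y = rng.choice(alphabet, size=2000, p=[0.4, 0.4, 0.2])
print(plugin_kl(x, y, alphabet))

For these particular distributions the true divergence is roughly 0.025 nats, and with a few thousand samples the plug-in estimate is typically close to that value, in line with the consistency and asymptotic normality reported in the abstract.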

