Article

Statistical Estimation of the Kullback-Leibler Divergence

Journal

MATHEMATICS
Volume 9, Issue 5, Pages: -

Publisher

MDPI
DOI: 10.3390/math9050544

Keywords

Kullback-Leibler divergence; Shannon differential entropy; statistical estimates; k-nearest neighbor statistics; asymptotic behavior; Gaussian model; mixtures

Funding

  1. Russian Science Foundation [14-21-00162]
  2. Russian Science Foundation [19-11-00290]

Abstract

Asymptotic unbiasedness and L-2 consistency are established, under mild conditions, for estimates of the Kullback-Leibler divergence between two probability measures in R^d that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results also lies in the treatment of mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened as well. Some applications are indicated.
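
For orientation: the quantity being estimated is D(P || Q) = ∫_{R^d} p(x) log(p(x)/q(x)) dx, where p and q are the densities of P and Q. Below is a minimal Python sketch of the classical k-nearest neighbor estimator of D(P || Q) in the Wang-Kulkarni-Verdu form, D_hat = (d/n) · Σ_i log(ν_k(i)/ρ_k(i)) + log(m/(n-1)), on which statistics of this kind are based; the function name and parameter choices are illustrative and not taken from the paper, which analyzes its own variant of such estimates.

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_divergence(x, y, k=1):
        """Sketch of the classical k-NN estimate of D(P || Q).

        x: (n, d) i.i.d. sample from P; y: (m, d) i.i.d. sample from Q.
        """
        n, d = x.shape
        m = y.shape[0]
        # rho_k(i): distance from x_i to its k-th nearest neighbor within x;
        # query k + 1 neighbors because the nearest point to x_i in x is x_i itself.
        rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # nu_k(i): distance from x_i to its k-th nearest neighbor in y.
        nu = cKDTree(y).query(x, k=k)[0]
        if k > 1:
            nu = nu[:, -1]  # keep only the k-th neighbor distance
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

    # Example: P = N(0, 1), Q = N(1, 1), so the true divergence is 0.5.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(5000, 1))
    y = rng.normal(1.0, 1.0, size=(5000, 1))
    print(knn_kl_divergence(x, y, k=5))  # should be close to 0.5

As in the paper's setting, the sketch requires both measures to have densities w.r.t. the Lebesgue measure; larger k tends to reduce the variance of the estimate at the cost of bias.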
