Article

Statistical Estimation of the Kullback-Leibler Divergence

Journal

MATHEMATICS
Volume 9, Issue 5, Article 544

Publisher

MDPI
DOI: 10.3390/math9050544

Keywords

Kullback-Leibler divergence; Shannon differential entropy; statistical estimates; k-nearest neighbor statistics; asymptotic behavior; Gaussian model; mixtures

Funding

  1. Russian Science Foundation [14-21-00162]
  2. Russian Science Foundation [19-11-00290]

Abstract

Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for estimates of the Kullback-Leibler divergence between two probability measures in R^d that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics computed for a pair of independent identically distributed (i.i.d.) vector samples. The novelty of the results also lies in the treatment of mixture models; in particular, the results cover mixtures of nondegenerate Gaussian measures. The mentioned asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened, and some applications are indicated.
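
For orientation, below is a minimal Python sketch of a k-nearest-neighbor divergence estimate of the general kind the abstract describes. It implements the classical Wang-Kulkarni-Verdu form based on ratios of k-NN distances between the two samples; it illustrates the type of statistic involved, not the paper's exact construction, and the function name kl_knn_estimate and the default k=1 are illustrative choices.

    import numpy as np
    from scipy.spatial import cKDTree

    def kl_knn_estimate(x, y, k=1):
        """k-NN estimate of D(P || Q) from i.i.d. samples x ~ P, y ~ Q.

        Sketch of the classical Wang-Kulkarni-Verdu estimator, i.e. the
        general kind of k-nearest-neighbor statistic discussed above;
        not the paper's exact construction.
        """
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        n, d = x.shape
        m = y.shape[0]

        # rho_k(i): distance from x_i to its k-th nearest neighbor among
        # the other sample points x_j, j != i (query k+1 neighbors and
        # drop the zero self-distance in column 0).
        rho = cKDTree(x).query(x, k=k + 1)[0][:, k]

        # nu_k(i): distance from x_i to its k-th nearest neighbor in the
        # y sample (scipy squeezes the output array when k == 1).
        nu = cKDTree(y).query(x, k=k)[0]
        if k > 1:
            nu = nu[:, k - 1]

        # (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

A quick sanity check on a case with a known answer, two univariate Gaussians with unit variance and means 0 and 1, where D(P || Q) = 0.5:

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(5000, 1))   # sample from P = N(0, 1)
    y = rng.normal(1.0, 1.0, size=(5000, 1))   # sample from Q = N(1, 1)
    print(kl_knn_estimate(x, y))               # should be close to 0.5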
