Article

Nonparametric Estimation of Kullback-Leibler Divergence

Journal

NEURAL COMPUTATION
Volume 26, Issue 11, Pages 2570-2593

Publisher

MIT PRESS
DOI: 10.1162/NECO_a_00646

Abstract

In this letter, we introduce an estimator of Kullback-Leibler divergence based on two independent samples. We show that on any finite alphabet, this estimator has an exponentially decaying bias and that it is consistent and asymptotically normal. To explain the importance of this estimator, we provide a thorough analysis of the more standard plug-in estimator. We show that it is consistent and asymptotically normal, but with an infinite bias. Moreover, if we modify the plug-in estimator to remove the rare events that cause the bias to become infinite, the bias still decays at a rate no faster than O(1/n). Further, we extend our results to estimating the symmetrized Kullback-Leibler divergence. We conclude by providing simulation results, which show that the asymptotic properties of these estimators hold even for relatively small sample sizes.
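
The abstract contrasts the proposed estimator with the standard plug-in estimator, which replaces the unknown distributions with empirical frequencies. For reference, the following is a minimal Python sketch of that plug-in estimator on a finite alphabet; the function name plugin_kl, the toy alphabet, and the sample sizes are illustrative choices rather than details taken from the paper, and the paper's own bias-corrected estimator is not reproduced here.

```python
import numpy as np

def plugin_kl(x_sample, y_sample, alphabet):
    """Plug-in estimate of D(P || Q) = sum_k p_k * log(p_k / q_k)
    from two independent samples over a finite alphabet.

    Empirical frequencies p_hat and q_hat stand in for the true
    distributions. As the abstract notes, this estimator's bias is
    infinite: whenever a symbol appears in x_sample but not in
    y_sample, q_hat is zero there and the estimate diverges.
    """
    x_sample = np.asarray(x_sample)
    y_sample = np.asarray(y_sample)
    p_hat = np.array([np.mean(x_sample == a) for a in alphabet])
    q_hat = np.array([np.mean(y_sample == a) for a in alphabet])

    support = p_hat > 0  # terms with p_hat == 0 contribute nothing
    if np.any(q_hat[support] == 0):
        return np.inf    # the rare event behind the infinite bias
    return float(np.sum(p_hat[support] * np.log(p_hat[support] / q_hat[support])))


# Toy usage on a three-letter alphabet (illustrative only).
rng = np.random.default_rng(0)
alphabet = [0, 1, 2]
x = rng.choice(alphabet, size=500, p=[0.5, 0.3, 0.2])
y = rng.choice(alphabet, size=500, p=[0.2, 0.3, 0.5])
print(plugin_kl(x, y, alphabet))
```

The np.inf return path corresponds to the rare event discussed in the abstract: a symbol observed in the first sample but absent from the second makes the plug-in estimate diverge, and, per the abstract, even truncating those events leaves a bias decaying no faster than O(1/n).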
