Journal
ANNALS OF STATISTICS
Volume 36, Issue 5, Pages 2153-2182
Publisher
INST MATHEMATICAL STATISTICS
DOI: 10.1214/07-AOS539
Keywords
Entropy estimation; estimation of statistical distance; estimation of divergence; nearest-neighbor distances; Rényi entropy; Havrda-Charvát entropy; Tsallis entropy
Funding
- EPSRC [RCMT 119]
- European Community under the PASCAL Network of Excellence [IST-2002-506778]
- EPSRC [EP/D057361/1]
A class of estimators of the Rényi and Tsallis entropies of an unknown distribution f on R^m is presented. These estimators are based on the kth nearest-neighbor distances computed from a sample of N i.i.d. vectors with distribution f. We show that entropies of any order q, including Shannon's entropy, can be estimated consistently with minimal assumptions on f. Moreover, we show that it is straightforward to extend the nearest-neighbor method to estimate the statistical distance between two distributions using one i.i.d. sample from each.
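The estimator family described in the abstract can be sketched as follows. This is a minimal numpy-only illustration of the standard kth-nearest-neighbor construction from this literature, not a statement of the paper's exact estimators or regularity conditions; the bias-correcting constant C_k = [Γ(k)/Γ(k+1−q)]^{1/(1−q)} for order q ≠ 1, and the digamma correction ψ(k) for the Shannon (q → 1) case, are the forms commonly used, and the brute-force neighbor search is for self-containment only:

```python
import numpy as np
from math import gamma, log, pi

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant


def knn_distances(x, k):
    """Distance from each row of x (shape (N, m)) to its kth nearest neighbor.

    Brute-force O(N^2) search, kept dependency-free for illustration.
    """
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)  # a point is not its own neighbor
    return np.sqrt(np.partition(d2, k - 1, axis=1)[:, k - 1])


def _psi_int(k):
    """Digamma at a positive integer: psi(k) = -gamma + sum_{j<k} 1/j."""
    return -EULER_GAMMA + sum(1.0 / j for j in range(1, k))


def shannon_entropy_knn(x, k=3):
    """Kozachenko-Leonenko-style Shannon entropy estimate (the q -> 1 case)."""
    n, m = x.shape
    rho = knn_distances(x, k)
    v_m = pi ** (m / 2) / gamma(m / 2 + 1)  # volume of the unit m-ball
    return log((n - 1) * v_m) - _psi_int(k) + (m / n) * np.sum(np.log(rho)) / 1.0


def renyi_entropy_knn(x, q, k=3):
    """Nearest-neighbor estimate of the Renyi entropy of order q (q != 1).

    Requires k + 1 - q > 0 so that Gamma(k + 1 - q) is defined and positive.
    """
    n, m = x.shape
    rho = knn_distances(x, k)
    v_m = pi ** (m / 2) / gamma(m / 2 + 1)
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1.0 / (1.0 - q))
    zeta = (n - 1) * c_k * v_m * rho ** m
    return log(np.mean(zeta ** (1.0 - q))) / (1.0 - q)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(size=(2000, 1))  # uniform on [0, 1]: every entropy order is 0
    print("Shannon estimate:", shannon_entropy_knn(x, k=3))
    print("Renyi q=2 estimate:", renyi_entropy_knn(x, q=2.0, k=3))
```

The Tsallis (Havrda-Charvát) entropy of order q follows from the same quantity `np.mean(zeta ** (1.0 - q))` via (1 − Î)/(q − 1) instead of the logarithm, and the two-sample statistical-distance extension mentioned in the abstract replaces within-sample neighbor distances by distances from one sample into the other.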