Journal
JOURNAL OF MULTIVARIATE ANALYSIS
Volume 97, Issue 9, Pages 2034-2040
Publisher
ELSEVIER INC
DOI: 10.1016/j.jmva.2006.03.007
Keywords
exponential connection; mixture connection; information geometry; testing hypotheses; maximum likelihood; ROC curve
Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both concern likelihood ratios: the Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma concerns the error rates of likelihood ratio tests. Exploring this connection yields another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratio tests established by the Neyman-Pearson lemma. The asymmetry of the Kullback-Leibler divergence is reviewed from the viewpoint of information geometry. (c) 2006 Elsevier Inc. All rights reserved.
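The identity "Kullback-Leibler divergence equals the expected log-likelihood ratio" can be checked numerically. The sketch below (an illustration under assumed normal densities, not code from the paper) estimates KL(p0 || p1) by averaging the log-likelihood ratio under p0, compares it with the closed form for two Gaussians, and exhibits non-negativity and asymmetry:

```python
# Illustration (not from the paper): KL divergence as the expected
# log-likelihood ratio E_{p0}[log p0(X) - log p1(X)], estimated by
# Monte Carlo and checked against the closed form for normal densities.
import math
import random

def normal_logpdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def kl_normal(mu0, s0, mu1, s1):
    """Closed-form KL( N(mu0, s0^2) || N(mu1, s1^2) )."""
    return math.log(s1 / s0) + (s0 ** 2 + (mu0 - mu1) ** 2) / (2 * s1 ** 2) - 0.5

def kl_monte_carlo(mu0, s0, mu1, s1, n=200_000, seed=0):
    """KL as the expected log-likelihood ratio, averaged over draws from p0."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu0, s0)
        total += normal_logpdf(x, mu0, s0) - normal_logpdf(x, mu1, s1)
    return total / n

exact = kl_normal(0.0, 1.0, 1.0, 1.5)
estimate = kl_monte_carlo(0.0, 1.0, 1.0, 1.5)

# Non-negativity, and asymmetry: KL(p0||p1) generally differs from KL(p1||p0).
assert exact >= 0.0
assert abs(kl_normal(0.0, 1.0, 1.0, 1.5) - kl_normal(1.0, 1.5, 0.0, 1.0)) > 0.0
```

The Monte Carlo estimate converges to the closed-form value as the sample size grows, which is exactly the "expected log-likelihood ratio" reading of the divergence; the final assertion shows the asymmetry discussed in the abstract.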