Article

Interpreting Kullback-Leibler divergence with the Neyman-Pearson lemma

Journal

JOURNAL OF MULTIVARIATE ANALYSIS
Volume 97, Issue 9, Pages 2034-2040

Publisher

ELSEVIER INC
DOI: 10.1016/j.jmva.2006.03.007

Keywords

exponential connection; mixture connection; information geometry; testing hypotheses; maximum likelihood; ROC curve

Abstract

The Kullback-Leibler divergence and the Neyman-Pearson lemma are two fundamental concepts in statistics. Both concern likelihood ratios: the Kullback-Leibler divergence is the expected log-likelihood ratio, and the Neyman-Pearson lemma concerns the error rates of likelihood ratio tests. Exploring this connection gives another statistical interpretation of the Kullback-Leibler divergence, in terms of the loss of power of the likelihood ratio test when the wrong distribution is used for one of the hypotheses. In this interpretation, the standard non-negativity property of the Kullback-Leibler divergence is essentially a restatement of the optimality of likelihood ratios established by the Neyman-Pearson lemma. The asymmetry of the Kullback-Leibler divergence is reviewed from the standpoint of information geometry. (c) 2006 Elsevier Inc. All rights reserved.
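
The abstract's two ingredients can be illustrated numerically. The following is a minimal Monte Carlo sketch, not code from the paper: it estimates the Kullback-Leibler divergence as an expected log-likelihood ratio and shows the power lost by a likelihood ratio test built from the wrong alternative. The specific hypotheses N(0,1) and N(1,1), the misspecified alternative N(0,2), and all names below are assumptions chosen for the demonstration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical setup: H0 = N(0, 1); true alternative P1 = N(1, 1);
# misspecified alternative Q = N(0, 2).
x1 = rng.normal(1.0, 1.0, size=n)  # draws from the true alternative P1
x0 = rng.normal(0.0, 1.0, size=n)  # draws under H0, used to calibrate test size

# KL(P1 || P0) = E_{P1}[log p1(X) - log p0(X)]; for these Gaussians it is
# mu^2 / 2 = 0.5 exactly.
kl = np.mean(norm.logpdf(x1, 1.0, 1.0) - norm.logpdf(x1, 0.0, 1.0))
print(f"Monte Carlo KL(P1||P0) ~= {kl:.3f} (exact: 0.5)")

def lr_test_power(alt_loc, alt_scale, alpha=0.05):
    """Power against P1 of the size-alpha LR test of H0 vs N(alt_loc, alt_scale)."""
    def llr(x):
        return norm.logpdf(x, alt_loc, alt_scale) - norm.logpdf(x, 0.0, 1.0)
    crit = np.quantile(llr(x0), 1.0 - alpha)  # critical value giving size alpha
    return np.mean(llr(x1) > crit)

# Neyman-Pearson: the LR test built from the true P1 maximizes power at a
# given size; building it from Q instead costs power, which is the loss the
# paper relates to the Kullback-Leibler divergence.
print(f"power with true alternative P1: {lr_test_power(1.0, 1.0):.3f}")
print(f"power with wrong alternative Q: {lr_test_power(0.0, 2.0):.3f}")
```

In this setup the true likelihood ratio test rejects for large x and attains power of roughly 0.26 at size 0.05, while the misspecified test rejects for large |x| and attains only about 0.17, making the power loss from using the wrong hypothesis visible.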
