Article

A Kullback-Leibler View of Maximum Entropy and Maximum Log-Probability Methods

Journal

ENTROPY
Volume 19, Issue 5, Pages -

Publisher

MDPI AG
DOI: 10.3390/e19050232

Keywords

entropy; minimum cross entropy; joint probability distribution

Funding

  1. National Science Foundation [CMMI 15-65168, CMMI 16-29752, CMMI 16-44991]


Entropy methods provide a convenient general approach to constructing a probability distribution from partial information. The minimum cross-entropy principle selects the distribution that minimizes the Kullback-Leibler divergence subject to the given constraints. This general principle encompasses a wide variety of distributions and generalizes other methods that have been proposed independently. There remains, however, some confusion about the breadth of entropy methods in the literature. In particular, the asymmetry of the Kullback-Leibler divergence yields two important special cases when the target distribution is uniform: the maximum entropy method and the maximum log-probability method. This paper compares the performance of both methods under a variety of conditions. We also examine a generalized maximum log-probability method as a further demonstration of the generality of the entropy approach.
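The two special cases mentioned in the abstract can be checked numerically. The sketch below (a minimal illustration using NumPy, not code from the paper) verifies that, against a uniform target u on n outcomes, D(p || u) differs from the negative entropy of p only by the constant log(n), while D(u || p) differs from the negative mean log-probability only by the constant -log(n); hence minimizing the first maximizes entropy and minimizing the second maximizes log-probability.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

n = 4
u = np.full(n, 1.0 / n)             # uniform target distribution
p = np.array([0.1, 0.2, 0.3, 0.4])  # an arbitrary candidate distribution

entropy = -np.sum(p * np.log(p))    # Shannon entropy H(p)
log_prob = np.sum(np.log(p))        # sum of log-probabilities

# D(p || u) = log(n) - H(p): minimizing it over p maximizes entropy.
assert np.isclose(kl(p, u), np.log(n) - entropy)

# D(u || p) = -log(n) - (1/n) * sum(log p_i):
# minimizing it over p maximizes the (mean) log-probability.
assert np.isclose(kl(u, p), -np.log(n) - log_prob / n)
```

Since the constants log(n) and -log(n) do not depend on p, the two directions of the divergence lead to the two distinct optimization criteria the paper compares.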
