Article

An Axiomatic Characterization of Mutual Information

Journal

ENTROPY
Volume 25, Issue 4, Pages -

Publisher

MDPI
DOI: 10.3390/e25040663

Keywords

shannon theory; information measures; mutual information

Abstract

We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev's characterization of the Shannon entropy. There is a new axiom in our characterization, however, which has no analog for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
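As a concrete reminder of the quantity being characterized, here is a minimal sketch computing mutual information from a joint distribution via the standard logarithmic definition I(X;Y) = Σ p(x,y) log₂ p(x,y)/(p(x)p(y)). Note this is the textbook formula, not the paper's log-free axiomatic construction; the function name and representation of the joint pmf as a nested list are illustrative choices, not from the paper.

```python
import math

def mutual_information(joint):
    """Mutual information (in bits) of a joint pmf given as a 2D list,
    where joint[i][j] = P(X = i, Y = j)."""
    # Marginal distributions: sum over columns for p(x), over rows for p(y).
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    total = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:  # terms with p(x,y) = 0 contribute nothing
                total += p * math.log2(p / (px[i] * py[j]))
    return total

# Independent uniform bits: I(X;Y) = 0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
# Perfectly correlated bits: I(X;Y) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # 1.0
```

The two sanity checks reflect the boundary cases any axiomatization must respect: independence gives zero information, and a noiseless one-bit channel gives exactly one bit.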

