Article

An Axiomatic Characterization of Mutual Information

Journal

ENTROPY
Volume 25, Issue 4, Article 663

Publisher

MDPI
DOI: 10.3390/e25040663

Keywords

Shannon theory; information measures; mutual information

Abstract

We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev's characterization of Shannon entropy. Our characterization, however, contains a new axiom with no analog in the Shannon-entropy setting: it is based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
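For context, and not part of the published abstract: the map being characterized is the standard mutual information of a pair of discrete random variables, which in its usual logarithmic form reads

    I(X;Y) = H(X) + H(Y) - H(X,Y),  where  H(X) = -\sum_{x} p(x) \log p(x).

For comparison, Faddeev's theorem recovers Shannon entropy from symmetry, continuity, the normalization H(1/2, 1/2) = 1, and the grouping rule

    H(p_1, \ldots, p_n) = H(p_1 + p_2, p_3, \ldots, p_n) + (p_1 + p_2) \, H\!\left( \frac{p_1}{p_1 + p_2}, \frac{p_2}{p_1 + p_2} \right).

The paper exhibits an analogous axiom set that singles out I without any such logarithmic formula entering the proofs.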
