Article

Axiomatic Characterizations of Information Measures

Journal

Entropy
Volume 10, Issue 3, Pages 261-273

Publisher

MDPI
DOI: 10.3390/e10030261

Keywords

Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance

Funding

  1. Hungarian Research Grant OTKA [T046376]

Abstract

Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1,...,N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
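
For orientation, here is a minimal reminder of the two classical measures named above, for finite distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n); the notation is the standard one and not necessarily that of the paper:

  H(P) = -\sum_i p_i \log p_i                      (Shannon entropy)
  D(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i}      (Kullback I-divergence)

Direction (C) concerns inference rules of the form P^* = \arg\min_{P \in C} D(P \| Q), where C is the feasible set determined by the available constraints and Q is a prior guess; when Q is uniform this is equivalent to maximizing H(P) over C, i.e., the MaxEnt rule.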

Authors

Imre Csiszár
