Journal
ENTROPY
Volume 10, Issue 3, Pages 261-273
Publisher
MDPI
DOI: 10.3390/e10030261
Keywords
Shannon entropy; Kullback I-divergence; Rényi information measures; f-divergence; f-entropy; functional equation; proper score; maximum entropy; transitive inference rule; Bregman distance
Funding
- Hungarian Research Grant OTKA [T046376]
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1,...,N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
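To fix notation for the two most basic measures surveyed, the following is a minimal sketch of Shannon entropy and Kullback I-divergence for finite distributions (function names and the base-2 convention are illustrative, not from the paper):

```python
import math

def shannon_entropy(p, base=2.0):
    """Shannon entropy H(p) = -sum_i p_i log p_i, with 0 log 0 taken as 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def i_divergence(p, q, base=2.0):
    """Kullback I-divergence D(p||q) = sum_i p_i log(p_i / q_i).

    Assumes p is absolutely continuous w.r.t. q (q_i > 0 wherever p_i > 0).
    """
    return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(uniform))           # uniform distribution maximizes entropy
print(i_divergence(skewed, uniform))      # nonnegative, zero iff p == q
```

The MaxEnt inference rule treated in direction (C) selects, among the distributions satisfying given constraints, the one of maximum Shannon entropy, or equivalently minimum I-divergence from a uniform prior.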