Article

Shannon entropy: a rigorous notion at the crossroads between probability, information theory, dynamical systems and statistical physics

Journal

Mathematical Structures in Computer Science

Publisher

CAMBRIDGE UNIV PRESS
DOI: 10.1017/S0960129512000783

Funding

  1. Agence Nationale de la Recherche, SYSCOMM program [DISCO 09-SYSC-003]
  2. Institut National de la Santé et de la Recherche Médicale [MICROMEGAS PC201104]

Abstract

Statistical entropy was introduced by Shannon as a basic concept in information theory, measuring the average missing information in a random source. Extended into an entropy rate, it gives bounds in coding and compression theorems. In this paper, I describe how statistical entropy and entropy rate relate to other notions of entropy that are relevant to probability theory (the entropy of a discrete probability distribution, measuring its unevenness), computer science (algorithmic complexity), the ergodic theory of dynamical systems (Kolmogorov-Sinai or metric entropy) and statistical physics (Boltzmann entropy). Their mathematical foundations and correlates (the entropy concentration, Sanov, Shannon-McMillan-Breiman, Lempel-Ziv and Pesin theorems) clarify their interpretation and offer a rigorous basis for maximum entropy principles. Although often ignored, these mathematical perspectives give a central position to entropy and relative entropy in statistical laws describing generic collective behaviours, and provide insights into the notions of randomness, typicality and disorder. The relevance of entropy beyond the realm of physics, in particular for living systems and ecosystems, is yet to be demonstrated.
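
For orientation, the standard textbook definitions behind the terms used above (stated here under the usual conventions, not quoted from the paper itself) can be written in LaTeX as:

\begin{align*}
  H(p) &= -\sum_{i=1}^{n} p_i \log_2 p_i
    && \text{(Shannon entropy of a distribution $p=(p_1,\dots,p_n)$, in bits)} \\
  h &= \lim_{n \to \infty} \tfrac{1}{n}\, H(X_1, \dots, X_n)
    && \text{(entropy rate of a stationary source $(X_k)$)} \\
  D(p \,\|\, q) &= \sum_{i=1}^{n} p_i \log_2 \frac{p_i}{q_i}
    && \text{(relative entropy of $p$ with respect to $q$)}
\end{align*}

The entropy rate h is the quantity that bounds achievable lossless compression in Shannon's source coding theorem, which is the sense in which it "gives bounds in coding and compression theorems" above.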

Authors

Annick Lesne
