How to Read Probability Distributions as Statements about Process

Journal

Entropy
Volume 16, Issue 11, Pages 6059-6098

Publisher

MDPI
DOI: 10.3390/e16116059

Keywords

measurement; maximum entropy; information theory; statistical mechanics; extreme value distributions; neutral theories in biology

Funding

  1. National Science Foundation, Directorate for Biological Sciences [DEB-1251035]

Abstract

Probability distributions can be read as simple expressions of information. Each continuous probability distribution describes how information changes with magnitude. Once one learns to read a probability distribution as a measurement scale of information, opportunities arise to understand the processes that generate the commonly observed patterns. Probability expressions may be parsed into four components: the dissipation of all information, except the preservation of average values, taken over the measurement scale that relates changes in observed values to changes in information, and the transformation from the underlying scale on which information dissipates to alternative scales on which probability pattern may be expressed. Information invariances set the commonly observed measurement scales and the relations between them. In particular, a measurement scale for information is defined by its invariance to specific transformations of underlying values into measurable outputs. Essentially all common distributions can be understood within this simple framework of information invariance and measurement scale.
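
A minimal sketch of this reading, in standard maximum-entropy notation (the symbols T, lambda and the worked cases below are illustrative assumptions, not quoted from the paper): dissipating all information except an average value, measured on a scale T(y), amounts to maximizing entropy subject to a constraint on the mean of T(y).

  % Maximize entropy subject to normalization and a mean constraint on the measurement scale T(y)
  \max_{p}\; -\int p(y)\,\log p(y)\,dy
  \quad \text{s.t.} \quad \int p(y)\,dy = 1, \qquad \int p(y)\,T(y)\,dy = \bar{T}
  % The stationary (maximum-entropy) solution is exponential in the measurement scale:
  p(y) \;\propto\; e^{-\lambda\,T(y)}
  % Familiar families follow from the choice of scale:
  %   T(y) = y        -> exponential distribution
  %   T(y) = \log y   -> power law (Pareto)
  %   T(y) = y^2      -> Gaussian (zero-mean constraint)

On this sketch, reading a distribution means reading off T(y), the scale on which information dissipates, together with any final transformation to the scale on which the pattern is actually observed.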
