Article

Laws of evolution parallel the laws of thermodynamics

Journal

JOURNAL OF CHEMICAL THERMODYNAMICS
Volume 124, Pages 141-148

Publisher

Academic Press Ltd / Elsevier Science Ltd
DOI: 10.1016/j.jct.2018.05.005

Keywords

Information; Disorder; Order; Entropy; Statistical mechanics; Microstate; Sequence

Abstract

We hypothesize that concepts from thermodynamics and statistical mechanics can be used to define summary statistics, analogous to thermodynamic entropy, that characterize the convergence of processes driven by random inputs subject to deterministic constraints. The primary example used here is biological evolution. We propose that the evolution of biological structures and behaviors is driven by the ability of living organisms to acquire, store, and act on information, and that summary statistics can be developed to provide a stochastically deterministic information theory for biological evolution. The statistical concepts on which thermodynamic entropy is founded also hold for information, and we show that adaptation and evolution have a specific deterministic direction arising from many random events. Therefore, an information theory formulated on the same foundation as the immensely powerful concepts used in statistical mechanics will provide statistics, analogous to thermodynamic entropy, that summarize distribution functions for environmental properties and organism performance. This work thus establishes foundational principles for a quantitative theory that encompasses both behavioral and biological evolution and may be extended to other fields such as economics, market dynamics, and health systems. © 2018 Elsevier Ltd.
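
The parallel invoked in the abstract rests, at bottom, on the standard correspondence between the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory, both defined over a probability distribution {p_i}. The paper may develop this differently; the textbook forms are shown here only for orientation:

S = -k_B \sum_i p_i \ln p_i        (Gibbs/thermodynamic entropy over microstates i)
H = -\sum_i p_i \log_2 p_i         (Shannon entropy over a distribution of sequences or messages i)

Up to the Boltzmann constant k_B and the choice of logarithm base, the two expressions coincide, which is the formal footing on which entropy-like summary statistics for distributions of environmental properties and organism performance can be defined.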

