Review

Generalised information and entropy measures in physics

Journal

Contemporary Physics
Volume 50, Issue 4, Pages 495-510

Publisher

Taylor & Francis Ltd
DOI: 10.1080/00107510902823517

Keywords

measures of information; entropy; generalised statistical mechanics; complex systems

Abstract

The formalism of statistical mechanics can be generalised by starting from more general measures of information than the Shannon entropy and maximising those subject to suitable constraints. We discuss some of the most important examples of information measures that are useful for the description of complex systems. Examples treated are the Rényi entropy, Tsallis entropy, Abe entropy, Kaniadakis entropy, Sharma-Mittal entropies, and a few more. Important concepts such as the axiomatic foundations, composability and Lesche stability of information measures are briefly discussed. Potential applications in physics include complex systems with long-range interactions and metastable states, scattering processes in particle physics, hydrodynamic turbulence, defect turbulence, optical lattices, and quite generally driven nonequilibrium systems with fluctuations of temperature.
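
For orientation, a minimal sketch of the two most familiar generalised measures named above, written in their standard textbook form for a discrete probability distribution p_i with the Boltzmann constant set to 1 (the article's own notation and normalisation may differ):

    S          = -\sum_i p_i \ln p_i                              (Shannon)
    S_q^{(R)}  = \frac{1}{1-q} \, \ln \sum_i p_i^q                (Rényi)
    S_q^{(T)}  = \frac{1}{q-1} \left( 1 - \sum_i p_i^q \right)    (Tsallis)

Both generalised entropies depend on a real parameter q and recover the Shannon entropy in the limit q → 1; maximising them subject to suitable constraints (for example on the mean energy) leads to q-generalised versions of the canonical distribution rather than the usual Boltzmann factor.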
