4.7 Review

Information geometry for multiparameter models: new perspectives on the origin of simplicity

Journal

Reports on Progress in Physics
Volume 86, Issue 3

Publisher

IOP Publishing Ltd
DOI: 10.1088/1361-6633/aca6f8

Keywords

information geometry; differential geometry; Bayesian; prior; hyperribbon; model reduction; emergence

Abstract

Complex models in physics, biology, economics, and engineering are often sloppy, meaning that the model parameters are not well determined by the model predictions for collective behavior. Many parameter combinations can vary over decades without significant changes in the predictions. This review uses information geometry to explore sloppiness and its deep relation to emergent theories. We introduce the model manifold of predictions, whose coordinates are the model parameters. Its hyperribbon structure explains why only a few parameter combinations matter for the behavior. We review recent rigorous results that connect the hierarchy of hyperribbon widths to approximation theory, and to the smoothness of model predictions under changes of the control variables. We discuss recent geodesic methods to find simpler models on nearby boundaries of the model manifold: emergent theories with fewer parameters that explain the behavior equally well. We discuss a Bayesian prior which optimizes the mutual information between model parameters and experimental data, naturally favoring points on the emergent boundary theories and thus simpler models. We introduce a 'projected maximum likelihood' prior that efficiently approximates this optimal prior, and contrast both to the poor behavior of the traditional Jeffreys prior. We discuss the way renormalization-group coarse-graining in statistical mechanics introduces a flow of the model manifold, and connect stiff and sloppy directions along the model manifold with relevant and irrelevant eigendirections of the renormalization group. Finally, we discuss recently developed 'intensive' embedding methods, allowing one to visualize the predictions of arbitrary probabilistic models as low-dimensional projections of an isometric embedding, and illustrate our method by generating the model manifold of the Ising model.
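
The hyperribbon picture is easy to see numerically. The sketch below is our own illustration, not code from the review: it builds the Fisher information metric g = J^T J for a toy sum-of-exponentials model, a standard example in the sloppiness literature, and prints its eigenvalue spectrum. The observation times and decay rates are illustrative choices, not values from the paper.

```python
import numpy as np

# Toy sloppy model: y(t) = sum_k exp(-theta_k * t), observed at a handful of
# times. The observation times and decay rates are illustrative choices.
t = np.linspace(0.1, 5.0, 20)           # observation times
theta = np.array([0.5, 1.0, 2.0, 4.0])  # decay rates

# Jacobian of the predictions: d/d theta_k [exp(-theta_k t)] = -t exp(-theta_k t)
J = np.stack([-t * np.exp(-th * t) for th in theta], axis=1)

# Fisher information metric for a least-squares fit with unit Gaussian noise.
g = J.T @ J
eigvals = np.linalg.eigvalsh(g)[::-1]   # descending order

# Sloppiness signature: eigenvalues roughly evenly spaced in log, spanning
# many decades -- the hierarchy of hyperribbon widths.
print(np.log10(eigvals))
```

The few large eigenvalues are the stiff directions (the long axes of the hyperribbon); the many tiny ones are the sloppy directions that barely change the predictions.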
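
The geodesic route to simpler models can be sketched in the same spirit. This is a naive illustration of the general idea under our own assumptions (a two-exponential toy model, finite-difference derivatives, fixed-step integration), not the review's implementation: follow a geodesic of the model manifold from a starting fit along the sloppiest direction; the parameter-space velocity diverges as the geodesic reaches a boundary of the manifold, where a model with fewer parameters lives.

```python
import numpy as np

def predictions(theta, t):
    """Toy model with a simpler boundary limit: y(t) = exp(-theta_0 t) + exp(-theta_1 t)."""
    return np.exp(-theta[0] * t) + np.exp(-theta[1] * t)

def jacobian(theta, t, h=1e-6):
    """Central finite-difference Jacobian of the predictions."""
    cols = []
    for k in range(len(theta)):
        d = np.zeros_like(theta)
        d[k] = h
        cols.append((predictions(theta + d, t) - predictions(theta - d, t)) / (2 * h))
    return np.stack(cols, axis=1)

def geodesic_acceleration(theta, v, t, h=1e-4):
    """Geodesic equation for the pulled-back metric g = J^T J:
    a = -g^{-1} J^T (v_i v_j d^2 y / d theta_i d theta_j),
    with the directional second derivative taken by finite differences."""
    J = jacobian(theta, t)
    Avv = (predictions(theta + h * v, t) - 2.0 * predictions(theta, t)
           + predictions(theta - h * v, t)) / h**2
    return -np.linalg.solve(J.T @ J, J.T @ Avv)

t = np.linspace(0.1, 5.0, 20)
theta = np.array([1.0, 1.2])

# Initial velocity: the sloppiest direction (smallest metric eigenvalue).
J0 = jacobian(theta, t)
w, V = np.linalg.eigh(J0.T @ J0)
v = V[:, 0]

# Naive fixed-step integration; a careful implementation would use an adaptive
# integrator and monitor the metric's smallest eigenvalue.
dtau = 0.05
for _ in range(400):
    theta, v = theta + dtau * v, v + dtau * geodesic_acceleration(theta, v, t)
    if np.linalg.norm(v) > 1e3:  # velocity diverges as a boundary is approached
        break
print("stopped near theta =", theta)
```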
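
Finally, the review's critique of the Jeffreys prior can be made concrete. In this hedged sketch (the same toy model, with our own parameter choices), the Jeffreys density, proportional to sqrt(det g(theta)), collapses as the two decay rates merge, i.e. precisely on the boundary where the emergent one-exponential model sits; a prior that favors simpler models must instead put weight there.

```python
import numpy as np

def fisher_metric(theta, t):
    """g = J^T J for y(t) = sum_k exp(-theta_k t) with unit Gaussian noise."""
    J = np.stack([-t * np.exp(-th * t) for th in theta], axis=1)
    return J.T @ J

def jeffreys_density(theta, t):
    """Unnormalized Jeffreys prior, proportional to sqrt(det g(theta))."""
    sign, logdet = np.linalg.slogdet(fisher_metric(theta, t))
    return np.exp(0.5 * logdet) if sign > 0 else 0.0

t = np.linspace(0.1, 5.0, 20)

# As the two decay rates merge (the manifold boundary where the simpler
# one-exponential model lives), the metric degenerates and the Jeffreys
# density vanishes.
for eps in (1.0, 0.1, 0.01, 0.001):
    print(eps, jeffreys_density(np.array([1.0, 1.0 + eps]), t))
```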

Authors

Katherine N. Quinn, Michael C. Abbott, Mark K. Transtrum, Benjamin B. Machta, and James P. Sethna

Reviews

Primary rating: 4.7 (not enough ratings)
Secondary ratings (novelty, significance, scientific rigor): not yet rated