Article

The Akaike information criterion: Background, derivation, properties, application, interpretation, and refinements

Publisher

WILEY
DOI: 10.1002/wics.1460

Keywords

AIC; Kullback-Leibler information; model selection criterion


The Akaike information criterion (AIC) is one of the most ubiquitous tools in statistical modeling. The first model selection criterion to gain widespread acceptance, AIC was introduced in 1973 by Hirotugu Akaike as an extension of the maximum likelihood principle. Maximum likelihood is conventionally applied to estimate the parameters of a model once the structure and dimension of the model have been formulated. Akaike's seminal idea was to combine estimation and structural/dimensional determination into a single procedure. This article reviews the conceptual and theoretical foundations of AIC, discusses its properties and its predictive interpretation, and provides a synopsis of important practical issues pertinent to its application. Comparisons and delineations are drawn between AIC and its primary competitor, the Bayesian information criterion (BIC). In addition, the article covers refinements of AIC for settings where the asymptotic conditions and model specification assumptions that underlie its justification may be violated.

This article is categorized under:
- Software for Computational Statistics > Artificial Intelligence and Expert Systems
- Statistical Models > Model Selection
- Statistical and Graphical Methods of Data Analysis > Modeling Methods and Algorithms
- Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods
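To make the quantities in the abstract concrete, here is a minimal sketch (not taken from the article) of the criteria it discusses: AIC = 2k - 2 log L-hat, its small-sample refinement AICc, and BIC = k log n - 2 log L-hat, computed for Gaussian maximum likelihood fits of polynomial regressions of increasing degree. The polynomial setup, sample size, and all names below are illustrative assumptions, not the authors' example.

```python
# Illustrative sketch: compare candidate models with AIC, AICc, and BIC.
# Assumes Gaussian errors, so least squares gives the maximum likelihood fit.
import numpy as np

rng = np.random.default_rng(0)
n = 40
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)  # true model is linear

def gaussian_ml_criteria(x, y, degree):
    """Fit a polynomial of the given degree by least squares (the Gaussian
    MLE) and return (AIC, AICc, BIC)."""
    n = len(y)
    X = np.vander(x, degree + 1)                     # design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = np.mean(resid**2)                       # MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    k = degree + 2                                   # coefficients + variance
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)       # small-sample correction
    bic = k * np.log(n) - 2 * loglik
    return aic, aicc, bic

for d in range(1, 5):
    aic, aicc, bic = gaussian_ml_criteria(x, y, d)
    print(f"degree {d}: AIC={aic:7.2f}  AICc={aicc:7.2f}  BIC={bic:7.2f}")
```

Lower values are better for all three criteria. Because BIC's penalty grows with log n while AIC's is fixed at 2 per parameter, BIC tends to favor smaller models as n grows, which is the practical face of the AIC/BIC contrast the article analyzes; AICc illustrates the kind of refinement mentioned for settings where the asymptotic justification of AIC is strained.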


