Article

The Akaike information criterion: Background, derivation, properties, application, interpretation, and refinements

Publisher

WILEY
DOI: 10.1002/wics.1460

Keywords

AIC; Kullback-Leibler information; model selection criterion


The Akaike information criterion (AIC) is one of the most ubiquitous tools in statistical modeling. The first model selection criterion to gain widespread acceptance, AIC was introduced in 1973 by Hirotugu Akaike as an extension to the maximum likelihood principle. Maximum likelihood is conventionally applied to estimate the parameters of a model once the structure and dimension of the model have been formulated. Akaike's seminal idea was to combine into a single procedure the process of estimation with structural and dimensional determination. This article reviews the conceptual and theoretical foundations for AIC, discusses its properties and its predictive interpretation, and provides a synopsis of important practical issues pertinent to its application. Comparisons and delineations are drawn between AIC and its primary competitor, the Bayesian information criterion (BIC). In addition, the article covers refinements of AIC for settings where the asymptotic conditions and model specification assumptions that underlie the justification of AIC may be violated.

This article is categorized under:
Software for Computational Statistics > Artificial Intelligence and Expert Systems
Statistical Models > Model Selection
Statistical and Graphical Methods of Data Analysis > Modeling Methods and Algorithms
Statistical and Graphical Methods of Data Analysis > Information Theoretic Methods
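To make the idea concrete, the standard formula is AIC = 2k − 2 ln(L̂), where k is the number of estimated parameters and L̂ the maximized likelihood; the model with the smallest AIC is preferred. The sketch below (an illustration, not from the article) applies this to polynomial regression with Gaussian errors, where the log-likelihood reduces, up to an additive constant, to n·log(RSS/n); the data, degrees, and helper function `aic` are all hypothetical choices for the example.

```python
import numpy as np

def aic(n, rss, k):
    # Gaussian AIC up to an additive constant: n*log(RSS/n) + 2k
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
n = 100
x = np.linspace(0.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)  # true model is linear

# Candidate polynomial degrees; AIC trades goodness of fit
# against the number of estimated parameters.
scores = {}
for degree in range(1, 6):
    X = np.vander(x, degree + 1)  # columns x^degree .. x^0 (intercept included)
    coef, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = rss[0] if rss.size else float(np.sum((y - X @ coef) ** 2))
    # k counts the degree+1 regression coefficients plus the error variance
    scores[degree] = aic(n, rss, k=degree + 2)

best = min(scores, key=scores.get)
print(best, scores[best])
```

Because all candidates here are fit to the same data, only AIC differences matter; the additive constants dropped from the log-likelihood cancel when models are compared.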
