Journal of Mathematical Psychology
Volume 44, Issue 1, Pages 62-91
Publisher: Academic Press Inc Elsevier Science
DOI: 10.1006/jmps.1999.1277
In this paper we briefly study the basic idea of Akaike's (1973) information criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. The rationale for ICOMP as a model selection criterion is that it combines a badness-of-fit term (such as minus twice the maximum log likelihood) with a measure of complexity of a model differently than AIC or its variants, by taking into account the interdependencies of the parameter estimates as well as the dependencies of the model residuals. We operationalize the general form of ICOMP based on the quantification of the concept of overall model complexity in terms of the estimated inverse-Fisher information matrix. This approach results in an approximation to the sum of two Kullback-Leibler distances. Using the correlational form of the complexity, we further provide yet another form of ICOMP that takes into account the interdependencies (i.e., correlations) among the parameter estimates of the model. Later, we illustrate the practical utility and importance of this new model selection criterion by providing several real as well as Monte Carlo simulation examples, and compare its performance against AIC or its variants. (C) 2000 Academic Press.
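As a minimal sketch of the idea described above, the following Python example contrasts AIC with an ICOMP-style criterion for ordinary least-squares regression. The complexity term here is Bozdogan's maximal information complexity C1 applied to the estimated covariance of the coefficient estimates (an estimate of the inverse-Fisher information for the regression coefficients); this is an illustration under those assumptions, not the authors' code, and the exact scaling of the complexity term varies across the cited papers.

```python
# Illustrative comparison of AIC and an ICOMP-style criterion for OLS.
# Assumption: ICOMP = -2 log L + 2 * C1(cov(beta-hat)), with
# C1(S) = (s/2) log(tr(S)/s) - (1/2) log det(S), s = dim(S).
import numpy as np

def c1_complexity(cov):
    """Maximal information complexity C1 of a covariance matrix."""
    s = cov.shape[0]
    eigvals = np.linalg.eigvalsh(cov)  # eigenvalues give trace and log-det
    return 0.5 * s * np.log(eigvals.sum() / s) - 0.5 * np.log(eigvals).sum()

def ols_criteria(X, y):
    """Return (AIC, ICOMP-style score) for a Gaussian linear model."""
    n, k = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    sigma2 = resid @ resid / n                       # ML variance estimate
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    aic = -2 * loglik + 2 * (k + 1)                  # k betas + sigma^2
    cov_beta = sigma2 * np.linalg.inv(X.T @ X)       # est. inverse-Fisher info
    icomp = -2 * loglik + 2 * c1_complexity(cov_beta)
    return aic, icomp

# Simulated data: candidate models with and without the true slope term.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
X1 = np.ones((n, 1))                  # intercept only
X2 = np.column_stack([np.ones(n), x]) # intercept + slope

for name, X in [("intercept only", X1), ("intercept + slope", X2)]:
    aic, icomp = ols_criteria(X, y)
    print(f"{name}: AIC={aic:.1f}  ICOMP={icomp:.1f}")
```

With a strong true slope, both criteria favor the larger model; they differ in how the penalty is computed, since C1 responds to the correlations among the parameter estimates rather than just counting parameters.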