Article

Information indices: unification and applications

Journal

JOURNAL OF ECONOMETRICS
Volume 107, Issue 1-2, Pages 17-40

Publisher

ELSEVIER SCIENCE SA
DOI: 10.1016/S0304-4076(01)00111-7

Keywords

entropy; Kullback-Leibler information; model fitting; covariate; influence


The unified framework of information theoretic statistics was established by Kullback (1959). Since then numerous information indices have been developed in various contexts. This paper represents many of these indices in a unified context. The unification thread is the discrimination information function: information indices are all logarithmic measures of discrepancy between two probability distributions. First, we present a summary of informational aspects of the basic information functions, a unification of various information-theoretic modeling approaches, and some explication in terms of traditional measures. We then tabulate a unified representation of assortments of information indices developed in the literature for maximum entropy modeling, covariate information, and influence diagnostics. The subjects of these indices include parametric model fitting, nonparametric entropy estimation, categorical data analysis, linear and exponential family regression, and time series. The coverage, however, is not exhaustive. The tabulation includes sampling theory and Bayesian indices, but the focus is on interpretation as descriptive measures; inferential properties are noted tangentially. Finally, applications of some information indices are illustrated through modeling duration data for Sprint's churned customers and choice of long distance provider. (C) 2002 Elsevier Science B.V. All rights reserved.
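As a brief illustration of the abstract's unifying thread (not code from the paper), the discrimination information between two discrete probability distributions p and q is the Kullback-Leibler divergence, the logarithmic discrepancy measure underlying the indices surveyed. A minimal sketch in Python, assuming discrete distributions given as probability vectors:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler discrimination information K(p : q) for discrete
    distributions: sum over i of p_i * log(p_i / q_i).
    Nonnegative, and zero if and only if p equals q."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: discrepancy between an empirical distribution
# and a fitted model over three categories.
p = [0.5, 0.3, 0.2]   # empirical distribution
q = [0.4, 0.4, 0.2]   # fitted model
print(kl_divergence(p, q))  # strictly positive since p != q
```

Note that K(p : q) is not symmetric in its arguments, which is why directed "discrepancy" rather than "distance" is the appropriate reading throughout.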

