Article

Information indices: unification and applications

Journal

JOURNAL OF ECONOMETRICS
Volume 107, Issues 1-2, Pages 17-40

Publisher

ELSEVIER SCIENCE SA
DOI: 10.1016/S0304-4076(01)00111-7

Keywords

entropy; Kullback-Leibler information; model fitting; covariate; influence

The unified framework of information-theoretic statistics was established by Kullback (1959). Since then, numerous information indices have been developed in various contexts. This paper represents many of these indices in a unified context. The unification thread is the discrimination information function: information indices are all logarithmic measures of discrepancy between two probability distributions. First, we present a summary of informational aspects of the basic information functions, a unification of various information-theoretic modeling approaches, and some explication in terms of traditional measures. We then tabulate a unified representation of assortments of information indices developed in the literature for maximum entropy modeling, covariate information, and influence diagnostics. The subjects of these indices include parametric model fitting, nonparametric entropy estimation, categorical data analysis, linear and exponential family regression, and time series. The coverage, however, is not exhaustive. The tabulation includes sampling theory and Bayesian indices, but the focus is on interpretation as descriptive measures; inferential properties are noted tangentially. Finally, applications of some information indices are illustrated through modeling duration data for Sprint's churned customers and choice of long-distance provider. (C) 2002 Elsevier Science B.V. All rights reserved.
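
The discrimination information function named in the abstract as the unifying thread is not written out in this record. For reference, a standard form of Kullback's discrimination information (the Kullback-Leibler information) between a density f and a reference density g is sketched below; the notation is illustrative and not quoted from the paper.

```latex
% Kullback-Leibler discrimination information between densities f and g.
% Standard textbook definition; notation is illustrative, not taken from the paper.
K(f : g) \;=\; \int f(x)\,\log\frac{f(x)}{g(x)}\,dx \;\ge\; 0,
\qquad K(f : g) = 0 \iff f = g \ \text{almost everywhere}.
```

Indices of the kind tabulated in the paper are logarithmic discrepancy measures of this general form, computed between two probability distributions such as a fitted model and a reference distribution.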
