Article

On the association between a random parameter and an observable

Journal

TEST
Volume 13, Issue 1, Pages 85-111

Publisher

Springer
DOI: 10.1007/BF02603002

Keywords

odds ratio; expected information; expected utility; hypothesis testing; model choice; model complexity

The association between an observable and a random parameter characterizes their joint distribution given the marginal distributions. It has been shown to be incorporated in the (log-)odds ratio function. The association is inherent in each of the conditional distributions and hence determines the learning process formalized in Bayes' theorem. The paper focuses on two applications. Commonly used measures of dependence, especially Kullback-Leibler distances between densities of interest, are identified and interpreted as expected values of the log-odds ratio function. Frequently, Bayesian inference is based on the maximization of an expected utility. If the utility of a probability density is defined by the logarithmic score function, the expected utility can often be decomposed approximately into a term of fit and a term of model complexity. The log-odds ratio parameterization of probability densities reveals that model complexity can again be defined as an expected value of the log-odds ratio function, i.e., as a measure of dependence between the observable and the random parameter. The ideas are illustrated throughout with examples from the class of conjugate exponential families.
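As an illustrative sketch of the first application (using a hypothetical discrete distribution and simplified notation, not the paper's exact definitions), the Kullback-Leibler distance between a joint distribution and the product of its marginals coincides with the expected value, under the joint, of the log-odds ratio log[p(x, θ) / (p(x) p(θ))]:

```python
import math

# Hypothetical discrete joint distribution of an observable x and a
# random parameter theta (values are illustrative only).
joint = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.15, (1, 1): 0.45,
}

# Marginal distributions p(x) and p(theta)
px, pt = {}, {}
for (x, t), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    pt[t] = pt.get(t, 0.0) + p

# Kullback-Leibler distance between the joint distribution and the
# product of its marginals (the mutual information) ...
kl = sum(p * math.log(p / (px[x] * pt[t])) for (x, t), p in joint.items())

# ... computed again as the expectation, under the joint, of the
# log-odds ratio log p(x, theta) - log p(x) - log p(theta).
expected_log_odds = sum(
    p * (math.log(p) - math.log(px[x]) - math.log(pt[t]))
    for (x, t), p in joint.items()
)

print(kl, expected_log_odds)  # the two computations agree
```

Under independence the log-odds ratio is identically zero and both quantities vanish, which matches the abstract's reading of these KL distances as measures of dependence between the observable and the parameter.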
