Article

Power-Expected-Posterior Priors as Mixtures of g-Priors in Normal Linear Models

Journal

Bayesian Analysis
Volume 17, Issue 4, Pages 1073-1099

Publisher

International Society for Bayesian Analysis
DOI: 10.1214/21-BA1288


This paper studies the power-expected-posterior (PEP) prior as a generalization of the expected-posterior prior (EPP) for objective Bayesian model selection under normal linear models. The PEP prior is shown to be representable as a mixture of g-priors, so posterior distributions and Bayes factors are available in closed form and the approach remains computationally tractable. Comparisons with other mixtures of g-priors are made, and results are presented for both simulated and real-life datasets.
One of the main approaches used to construct prior distributions for objective Bayes methods is the concept of random imaginary observations. Under this setup, the expected-posterior prior (EPP) offers several advantages, among them a simple, natural interpretation and an effective way to establish compatibility of priors across models. In this paper, we study the power-expected-posterior (PEP) prior as a generalization of the EPP in objective Bayesian model selection under normal linear models. We prove that it can be represented as a mixture of g-priors, like a wide range of prior distributions under normal linear models, so posterior distributions and Bayes factors are derived in closed form, thereby preserving its computational tractability. Building on this result, we show that the desiderata (criteria for objective Bayesian model comparison) hold for the PEP prior. Comparisons with other mixtures of g-priors are made and results are presented on simulated and real-life datasets.
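The closed-form Bayes factors mentioned in the abstract build on the standard Zellner g-prior result for normal linear models (as in Liang et al.'s mixtures-of-g-priors framework, which the PEP prior is shown to fit into). As a minimal illustration of that building block, and not of the PEP prior itself, the sketch below computes the log Bayes factor of a model against the intercept-only null under a fixed g; the function name and example values are illustrative.

```python
import numpy as np

def g_prior_log_bayes_factor(n, p_gamma, r2_gamma, g):
    """Log Bayes factor of model M_gamma vs. the intercept-only null model
    under Zellner's g-prior with fixed g (the classical closed-form result):

        BF = (1 + g)^((n - 1 - p_gamma)/2) / (1 + g(1 - R^2))^((n - 1)/2)

    n        : sample size
    p_gamma  : number of predictors in M_gamma (excluding the intercept)
    r2_gamma : coefficient of determination R^2 of M_gamma
    g        : g-prior scale parameter (e.g. g = n for unit information)
    """
    return (0.5 * (n - 1 - p_gamma) * np.log1p(g)
            - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2_gamma)))

# Illustrative call: n = 50 observations, 3 predictors, R^2 = 0.6, g = n
lbf = g_prior_log_bayes_factor(n=50, p_gamma=3, r2_gamma=0.6, g=50)
```

Mixtures of g-priors, including the PEP prior per this paper's representation, replace the fixed g with a hyperprior and integrate this quantity over it, which is what keeps the resulting Bayes factors tractable.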

