Article

An objective Bayes factor with improper priors

Journal

Publisher

ELSEVIER
DOI: 10.1016/j.csda.2021.107404

Keywords

Bayes factor; Fisher distance; Fractional Bayes factor; Intrinsic Bayes factor; Objective Bayes


A new look at the use of improper priors in Bayes factors for model comparison is presented. As is well known, in such a case the Bayes factor is only defined up to an arbitrary constant. Most current methods overcome the problem by using part of the sample either to train the Bayes factor (Fractional Bayes Factor) or to transform the improper prior into a proper distribution (Intrinsic Bayes Factor), with the remainder of the sample used for the model comparison. An alternative approach is provided which relies on matching divergences between density functions so as to establish a value for the constant appearing in the Bayes factor. These are the Kullback-Leibler divergence and the Fisher information divergence, the latter being crucial as it does not depend on an unknown normalizing constant. The performance of the proposed method is demonstrated through numerous illustrations and comparisons, showing that its main advantage over existing methods is that it requires no input from the experimenter; it is fully automated. © 2021 Elsevier B.V. All rights reserved.
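The property the abstract highlights — that the Fisher information divergence does not depend on an unknown normalizing constant — can be seen in a small numerical sketch. The example below is a toy 1D Gaussian illustration, not the authors' actual calibration procedure: the divergence F(p‖q) = ∫ p(x) (d/dx log p(x) − d/dx log q(x))² dx is computed with finite-difference scores and a Riemann sum, and multiplying q by an arbitrary constant c leaves the result unchanged, because the constant drops out of the derivative of log q.

```python
import math

def score(log_f, x, h=1e-5):
    """d/dx log f(x) via a central finite difference."""
    return (log_f(x + h) - log_f(x - h)) / (2.0 * h)

def fisher_divergence(log_p, log_q, lo=-8.0, hi=8.0, dx=1e-3):
    """Riemann-sum approximation of F(p || q) on [lo, hi]."""
    total, x = 0.0, lo
    while x <= hi:
        diff = score(log_p, x) - score(log_q, x)
        total += math.exp(log_p(x)) * diff * diff * dx
        x += dx
    return total

# p is a standard normal density, written in log form.
log_p = lambda x: -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

# q ∝ c * exp(-(x - 1)^2 / 2): an *unnormalized* N(1, 1); c is arbitrary.
def make_log_q(c):
    return lambda x: math.log(c) - 0.5 * (x - 1.0) ** 2

fd1 = fisher_divergence(log_p, make_log_q(1.0))
fd2 = fisher_divergence(log_p, make_log_q(7.3))
# For p = N(0, 1) and q = N(1, 1) the exact value is 1; c has no effect,
# since d/dx log(c * q(x)) = d/dx log q(x).
print(fd1, fd2)
```

By contrast, the Kullback-Leibler divergence ∫ p log(p/q) does shift by log c when q is scaled, which is why the Fisher divergence is the one that can be evaluated directly from an improper (unnormalized) prior.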

