Journal
STATISTICAL SCIENCE
Volume 28, Issue 2, Pages 189-208
Publisher
INST MATHEMATICAL STATISTICS
DOI: 10.1214/12-STS406
Keywords
Approximate Bayesian computation; dimension reduction; likelihood-free inference; regularization; variable selection
Funding
- Australian Research Council [DP1092805]
- French National Research Agency [ANR-2010-JCJC-1607-01]
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three nonmutually exclusive classes consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
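The comparison between simulated and observed summary statistics described in the abstract can be illustrated with a minimal ABC rejection sampler. The sketch below is a generic illustration, not the authors' implementation: all function names, the toy normal-mean model, the flat prior, and the tolerance value are assumptions chosen for clarity. It keeps a parameter draw whenever the summary statistic of data simulated under that draw lies within a tolerance of the observed summary.

```python
import numpy as np

def abc_rejection(obs_summary, prior_sampler, simulator, summary, eps, n_draws, rng):
    """Basic ABC rejection sampling (illustrative sketch).

    Keeps parameter draws whose simulated summary statistic falls
    within distance eps of the observed summary statistic.
    """
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)            # draw from the prior
        sim_data = simulator(theta, rng)      # simulate a data set
        if abs(summary(sim_data) - obs_summary) < eps:
            accepted.append(theta)            # retain draws close to the data
    return np.array(accepted)

rng = np.random.default_rng(0)
# Toy example: infer the mean of a normal with known unit variance.
observed = rng.normal(loc=2.0, scale=1.0, size=100)
obs_summary = observed.mean()                 # low-dimensional summary of the data

posterior = abc_rejection(
    obs_summary=obs_summary,
    prior_sampler=lambda rng: rng.uniform(-5.0, 5.0),          # flat prior on the mean
    simulator=lambda theta, rng: rng.normal(theta, 1.0, size=100),
    summary=lambda x: x.mean(),
    eps=0.1,
    n_draws=20000,
    rng=rng,
)
print(len(posterior), posterior.mean())
```

In this toy model the sample mean is sufficient for the parameter, so nothing is lost by the reduction; the article's central question is how to construct such low-dimensional summaries when no sufficient statistic is available.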