Article

Are Gibbs-Type Priors the Most Natural Generalization of the Dirichlet Process?

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2013.217

Keywords

Bayesian nonparametrics; clustering; consistency; dependent process; discrete nonparametric prior; exchangeable partition probability function; Gibbs-type prior; Pitman-Yor process; mixture model; population genetics; predictive distribution; species sampling

Funding

  1. European Research Council (ERC) [StG N-BNP 306406]
  2. CONACYT [131179]

Abstract

Discrete random probability measures and the exchangeable random partitions they induce are key tools for addressing a variety of estimation and prediction problems in Bayesian inference. Here we focus on the family of Gibbs-type priors, a recent and elegant generalization of the Dirichlet and the Pitman-Yor process priors. These random probability measures share properties that are appealing from both a theoretical and an applied point of view: (i) they admit an intuitive predictive characterization justifying their use in terms of a precise assumption on the learning mechanism; (ii) they stand out in terms of mathematical tractability; (iii) they include several interesting special cases besides the Dirichlet and the Pitman-Yor processes. The goal of our paper is to provide a systematic and unified treatment of Gibbs-type priors and highlight their implications for Bayesian nonparametric inference. We deal with their distributional properties, the resulting estimators, frequentist asymptotic validation and the construction of time-dependent versions. Applications, mainly concerning mixture models and species sampling, serve to convey the main ideas. The intuition inherent to this class of priors and the neat results they lead to make one wonder whether it actually represents the most natural generalization of the Dirichlet process.
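As an illustration of the predictive characterization mentioned above, the following minimal Python sketch (not taken from the paper; the function name sample_partition and the parameter names sigma and theta are illustrative choices) simulates the sequential predictive rule of a Pitman-Yor process, the best-known Gibbs-type prior beyond the Dirichlet process: after n items, a new item joins an existing cluster of size n_c with probability proportional to n_c - sigma and opens a new cluster with probability proportional to theta + k*sigma, where k is the current number of occupied clusters. Setting sigma = 0 recovers the Dirichlet process (Chinese restaurant) scheme.

# Minimal sketch (assumption: not code from the paper) of the sequential
# predictive scheme of a Pitman-Yor process with discount 0 <= sigma < 1
# and strength theta > -sigma; sigma = 0 gives the Dirichlet process.
import random

def sample_partition(n, sigma=0.5, theta=1.0, rng=None):
    """Draw a random partition of {1, ..., n} from the Pitman-Yor
    (generalized Chinese restaurant) predictive rule:
      - join existing cluster c with probability (n_c - sigma) / (i + theta)
      - open a new cluster with probability (theta + k*sigma) / (i + theta),
    where i items have been seated so far and k clusters are occupied."""
    rng = rng or random.Random()
    cluster_sizes = []          # n_c for each occupied cluster
    labels = []                 # cluster label assigned to each item
    for i in range(n):          # i = number of items already seated
        k = len(cluster_sizes)
        # Unnormalized predictive weights: existing clusters, then a new one.
        weights = [n_c - sigma for n_c in cluster_sizes] + [theta + k * sigma]
        total = i + theta       # normalizing constant of the predictive rule
        u = rng.random() * total
        acc = 0.0
        for c, w in enumerate(weights):
            acc += w
            if u <= acc:
                break
        if c == k:              # last weight drawn: open a new cluster
            cluster_sizes.append(1)
        else:                   # join existing cluster c
            cluster_sizes[c] += 1
        labels.append(c)
    return labels, cluster_sizes

if __name__ == "__main__":
    labels, sizes = sample_partition(100, sigma=0.5, theta=1.0,
                                     rng=random.Random(42))
    print("number of clusters:", len(sizes))
    print("cluster sizes:", sorted(sizes, reverse=True))

For general Gibbs-type priors the same two-step structure holds, with the probability of opening a new cluster depending on the data only through the sample size n and the number of clusters k; the Pitman-Yor case above is simply the most tractable instance of that rule.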

