4.8 Review

Big Learning with Bayesian methods

Journal

National Science Review
Volume 4, Issue 4, Pages 627-651

Publisher

Oxford University Press
DOI: 10.1093/nsr/nwx044

Keywords

Big Bayesian Learning; Bayesian non-parametrics; regularized Bayesian inference; scalable algorithms

Funding

  1. National Basic Research Program of China [2013CB329403]
  2. National Natural Science Foundation of China [61620106010, 61621136008, 61332007]
  3. Youth Top-notch Talent Support Program

Abstract

The explosive growth in data volume and the availability of cheap computing resources have sparked increasing interest in Big learning, an emerging subfield that studies scalable machine learning algorithms, systems and applications with Big Data. Bayesian methods represent one important class of statistical methods for machine learning, with substantial recent developments in adaptive, flexible and scalable Bayesian learning. This article surveys recent advances in Big learning with Bayesian methods, termed Big Bayesian Learning, including non-parametric Bayesian methods for adaptively inferring model complexity, regularized Bayesian inference for improving flexibility via posterior regularization, and scalable algorithms and systems based on stochastic subsampling and distributed computing for dealing with large-scale applications. We also provide various new perspectives on large-scale Bayesian modeling and inference.
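To make the stochastic-subsampling idea concrete, the sketch below runs stochastic gradient Langevin dynamics (SGLD), one representative subsampling-based posterior sampler, on a synthetic Bayesian logistic-regression problem. The dataset, Gaussian prior scale, step-size schedule and batch size are illustrative assumptions for this sketch, not settings taken from the survey.

import numpy as np

# Minimal sketch: SGLD for Bayesian logistic regression on synthetic data.
# Mini-batch subsampling keeps the per-iteration cost independent of the
# full dataset size, which is the kind of scalability the survey discusses.

rng = np.random.default_rng(0)

# Synthetic dataset (illustrative assumption: 10,000 points, 5 features).
N, D = 10_000, 5
true_w = rng.normal(size=D)
X = rng.normal(size=(N, D))
y = (rng.random(N) < 1.0 / (1.0 + np.exp(-X @ true_w))).astype(float)

def grad_log_prior(w, tau=1.0):
    # Gaussian prior N(0, tau^2 I): gradient of log p(w).
    return -w / tau**2

def grad_log_lik_minibatch(w, idx):
    # Mini-batch log-likelihood gradient, rescaled by N/|batch| to give an
    # unbiased estimate of the full-data gradient.
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    return (N / len(idx)) * (Xb.T @ (yb - p))

w = np.zeros(D)
batch_size, n_iters = 100, 5000
samples = []
for t in range(n_iters):
    eps = 1e-5 / (1 + t) ** 0.55            # decaying step size (assumption)
    idx = rng.choice(N, size=batch_size, replace=False)
    grad = grad_log_prior(w) + grad_log_lik_minibatch(w, idx)
    noise = rng.normal(scale=np.sqrt(eps), size=D)
    w = w + 0.5 * eps * grad + noise         # SGLD update
    if t > n_iters // 2:                     # discard the first half as burn-in
        samples.append(w.copy())

print("true w        :", np.round(true_w, 2))
print("posterior mean:", np.round(np.mean(samples, axis=0), 2))

The rescaling by N/|batch| and the decaying step size are the two ingredients that make each cheap mini-batch update an approximately correct posterior-sampling step; distributed variants discussed in the survey parallelize this same computation across machines.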
