Article

Weighted Bayesian bootstrap for scalable posterior distributions

Publisher

WILEY
DOI: 10.1002/cjs.11570

Keywords

Deep learning; Markov chain Monte Carlo; regularization; trend filtering; weighted bootstrap

Abstract

We introduce and develop a weighted Bayesian bootstrap (WBB) for machine learning and statistics. WBB provides uncertainty quantification by sampling from a high dimensional posterior distribution. WBB is computationally fast and scalable using only off-the-shelf optimization software. First-order asymptotic analysis provides a theoretical justification under suitable regularity conditions on the statistical model. We illustrate the proposed methodology in regularized regression, trend filtering and deep learning and conclude with directions for future research.
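
To make the idea in the abstract concrete, below is a minimal sketch of a weighted Bayesian bootstrap for ridge regression. It assumes i.i.d. exponential(1) weights on each observation and on the penalty term and uses scikit-learn's off-the-shelf Ridge solver; the toy data, regularization strength, and weight distribution are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy regression data (illustrative assumption).
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.concatenate([np.ones(3), np.zeros(p - 3)])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

lam = 1.0        # assumed regularization strength
n_draws = 1000   # number of bootstrap posterior draws

samples = np.empty((n_draws, p))
for b in range(n_draws):
    # Draw random weights: one per observation and one for the penalty
    # (exponential(1) weights are an assumed, common choice).
    w_obs = rng.exponential(size=n)
    w_pen = rng.exponential()
    # Re-solve the weighted, regularized optimization with off-the-shelf software:
    #   minimize_beta  sum_i w_obs[i] * (y_i - x_i' beta)^2 + w_pen * lam * ||beta||^2
    fit = Ridge(alpha=w_pen * lam, fit_intercept=False)
    fit.fit(X, y, sample_weight=w_obs)
    samples[b] = fit.coef_

# Each row of `samples` is treated as an approximate posterior draw.
print("posterior mean:", samples.mean(axis=0).round(2))
print("95% interval for beta_1:", np.quantile(samples[:, 0], [0.025, 0.975]).round(2))
```

In this sketch each draw is an independent optimization, so the loop parallelizes trivially; that independence is what underlies the speed and scalability claimed in the abstract.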
