Article

GP-BART: A novel Bayesian additive regression trees approach using Gaussian processes

Journal

COMPUTATIONAL STATISTICS & DATA ANALYSIS
Volume 190, Article 107858

Publisher

ELSEVIER
DOI: 10.1016/j.csda.2023.107858

Keywords

Bayesian additive regression trees; Gaussian process; Probabilistic machine learning; Treed Gaussian process


The Bayesian additive regression trees (BART) model is a powerful ensemble method for regression tasks, but its lack of smoothness and of an explicit covariance structure over the observations can limit its performance. The Gaussian processes Bayesian additive regression trees (GP-BART) model addresses this limitation by incorporating Gaussian process priors, resulting in superior performance in various scenarios.
The Bayesian additive regression trees (BART) model is an ensemble method extensively and successfully used in regression tasks due to its consistently strong predictive performance and its ability to quantify uncertainty. BART combines weak tree models through a set of shrinkage priors, whereby each tree explains a small portion of the variability in the data. However, the lack of smoothness and the absence of an explicit covariance structure over the observations in standard BART can yield poor performance in cases where such assumptions would be necessary. The Gaussian processes Bayesian additive regression trees (GP-BART) model is an extension of BART which addresses this limitation by assuming Gaussian process (GP) priors for the predictions of each terminal node among all trees. The model's effectiveness is demonstrated through applications to simulated and real-world data, surpassing the performance of traditional modelling approaches in various scenarios.

© 2023 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
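Although the paper develops a fully Bayesian sampler for a sum-of-trees model, the central idea described in the abstract, replacing constant terminal-node predictions with Gaussian-process fits, can be illustrated with a minimal, non-Bayesian sketch. The example below is an assumption for illustration only (it uses scikit-learn, a single shallow tree, and maximum-likelihood GP fits rather than the authors' GP-BART implementation): it partitions the data with a regression tree and then fits a separate GP inside each terminal node, contrasting the smooth leaf-level GP predictions with the piecewise-constant tree predictions.

# Illustrative sketch only: a "treed GP" analogue of the GP-BART idea,
# not the authors' Bayesian backfitting algorithm.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# A shallow tree partitions the input space into a few terminal nodes.
tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
leaf_id = tree.apply(X)

# Within each terminal node, replace the constant prediction with a GP fit,
# giving smooth, covariance-aware predictions inside each partition.
gp_by_leaf = {}
for leaf in np.unique(leaf_id):
    mask = leaf_id == leaf
    gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
    gp_by_leaf[leaf] = gp.fit(X[mask], y[mask])

# Compare smooth leaf-level GP predictions with the piecewise-constant tree.
X_new = np.linspace(-3, 3, 50).reshape(-1, 1)
new_leaf = tree.apply(X_new)
gp_pred = np.array([gp_by_leaf[l].predict(x.reshape(1, -1))[0]
                    for l, x in zip(new_leaf, X_new)])
const_pred = tree.predict(X_new)
print("max |GP leaf - constant leaf| prediction gap:",
      np.abs(gp_pred - const_pred).max())

In GP-BART itself this construction is applied to every terminal node of every tree in the BART ensemble, with the GP hyperparameters and tree structures sampled under shrinkage priors rather than fitted by maximum likelihood as in this sketch.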
