Article

Regression conformal prediction with random forests

Journal

MACHINE LEARNING
Volume 97, Issue 1-2, Pages 155-176

Publisher

SPRINGER
DOI: 10.1007/s10994-014-5453-0

Keywords

Conformal prediction; Random forests; Regression

Funding

  1. Swedish Foundation for Strategic Research through the project High-Performance Data Mining for Drug Effect Detection [IIS11-0053]
  2. Knowledge Foundation through the project Big Data Analytics by Online Ensemble Learning [20120192]


Regression conformal prediction produces prediction intervals that are valid, i.e., the probability of excluding the correct target value is bounded by a predefined confidence level. The most important criterion when comparing conformal regressors is efficiency; the prediction intervals should be as tight (informative) as possible. In this study, the use of random forests as the underlying model for regression conformal prediction is investigated and compared to existing state-of-the-art techniques, which are based on neural networks and k-nearest neighbors. In addition to their robust predictive performance, random forests allow for determining the size of the prediction intervals by using out-of-bag estimates instead of requiring a separate calibration set. An extensive empirical investigation, using 33 publicly available data sets, was undertaken to compare the use of random forests to existing state-of-the-art conformal predictors. The results show that the suggested approach, at almost all confidence levels and using both standard and normalized nonconformity functions, produced significantly more efficient conformal predictors than the existing alternatives.
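The out-of-bag idea from the abstract can be sketched in a few lines: each training example is calibrated against the forest's out-of-bag prediction for it, so no separate calibration set is needed. The following is a minimal illustrative sketch with a standard (absolute-residual) nonconformity function, not the paper's exact implementation; the function name and all parameter choices are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

def oob_conformal_intervals(X_train, y_train, X_test, confidence=0.9):
    """Sketch of regression conformal prediction with a random forest,
    using out-of-bag residuals as calibration scores (hypothetical helper)."""
    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X_train, y_train)
    # Standard nonconformity score: absolute out-of-bag residual.
    alphas = np.abs(y_train - rf.oob_prediction_)
    # Interval half-width: the ceil((n+1)*confidence)-th smallest score.
    n = len(alphas)
    k = min(n - 1, int(np.ceil((n + 1) * confidence)) - 1)
    half_width = np.sort(alphas)[k]
    preds = rf.predict(X_test)
    return preds - half_width, preds + half_width

# Toy usage on synthetic data; real evaluation would use the 33 benchmark sets.
X, y = make_regression(n_samples=400, n_features=8, noise=5.0, random_state=1)
lo, hi = oob_conformal_intervals(X[:300], y[:300], X[300:], confidence=0.9)
coverage = np.mean((y[300:] >= lo) & (y[300:] <= hi))
```

A normalized nonconformity function would additionally divide each residual by a per-example difficulty estimate, yielding intervals whose width varies across test instances rather than the constant half-width above.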
