Article

Near-optimal Nonlinear Regression Trees

Journal

OPERATIONS RESEARCH LETTERS
Volume 49, Issue 2, Pages 201-206

Publisher

ELSEVIER
DOI: 10.1016/j.orl.2021.01.002

Keywords

Decision trees; Regression; Nonlinear optimization


We propose Near-optimal Nonlinear Regression Trees with hyperplane splits (NNRTs) that use a polynomial prediction function in the leaf nodes, which we solve by stochastic gradient methods. On synthetic data, we show experimentally that the algorithm converges to the global optimum. We compare NNRTs, ORT-LH, Multivariate Adaptive Regression Splines (MARS), Random Forests (RF) and XGBoost on 40 real-world datasets and show that overall NNRTs have a performance edge over all other methods. (C) 2021 Published by Elsevier B.V.
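The model described in the abstract (a regression tree with hyperplane splits and polynomial leaf predictors, trained by gradient methods) can be sketched as follows. This is a minimal, illustrative depth-1 version with degree-2 leaf polynomials, and it assumes a sigmoid-smoothed split so the whole model is differentiable; the names (`fit`, `predict`, `tau`), the smoothing trick, and the full-batch gradient loop are all assumptions for illustration, not the paper's actual NNRT formulation or optimization scheme.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def poly_feats(X):
    # Degree-2 polynomial features per coordinate: [1, x, x^2].
    return np.hstack([np.ones((len(X), 1)), X, X ** 2])

def predict(params, X, tau=5.0):
    # Depth-1 tree: one hyperplane split w.x + b routing each point to
    # one of two leaves, each holding its own polynomial predictor.
    w, b, cl, cr = params
    gate = sigmoid(tau * (X @ w + b))   # soft probability of the right leaf
    F = poly_feats(X)
    return (1 - gate) * (F @ cl) + gate * (F @ cr)

def fit(X, y, tau=5.0, lr=0.05, epochs=1000, seed=0):
    # Train split and leaf parameters jointly by gradient descent on MSE.
    # (Illustrative: the paper uses stochastic gradient methods on the
    # full NNRT model, not this exact full-batch loop.)
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    k = 1 + 2 * d
    w = rng.normal(size=d)
    b = 0.0
    cl = rng.normal(scale=0.1, size=k)
    cr = rng.normal(scale=0.1, size=k)
    n = len(X)
    F = poly_feats(X)
    for _ in range(epochs):
        g = sigmoid(tau * (X @ w + b))
        yl, yr = F @ cl, F @ cr
        pred = (1 - g) * yl + g * yr
        dpred = 2 * (pred - y) / n                  # dMSE/dpred
        dg = dpred * (yr - yl) * tau * g * (1 - g)  # chain rule through gate
        w -= lr * (X.T @ dg)
        b -= lr * dg.sum()
        cl -= lr * (F.T @ (dpred * (1 - g)))
        cr -= lr * (F.T @ (dpred * g))
    return (w, b, cl, cr)
```

On piecewise data this toy version learns both the hyperplane and the leaf polynomials end-to-end, which is the property that distinguishes the approach from greedy axis-aligned trees.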

