4.6 Article

Simplex basis function based sparse least squares support vector regression

Journal

NEUROCOMPUTING
Volume 330, Pages 394-402

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2018.11.025

Keywords

Least squares support vector regression; Low rank kernels; Simplex basis function

Abstract

In this paper, a novel sparse least squares support vector regression algorithm, referred to as LSSVR-SBF, is introduced. It uses a new low-rank kernel constructed from simplex basis functions, each governed by a set of nonlinear parameters. It is shown that the proposed model can be represented as a sparse linear regression model based on simplex basis functions. We propose a fast algorithm for the least squares support vector regression solution at O(N) cost, avoiding direct inversion of the kernel matrix. An iterative estimation algorithm is proposed to optimize the nonlinear parameters associated with the simplex basis functions, minimizing the model mean square error via gradient descent. The proposed fast least squares solution and the gradient descent updates are applied alternately. Finally, it is shown that the model has a dual representation as a piecewise linear model with respect to the system input. Numerical experiments are carried out to demonstrate the effectiveness of the proposed approaches. (C) 2018 Published by Elsevier B.V.
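The abstract describes an alternating scheme: a least squares solve for the output weights whose cost grows linearly in N (possible because a low-rank kernel lets the problem be solved in the M-dimensional basis-function space rather than through the N x N kernel matrix), interleaved with gradient-descent updates of the nonlinear simplex-basis-function parameters. The sketch below illustrates only that overall structure; the specific basis form phi_j(x) = max(0, 1 - ||x - c_j||_1 / w_j), the ridge-style regularisation, the initialisation, and all names are assumptions made for illustration and are not taken from the paper.

import numpy as np

def sbf_design_matrix(X, centres, widths):
    # Assumed simplex-style basis: Phi[n, j] = max(0, 1 - ||x_n - c_j||_1 / w_j).
    dist = np.abs(X[:, None, :] - centres[None, :, :]).sum(axis=2)   # (N, M) L1 distances
    return np.maximum(0.0, 1.0 - dist / widths[None, :])

def fit_weights(Phi, y, lam=1e-3):
    # Regularised least squares in the M-dimensional feature space.
    # Cost is O(N * M^2), linear in N: the N x N kernel matrix is never formed or inverted.
    N, M = Phi.shape
    A = np.hstack([Phi, np.ones((N, 1))])                 # bias column appended
    coef = np.linalg.solve(A.T @ A + lam * np.eye(M + 1), A.T @ y)
    return coef[:M], coef[M]                              # (beta, bias)

def gradient_step(X, y, centres, widths, beta, bias, lr=1e-2):
    # One gradient-descent step on the training MSE w.r.t. the nonlinear SBF parameters.
    Phi = sbf_design_matrix(X, centres, widths)
    resid = y - (Phi @ beta + bias)                       # (N,)
    active = Phi > 0.0                                    # hinge is flat where inactive
    diff = X[:, None, :] - centres[None, :, :]            # (N, M, d)
    dphi_dc = np.sign(diff) / widths[None, :, None] * active[:, :, None]
    dphi_dw = np.abs(diff).sum(axis=2) / widths[None, :] ** 2 * active
    g_c = -2.0 / len(y) * np.einsum('n,j,nji->ji', resid, beta, dphi_dc)
    g_w = -2.0 / len(y) * np.einsum('n,j,nj->j', resid, beta, dphi_dw)
    return centres - lr * g_c, np.maximum(widths - lr * g_w, 1e-3)

# Alternating optimisation on a toy problem: fast LS solve for (beta, bias),
# then a gradient step on the centres/widths, repeated for a fixed budget.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1] + 0.05 * rng.standard_normal(200)
centres = X[rng.choice(len(X), size=15, replace=False)]   # illustrative initialisation
widths = np.full(15, 1.0)
for _ in range(100):
    beta, bias = fit_weights(sbf_design_matrix(X, centres, widths), y)
    centres, widths = gradient_step(X, y, centres, widths, beta, bias)
print("training MSE:", np.mean((y - (sbf_design_matrix(X, centres, widths) @ beta + bias)) ** 2))

Because each hinge-type basis function is piecewise linear in x, the fitted model in this sketch is also piecewise linear in the input, which is consistent with the dual representation mentioned in the abstract.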
