Journal
PATTERN RECOGNITION LETTERS
Volume 34, Issue 12, Pages 1394-1404
Publisher
ELSEVIER
DOI: 10.1016/j.patrec.2013.04.023
Keywords
Support vector machines; Online algorithms; Kernel methods; Regression problem; Orthogonal regression
Abstract
In this paper, we introduce a new online algorithm for orthogonal regression. The method is constructed via a stochastic gradient descent approach combined with a tube loss function similar to the one used in support vector (SV) regression. The algorithm can be formulated in primal or in dual variables; the dual formulation allows the introduction of kernels and soft margins. In addition, an incremental strategy is introduced, which can be used to find sparse solutions as well as an approximation to the minimal tube containing the data. The algorithm is very simple to implement and avoids quadratic optimization. (c) 2013 Published by Elsevier B.V.
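The abstract describes stochastic gradient descent on a tube (ε-insensitive) loss applied to orthogonal distances. The following is a minimal illustrative sketch of that idea in primal variables, assuming a hyperplane w·x + b = 0 with ‖w‖ = 1 and a simple sign-gradient step; the function name, parameters (`eps`, `lr`, `epochs`), and the exact update rule are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def sgd_orthogonal_tube(X, eps=0.1, lr=0.01, epochs=100, seed=0):
    """SGD sketch for orthogonal regression with an eps-insensitive tube loss.

    Fits a hyperplane w.x + b = 0 with ||w|| = 1; a point contributes loss
    only when its orthogonal distance |w.x + b| exceeds eps.
    (Illustrative update rule, not the paper's exact method.)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = rng.normal(size=d)
    w /= np.linalg.norm(w)            # enforce the unit-norm constraint
    b = -w @ X.mean(axis=0)           # start with the hyperplane through the centroid
    for _ in range(epochs):
        for i in rng.permutation(n):  # visit points in random order
            r = w @ X[i] + b          # signed orthogonal distance (since ||w|| = 1)
            if abs(r) > eps:          # point outside the tube: take a gradient step
                g = np.sign(r)
                w -= lr * g * X[i]
                b -= lr * g
                w /= np.linalg.norm(w)  # project back onto the unit sphere
    return w, b
```

Because each update touches a single point and the projection is a cheap renormalization, the loop avoids quadratic optimization entirely, matching the simplicity the abstract emphasizes; a kernelized variant would replace the primal update with coefficient updates in the dual.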