Journal: INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 10, Issue 9, Pages 2573-2588
Publisher: SPRINGER HEIDELBERG
DOI: 10.1007/s13042-018-0892-8
Keywords: Twin support vector regression; l(1)-norm loss; Geometric interpretation; Nearest points
Abstract
This paper proposes a novel l(1)-norm loss based twin support vector regression (l(1)-TSVR) model. The bound functions in this l(1)-TSVR are optimized by simultaneously minimizing an l(1)-norm based fitting loss and a one-side epsilon-insensitive loss, which leads to dual problems different from those of twin support vector regression (TSVR) and epsilon-TSVR. The main advantages of this l(1)-TSVR are: First, it does not need to invert any kernel matrix in the dual problems, so it not only can be optimized efficiently but also yields partly sparse bound functions. Second, it has a clear and practical geometric interpretation. In the spirit of this geometric interpretation, the paper further presents a nearest-points based l(1)-TSVR (NP-l(1)-TSVR), in which the bound functions are constructed by finding the nearest points between the reduced convex/affine hulls of the training data and its shifted sets, respectively. Computational results on a number of synthetic and real-world benchmark datasets demonstrate the merits of the proposed l(1)-TSVR and NP-l(1)-TSVR, which achieve generalization performance comparable to that of other SVR-type algorithms.
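To make the loss structure concrete, the following is a minimal NumPy sketch of a down-bound objective of the kind the abstract describes: an l(1)-norm fitting term plus a one-side epsilon-insensitive penalty. The function name, the parameter names `C` and `eps`, and the exact combination of terms are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def l1_tsvr_down_objective(w, b, X, y, C=1.0, eps=0.1):
    """Illustrative sketch (not the paper's exact model) of a down-bound
    objective combining an l1-norm fitting loss with a one-side
    epsilon-insensitive loss, as described in the abstract."""
    f = X @ w + b                          # down-bound function values
    fit = np.abs(y - f).sum()              # l1-norm fitting loss
    # one-side eps-insensitive loss: penalize only the points where the
    # down-bound rises above y - eps (violations on one side only)
    one_side = np.maximum(0.0, f - (y - eps)).sum()
    return fit + C * one_side
```

Because both terms are piecewise linear in `(w, b)`, an objective of this shape can be minimized by linear programming, which is consistent with the abstract's claim that no kernel-matrix inversion is required in the duals.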