4.6 Article

Non-parallel support vector classifiers with different loss functions

Journal

NEUROCOMPUTING
Volume 143, Pages 294-301

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2014.05.063

Keywords

Non-parallel classifiers; Least squares loss; Pinball loss; Hinge loss; Kernel trick

Funding

  1. European Research Council under the European Union's Seventh Framework Programme (FP7) / ERC AdG A-DATADRIVE-B
  2. Research Council KUL [GOA/10/09 MaNet, CoE PFV/10/002, BIL12/11T]
  3. PhD/Postdoc grants Flemish Government
  4. FWO [G.0377.12, G.088114N]
  5. IWT [100031]
  6. Belgian Federal Science Policy Office [IUAP P7/19]


This paper introduces a general framework for non-parallel support vector machines, which comprises a regularization term, a scatter loss and a misclassification loss. For binary problems, the framework with suitable losses covers several existing non-parallel classifiers, such as the multisurface proximal support vector machine via generalized eigenvalues, twin support vector machines, and their least squares version. The possibility of incorporating different existing scatter and misclassification loss functions into the general framework is discussed. Moreover, in contrast with the aforementioned methods, which rely on a kernel-generated surface, we apply the kernel trick directly in the dual and thereby obtain nonparametric models. Consequently, one does not need to formulate two different primal problems for the linear and the nonlinear kernel, respectively. In addition, experimental results are given to illustrate the performance of the different loss functions. (C) 2014 Elsevier B.V. All rights reserved.
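To make the non-parallel idea concrete, the following is a minimal sketch of the linear least-squares twin SVM that the abstract cites as a special case of the framework: two non-parallel hyperplanes are fit in closed form, each close to one class (scatter term) and pushed away from the other (misclassification term). The function names, the ridge term `reg` added for numerical invertibility, and the toy data are illustrative assumptions, not the authors' implementation; the paper's actual contribution works in the dual with the kernel trick, which this linear sketch omits.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Linear least-squares twin SVM sketch: fit two non-parallel
    hyperplanes, one near each class and far from the other.
    A holds (+1)-class samples, B holds (-1)-class samples (rows)."""
    eA = np.ones((A.shape[0], 1))
    eB = np.ones((B.shape[0], 1))
    E = np.hstack([A, eA])           # augmented (+1) data [A  e]
    F = np.hstack([B, eB])           # augmented (-1) data [B  e]
    I = reg * np.eye(E.shape[1])     # small ridge (assumption) for invertibility
    # Plane 1: min (1/2)||E z1||^2 + (c1/2)||F z1 + e||^2  -> closed form
    z1 = -np.linalg.solve(E.T @ E + c1 * F.T @ F + I, c1 * F.T @ eB)
    # Plane 2: min (1/2)||F z2||^2 + (c2/2)||E z2 - e||^2  -> closed form
    z2 = np.linalg.solve(F.T @ F + c2 * E.T @ E + I, c2 * E.T @ eA)
    return z1.ravel(), z2.ravel()

def lstsvm_predict(X, z1, z2):
    """Assign each row of X to the class whose hyperplane is nearer
    (perpendicular distance, normalised by the weight norm)."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xa @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xa @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, -1)

# toy usage: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
A = rng.normal([0, 0], 0.3, size=(40, 2))   # class +1
B = rng.normal([3, 3], 0.3, size=(40, 2))   # class -1
z1, z2 = lstsvm_fit(A, B)
pred = lstsvm_predict(np.vstack([A, B]), z1, z2)
```

The closed forms follow from setting the gradients of the two quadratic objectives to zero; swapping the squared (least-squares) penalty for a hinge or pinball penalty recovers the other instances of the framework, at the cost of losing the closed-form solution.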


