Article

An iterative algorithm learning the maximal margin classifier

Journal

PATTERN RECOGNITION
Volume 36, Issue 9, Pages 1985-1996

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/S0031-3203(03)00060-8

Keywords

pattern recognition; linear classifier; supervised learning; support vector machines; kernel functions

Abstract

A simple learning algorithm for maximal margin classifiers (equivalently, support vector machines with a quadratic cost function) is proposed. We build our iterative algorithm on top of the Schlesinger-Kozinec algorithm (S-K algorithm) from 1981, which finds a maximal margin hyperplane with a given precision for separable data. We suggest a generalization of the S-K algorithm (i) to the non-linear case using kernel functions and (ii) to non-separable data. The memory requirement is linear in the size of the training data, which allows the proposed algorithm to be used for large training problems. The resulting algorithm is simple to implement and, as the experiments show, competitive with state-of-the-art algorithms. A Matlab implementation of the algorithm is available. We tested the algorithm on the problem of recognizing poor-quality numerals. (C) 2003 Pattern Recognition Society. Published by Elsevier Science Ltd. All rights reserved.
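
For orientation, the sketch below illustrates the geometric idea behind the S-K iteration in the linear, separable case: one point inside each class convex hull is moved towards the other class until the hyperplane bisecting their connecting segment reaches the requested precision. It is a minimal, hypothetical rendering, not the paper's implementation; the function name, the eps-based stopping rule, and the argmin point selection are illustrative assumptions. The paper's kernel variant keeps the two hull points as convex combinations of training examples and replaces every dot product below with kernel evaluations, and the non-separable case is handled through the quadratic cost.

```python
# Hypothetical sketch of the linear Schlesinger-Kozinec (S-K) iteration:
# find the closest pair of points of the two class convex hulls; the maximal
# margin hyperplane bisects the segment between them.
import numpy as np

def sk_train(X_pos, X_neg, eps=1e-3, max_iter=10_000):
    """Return (w, b) of an eps-precise maximal margin hyperplane (separable data)."""
    w_pos, w_neg = X_pos[0].copy(), X_neg[0].copy()   # points inside the two hulls

    for _ in range(max_iter):
        w = w_pos - w_neg
        b = 0.5 * (np.dot(w_neg, w_neg) - np.dot(w_pos, w_pos))
        norm_w = np.linalg.norm(w)

        # signed margins of all points w.r.t. the current bisecting hyperplane
        m_pos = (X_pos @ w + b) / norm_w
        m_neg = -(X_neg @ w + b) / norm_w
        margin = min(m_pos.min(), m_neg.min())

        # eps-optimality: achieved margin is close to the upper bound |w|/2
        if norm_w / 2 - margin <= eps:
            break

        # move the hull point of the worse class towards its worst-margin
        # example, staying inside the hull (line-search step clipped to [0, 1])
        if m_pos.min() <= m_neg.min():
            x = X_pos[np.argmin(m_pos)]
            d = w_pos - x
            q = np.clip(np.dot(w, d) / np.dot(d, d), 0.0, 1.0)
            w_pos = (1 - q) * w_pos + q * x
        else:
            x = X_neg[np.argmin(m_neg)]
            d = x - w_neg
            q = np.clip(np.dot(w, d) / np.dot(d, d), 0.0, 1.0)
            w_neg = (1 - q) * w_neg + q * x

    w = w_pos - w_neg
    b = 0.5 * (np.dot(w_neg, w_neg) - np.dot(w_pos, w_pos))
    return w, b
```

A trained classifier would then predict sign(w·x + b); in the kernelized version the same expression is evaluated through the stored convex-combination coefficients, which is what keeps the memory requirement linear in the number of training examples.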
