4.1 Article

Posterior probability support vector machines for unbalanced data

Journal

IEEE Transactions on Neural Networks
Volume 16, Issue 6, Pages 1561-1573

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TNN.2005.857955

Keywords

Bayesian decision theory; classification; margin; maximal margin algorithms; ν-SVM; posterior probability; support vector machines (SVMs); unbalanced data

Abstract

This paper proposes a complete framework of posterior probability support vector machines (PPSVMs) for weighted training samples, using modified concepts of risk, linear separability, margin, and optimal hyperplane. Within this framework, a new optimization problem for unbalanced classification problems is formulated and a new concept of support vectors is established. Furthermore, a soft PPSVM with an interpretable parameter ν is obtained, similar to the ν-SVM developed by Schölkopf et al., and an empirical method for determining the posterior probability is proposed as a new approach to determining ν. The main advantage of a PPSVM classifier lies in the fact that it is closer to the Bayes optimal decision rule without requiring knowledge of the underlying distributions. To validate the proposed method, two synthetic classification examples are used to illustrate the logical correctness of PPSVMs and their relationship to regular SVMs and Bayesian methods. Several further classification experiments demonstrate that PPSVMs outperform regular SVMs in some cases. Compared with fuzzy support vector machines (FSVMs), the proposed PPSVM is a natural and analytical extension of regular SVMs based on statistical learning theory.
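The abstract does not reproduce the formulation itself, but the core idea, down-weighting each training sample's influence by an empirical estimate of its class posterior, can be loosely approximated with off-the-shelf tools. The sketch below is an illustration under stated assumptions, not the authors' PPSVM: it uses scikit-learn's standard C-parameterized soft-margin SVM rather than the paper's ν-parameterized soft PPSVM, the k-nearest-neighbor posterior estimate is a hypothetical stand-in for the paper's empirical method, and the toy unbalanced data are invented for demonstration.

# Hypothetical sketch: weight each sample's slack penalty by an empirical
# posterior estimate, then fit an ordinary soft-margin SVM with those
# weights. This only approximates the PPSVM idea with standard tools.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def empirical_posterior(X, y, k=10):
    # Estimate P(y_i | x_i) as the fraction of the k nearest neighbors
    # of x_i that share its label (a common empirical proxy, assumed here).
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)            # idx[:, 0] is the point itself
    neighbor_labels = y[idx[:, 1:]]      # drop the self-match
    return (neighbor_labels == y[:, None]).mean(axis=1)

# Toy unbalanced two-class data (invented for illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(2.0, 1.0, size=(20, 2))])
y = np.array([0] * 200 + [1] * 20)

# Low-confidence (likely mislabeled or overlapping) points get smaller
# weights and therefore influence the decision boundary less.
weights = empirical_posterior(X, y, k=10)
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y, sample_weight=weights)
print("training accuracy:", clf.score(X, y))

A per-sample weight on the slack term is also how fuzzy SVMs are commonly implemented; the paper's contribution, per the abstract, is deriving such weighting analytically from posterior probabilities within statistical learning theory rather than assigning fuzzy memberships heuristically.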

