Article

Cost-sensitive support vector machines

Journal

NEUROCOMPUTING
Volume 343, Pages 50-64

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2018.11.099

Keywords

Cost-sensitive learning; Classification; Class imbalance; SVM; Bayes consistency


Many machine learning applications involve imbalanced class prior probabilities, multi-class classification with many classes (often addressed by a one-versus-rest strategy), or cost-sensitive classification. In such domains, each class (or in some cases, each sample) requires special treatment. In this paper, we use a constructive procedure to extend SVM's standard loss function to optimize the classifier with respect to class imbalance or class costs. By drawing connections between risk minimization and probability elicitation, we show that the resulting classifier guarantees Bayes consistency. We further analyze the primal and the dual objective functions and derive the objective function in a regularized risk minimization framework. Finally, we extend the classifier to cost-sensitive learning with example-dependent costs. We perform experimental analysis on class imbalance and on cost-sensitive learning with given class and example costs, and show that the proposed algorithm provides superior generalization performance compared to conventional methods. (C) 2019 Published by Elsevier B.V.
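To make the setting concrete, the sketch below trains an SVM with per-class misclassification costs on imbalanced data using scikit-learn's `class_weight` parameter. This is the conventional weighted-hinge-loss baseline the paper compares against, not the paper's Bayes-consistent loss construction; the data, costs, and weights are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Assumed imbalanced two-class data: 200 negatives, 20 positives.
X_neg = rng.normal(loc=-1.0, size=(200, 2))
X_pos = rng.normal(loc=1.0, size=(20, 2))
X = np.vstack([X_neg, X_pos])
y = np.array([0] * 200 + [1] * 20)

# Assumed costs: a false negative is 10x as costly as a false
# positive, so the rare positive class gets weight 10 in the hinge loss.
clf = SVC(kernel="linear", C=1.0, class_weight={0: 1.0, 1: 10.0})
clf.fit(X, y)

# Weighting shifts the decision boundary toward the majority class,
# so more of the rare positives are recovered.
recall_pos = clf.score(X_pos, np.ones(20, dtype=int))
```

In the paper's framework the per-class (or per-example) costs enter the loss itself rather than as sample weights, but the effect illustrated here is the same in spirit: the boundary moves to reduce expected cost rather than raw error rate.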

