Journal
NEUROCOMPUTING
Volume 331, Pages 40-49
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2018.11.024
Keywords
Robustness; Imbalanced classification; Bayes optimal classifier; Fisher consistency; Half quadratic optimization
Funding
- National Natural Science Foundation of China [11471010, 11271367]
- Chinese Universities Scientific Fund
Based on minimizing misclassification cost, this paper designs a new robust loss function for the imbalanced classification problem in noisy environments. The loss is nonconvex yet maintains Fisher consistency. Embedding the proposed loss in the support vector machine (SVM) yields a robust SVM framework whose solution is a Bayes optimal classifier. Because nonconvexity makes the model difficult to optimize, we develop an alternating iterative algorithm to solve it. Moreover, we analyze the robustness of the proposed model theoretically from a re-weighted SVM viewpoint, and the obtained optimal solution is consistent with the Bayes optimal decision rule. Furthermore, numerical experiments are carried out on data sets drawn from the UCI Machine Learning Repository and on a practical application. Under two types of noise, label noise and feature noise, the results show that the proposed method achieves better generalization than other SVM methods. (C) 2018 Elsevier B.V. All rights reserved.
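The re-weighted SVM viewpoint mentioned in the abstract can be illustrated with a minimal sketch: alternate between fitting an SVM under sample weights and re-computing weights that shrink for samples with large loss, which is the general shape of a half-quadratic scheme. The paper's actual loss function is not given in the abstract, so the truncated-hinge down-weighting rule below (and the `reweighted_svm` helper, threshold `trunc`, and iteration count) are assumptions standing in for it, not the authors' method.

```python
# Hedged sketch of an iteratively re-weighted SVM (half-quadratic style).
# The specific robust loss from the paper is unknown here; a generic
# truncated-hinge down-weighting rule is ASSUMED in its place.
import numpy as np
from sklearn.svm import LinearSVC


def reweighted_svm(X, y, n_iters=5, trunc=2.0, C=1.0):
    """Alternate between (1) fitting a weighted linear SVM and
    (2) re-computing sample weights that decay toward 0 for samples
    with large hinge loss (likely noisy points)."""
    w = np.ones(len(y))
    clf = LinearSVC(C=C, max_iter=5000)
    for _ in range(n_iters):
        clf.fit(X, y, sample_weight=w)
        margin = y * clf.decision_function(X)   # signed functional margin
        hinge = np.maximum(0.0, 1.0 - margin)   # per-sample hinge loss
        # Down-weight samples whose hinge loss exceeds the truncation
        # threshold; they are treated as likely label/feature noise.
        w = np.where(hinge <= trunc, 1.0, trunc / (hinge + 1e-12))
    return clf, w


# Toy usage: two well-separated clusters with 5% injected label noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) + np.repeat([[2.0, 2.0], [-2.0, -2.0]], 100, axis=0)
y = np.repeat([1.0, -1.0], 100)
flip = rng.choice(200, size=10, replace=False)
y_noisy = y.copy()
y_noisy[flip] *= -1
clf, w = reweighted_svm(X, y_noisy)
```

The design point the sketch mirrors is that each re-weighting step is a closed-form update while each fitting step is an ordinary convex weighted SVM, so the nonconvex objective is handled as a sequence of convex subproblems.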
Authors