Journal
2017 16TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA)
Pages 908-913
Publisher
IEEE
DOI: 10.1109/ICMLA.2017.00-39
Most standard classification algorithms perform poorly on imbalanced classes, i.e., when an overwhelming majority of the samples belong to a single class. Many approaches address this problem, among them SMOTE and SMOTEBoost, but the common ones prefer overly simplistic models, which degrades performance. Recent advances in statistical learning theory provide more adequate complexity penalties for weak classifiers, stemming from the Rademacher complexity terms in ensemble generalization bounds. By adopting these advances and introducing a sample-weight correction based on the classification margin at each boosting iteration, we obtain more precise models for imbalanced classification problems.
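To make the idea concrete, below is a minimal sketch of a boosting loop with a margin-based sample-weight correction for the minority class. This is an illustrative approximation, not the paper's exact rule: the function `margin_corrected_boost`, the correction factor `gamma`, and the choice to upweight only misclassified minority samples are all assumptions introduced here for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def margin_corrected_boost(X, y, n_rounds=20, gamma=0.5):
    """AdaBoost-style loop (labels in {-1, +1}) with a hypothetical
    margin-based weight correction favouring the minority class."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights
    F = np.zeros(n)                  # running ensemble score
    minority = (y == 1)              # assumption: +1 is the minority class
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * pred
        # standard AdaBoost reweighting
        w *= np.exp(-alpha * y * pred)
        # hypothetical correction: upweight minority samples whose
        # current classification margin y * F(x) is still negative
        margin = y * F
        w[minority & (margin < 0)] *= (1 + gamma)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Sign of the weighted vote of the weak learners."""
    score = sum(a * l.predict(X) for l, a in zip(learners, alphas))
    return np.sign(score)
```

The correction step is the only departure from plain AdaBoost: minority samples that the current ensemble still misclassifies receive an extra multiplicative bump, so subsequent weak learners pay more attention to them than the standard exponential reweighting alone would enforce.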