Proceedings Paper

Deep Ensembles for Imbalanced Classification


Most standard classification algorithms perform poorly on imbalanced classes, i.e., when the overwhelming majority of samples belong to a single class. Many approaches address this problem, among them SMOTE and SMOTEBoost; however, the common approach favors overly simplistic models, which degrades performance. Recent advances in statistical learning theory provide more adequate complexity penalties for weak classifiers, stemming from the Rademacher complexity terms in ensemble generalization bounds. By adopting these advances and introducing a sample-weight correction based on the classification margin at each boosting iteration, we obtain more precise models for imbalanced classification problems.
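The abstract does not spell out the weight-correction rule, so the following is only a minimal illustrative sketch of the general idea: an AdaBoost-style loop in which, after the standard exponential reweighting, sample weights are further adjusted by the current ensemble margin. The function names, the `correction_strength` parameter, and the specific correction factor are assumptions for illustration, not the authors' published method.

```python
# Sketch of margin-aware boosting for imbalanced binary data (assumed form).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def margin_boost(X, y, n_rounds=50, correction_strength=0.5):
    """y is expected in {-1, +1}; returns the weak learners and their weights."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # sample weights
    learners, alphas = [], []
    f = np.zeros(n)                  # running ensemble score F(x_i)

    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)

        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)

        learners.append(stump)
        alphas.append(alpha)
        f += alpha * pred

        # Standard AdaBoost exponential reweighting ...
        w *= np.exp(-alpha * y * pred)
        # ... plus an assumed margin-based correction: samples with a small or
        # negative margin y_i * F(x_i) receive extra weight, which in an
        # imbalanced setting tends to emphasize hard minority-class samples.
        margin = y * f
        w *= np.exp(-correction_strength * margin)
        w /= w.sum()

    return learners, np.array(alphas)

def predict(learners, alphas, X):
    scores = sum(a * clf.predict(X) for a, clf in zip(alphas, learners))
    return np.sign(scores)
```

Setting `correction_strength = 0` recovers plain AdaBoost, so the margin correction can be read as a tunable deviation from the standard weight update.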
