Article

Slack-Factor-Based Fuzzy Support Vector Machine for Class Imbalance Problems

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/3579050

Keywords

Cost-sensitive learning; class imbalance; fuzzy support vector machine; decision hyperplane; fuzzy membership function


Class imbalance and noisy data pose challenges for constructing good classifiers using SVM. Fuzzy SVMs (FSVMs) address these issues by using fuzzy membership functions and cost-sensitive learning. However, the accuracy of FSVMs is affected by class imbalance. To overcome this, we propose SFFSVM, which incorporates a new fuzzy membership function and adjusts the importance of samples based on the relationship between estimated and optimal hyperplanes. Experimental results show that SFFSVM outperforms other methods on F1, MCC, and AUC-PR metrics.
Class imbalance and noisy data are widespread in real-world problems, and the support vector machine (SVM) struggles to construct good classifiers on such data. Fuzzy SVMs (FSVMs), as variants of SVM, use a fuzzy membership function to reflect each sample's importance and reduce the impact of noise, and employ cost-sensitive learning to address class imbalance. They handle noise and class imbalance in many cases; however, the fuzzy membership function is often distorted by imbalanced data, yielding inaccurate estimates of sample importance and degrading the performance of FSVMs. To solve this problem, we design a new fuzzy membership function and combine it with cost-sensitive learning to handle class imbalance with noisy data, yielding the Slack-Factor-based FSVM (SFFSVM). In SFFSVM, the relative distances between samples and an estimated hyperplane, called slack factors, define the fuzzy membership function. To eliminate the impact of class imbalance on this function and obtain more accurate sample importance, we rectify the importance according to the positional relationship between the estimated hyperplane and the optimal hyperplane of the problem, together with the samples' slack factors. Comprehensive experiments on artificial and real-world datasets demonstrate that SFFSVM outperforms comparative methods on the F1, MCC, and AUC-PR metrics.
