Article

Fuzzy support vector machine based on within-class scatter for classification problems with outliers or noises

Journal

NEUROCOMPUTING
Volume 110, Pages 101-110

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2012.11.023

Keywords

Fuzzy support vector machine; Fuzzy membership; Maximal margin; Within-class scatter

Funding

  1. Ministry of Science and Technology of China (863 program) [2007AA01Z203]
  2. National Basic Research Program of China (973 program) [2007CB307101]
  3. Beijing Jiaotong University [2006XZ002]
  4. Fundamental Research Funds for the Central Universities [2009JBM021]

Abstract

Support vector machine (SVM) is a popular machine learning technique that has been widely applied in many real-world applications. Since SVM is sensitive to outliers or noises in the dataset, the Fuzzy SVM (FSVM) has been proposed. Like SVM, it still aims at finding an optimal hyperplane that separates two classes with the maximal margin; the difference is that a fuzzy membership is assigned to each training point according to its importance, which makes FSVM less sensitive to outliers or noises to some extent. However, FSVM ignores an important piece of prior knowledge, the within-class structure. In this paper, we propose a new classification algorithm, FSVM with minimum within-class scatter (WCS-FSVM), which incorporates the minimum within-class scatter of Fisher Discriminant Analysis (FDA) into FSVM. The main idea is to find an optimal hyperplane such that the margin is maximized while the within-class scatter is kept as small as possible. In addition, we propose a new fuzzy membership function for WCS-FSVM. Experiments on six benchmark datasets and four artificial datasets show that the proposed WCS-FSVM algorithm not only improves classification accuracy and generalization ability but also handles classification problems with outliers or noises more effectively. (C) 2013 Elsevier B.V. All rights reserved.
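As an illustration of the fuzzy-membership idea described in the abstract, the sketch below weights each training point by its distance to its own class mean, so that points far from the mean (likely outliers or noise) receive smaller weights and therefore contribute less to the slack penalty. This is the classical distance-to-class-centre scheme from the FSVM literature, given here only as a minimal sketch; it is not necessarily the membership function actually proposed for WCS-FSVM, and the function and variable names are illustrative.

    import numpy as np

    def fuzzy_memberships(X, y, delta=1e-6):
        """Assign a fuzzy membership in (0, 1] to each training point based on
        its distance to its own class mean; distant points (likely outliers or
        noise) get smaller weights."""
        s = np.empty(len(y), dtype=float)
        for label in np.unique(y):
            idx = np.where(y == label)[0]
            centre = X[idx].mean(axis=0)                  # class mean
            d = np.linalg.norm(X[idx] - centre, axis=1)   # distance to mean
            r = d.max() + delta                           # class radius
            s[idx] = 1.0 - d / r                          # membership weight
        return s

    # Usage: the memberships s_i rescale the slack penalty (C * s_i * xi_i) in
    # the (WCS-)FSVM objective, so noisy points are penalized less heavily.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        print(fuzzy_memberships(X, y)[:5])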
