Journal
NEUROCOMPUTING
Volume: 144, Issue: -, Pages: 174-183
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2014.05.040
Keywords
Twin Support Vector Machine; Least Squares Projection Twin Support Vector Machine; Feature selection
Funding
- Jiangsu Key Laboratory of Image and Video Understanding for Social Safety (Nanjing University of Science and Technology) [30920130122006]
- China Postdoctoral Science Foundation [2014M551599]
- National Natural Science Foundation of China [61272220, 61101197]
- Natural Science Foundation of Jiangsu Province of China [BK2012399]
Abstract
In this paper, we propose a new feature selection approach for the recently proposed Least Squares Projection Twin Support Vector Machine (LSPTSVM) for binary classification. The 1-norm is used in our feature selection objective so that only the features corresponding to non-zero elements of the weight vectors are selected. A Tikhonov regularization term is also incorporated into the objective to reduce the singularity problems of the Quadratic Programming Problems (QPPs) and to minimize its 1-norm measure. The resulting approach, which has a strong feature suppression capability, is called Feature Selection for Least Squares Projection Twin Support Vector Machine (FLSPTSVM). The solutions of FLSPTSVM are obtained by solving two smaller QPPs arising from the two primal QPPs, as opposed to the two dual ones in the Twin Support Vector Machine (TWSVM). Thus, FLSPTSVM is capable of generating sparse solutions, which means it can reduce the number of input features in the linear case. The linear FLSPTSVM can also be extended to the nonlinear case via the kernel trick; when a nonlinear classifier is used, the number of kernel functions required by the classifier is reduced. Our experiments on publicly available datasets demonstrate that FLSPTSVM achieves classification accuracy comparable to that of LSPTSVM while obtaining sparse solutions. (C) 2014 Elsevier B.V. All rights reserved.
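The abstract's key mechanism, a 1-norm penalty on the weight vector driving irrelevant weights exactly to zero so that the surviving non-zero entries act as selected features, can be illustrated with a generic sketch. The code below is not the paper's FLSPTSVM formulation (it solves a plain 1-norm-regularized least-squares problem via iterative soft-thresholding, with a synthetic dataset and a hypothetical `lam` penalty weight chosen for illustration), but it shows why a 1-norm objective yields sparse solutions where a 2-norm one would not:

```python
import numpy as np

def soft_threshold(x, t):
    # Element-wise soft-thresholding: the proximal operator of the 1-norm.
    # This is the step that sets small weights exactly to zero.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def l1_least_squares(X, y, lam, n_iter=500):
    """Minimize 0.5*||Xw - y||^2 + lam*||w||_1 by ISTA.

    A generic sketch of 1-norm-induced sparsity, not the FLSPTSVM solver:
    features whose weights end up exactly zero are effectively discarded.
    """
    _, d = X.shape
    L = np.linalg.norm(X, 2) ** 2      # Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)       # gradient of the least-squares term
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy data: 10 candidate features, only features 0 and 3 are informative.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
true_w = np.zeros(10)
true_w[[0, 3]] = [2.0, -1.5]
y = X @ true_w + 0.01 * rng.standard_normal(100)

w = l1_least_squares(X, y, lam=5.0)
selected = np.flatnonzero(np.abs(w) > 1e-6)
print(selected)   # indices of the features the 1-norm penalty kept
```

The same data fit with a squared 2-norm (ridge) penalty would shrink all ten weights toward zero but leave none exactly zero, which is why the 1-norm is the natural choice when the goal is feature selection rather than mere shrinkage.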