Journal
PATTERN RECOGNITION
Volume 46, Issue 9, Pages 2531-2537
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.02.007
Keywords
Feature selection; SVM; DrSVM; Sparse learning
Funding
- National Natural Science Foundation of China [61070239, 61272303]
Abstract
The support vector machine (SVM) is a state-of-the-art classification method, and the doubly regularized SVM (DrSVM) is an important extension based on the elastic-net penalty. DrSVM has been successfully applied to variable selection, retaining (or discarding) correlated variables together. However, the model is challenging to solve. In this paper we develop an iterative l2-SVM approach to implement DrSVM on high-dimensional datasets. Our approach significantly reduces the computational complexity, and the corresponding algorithms have a global convergence property. Empirical results on simulated and real-world gene datasets are encouraging. (C) 2013 Elsevier Ltd. All rights reserved.
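To make the model concrete: DrSVM minimizes the hinge loss plus an elastic-net penalty (an l1 term for sparsity plus an l2 term that encourages correlated variables to enter or leave together). The sketch below is a minimal illustration of that objective using plain subgradient descent; it is NOT the paper's iterative l2-SVM algorithm, and the function name and hyperparameters are hypothetical.

```python
import numpy as np

def drsvm_subgradient(X, y, lam1=0.1, lam2=0.1, lr=0.01, n_iter=500):
    """Hypothetical sketch: minimize
        sum_i hinge(y_i (x_i . beta + b)) + lam1*||beta||_1 + (lam2/2)*||beta||_2^2
    by subgradient descent (for illustration only; the paper solves
    DrSVM via a more efficient iterative l2-SVM scheme)."""
    n, p = X.shape
    beta = np.zeros(p)
    b = 0.0
    for _ in range(n_iter):
        margins = y * (X @ beta + b)
        active = margins < 1                      # points inside the margin
        g_beta = -(y[active, None] * X[active]).sum(axis=0)
        g_b = -y[active].sum()
        g_beta += lam1 * np.sign(beta) + lam2 * beta   # elastic-net subgradient
        beta -= lr * g_beta / n
        b -= lr * g_b / n
    return beta, b
```

With lam1 > 0 many coefficients are driven to (near) zero, while the quadratic lam2 term stabilizes the solution when variables are highly correlated, which is the combination the abstract refers to.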