Journal
INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY & DECISION MAKING
Volume 6, Issue 4, Pages 671-686
Publisher
WORLD SCIENTIFIC PUBL CO PTE LTD
DOI: 10.1142/S0219622007002733
Keywords
feature selection; Support Vector Machine; credit assessment
In many applications, such as credit risk management, data are represented as high-dimensional feature vectors, which makes feature selection necessary to reduce computational complexity and to improve generalization ability and interpretability. In this paper, we present a novel feature selection method, the Least Squares Support Feature Machine (LS-SFM). The proposed method has two advantages compared with the conventional Support Vector Machine (SVM) and LS-SVM. First, a convex combination of basic kernels is used as the kernel, with each basic kernel operating on a single feature. This transforms the feature selection problem, which cannot be solved directly in the SVM framework, into an ordinary multiple-parameter learning problem. Second, all parameters are learned by a two-stage iterative algorithm. A 1-norm-based regularized cost function is used to enforce sparseness of the feature parameters; the support features are those features with nonzero feature parameters. An experimental study on several UCI datasets and a commercial credit card dataset demonstrates the effectiveness and efficiency of the proposed approach.
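The two ideas in the abstract — a convex combination of per-feature kernels, and a two-stage alternation between the LS-SVM solve and a 1-norm-regularized update of the kernel weights — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the per-feature RBF kernels, the gradient step, the clipping of the weights to [0, 1] in place of a full simplex projection, and all hyperparameter values are assumptions.

```python
import numpy as np

def per_feature_kernels(X1, X2, gamma=1.0):
    """One basic RBF kernel per feature: K_d(x, z) = exp(-gamma * (x_d - z_d)^2)."""
    diff = X1[:, None, :] - X2[None, :, :]          # (n1, n2, d)
    return np.exp(-gamma * diff ** 2).transpose(2, 0, 1)  # (d, n1, n2)

def composite_kernel(Ks, beta):
    """Combination sum_d beta_d * K_d of the per-feature kernels."""
    return np.tensordot(beta, Ks, axes=1)

def fit_ls_svm(K, y, lam=1e-2):
    """LS-SVM-style dual solve (bias term omitted): (K + lam*I) alpha = y."""
    return np.linalg.solve(K + lam * np.eye(K.shape[0]), y)

def soft_threshold(v, t):
    """Proximal step for the 1-norm penalty, then projection to v >= 0."""
    return np.maximum(np.sign(v) * np.maximum(np.abs(v) - t, 0.0), 0.0)

def fit_ls_sfm(X, y, lam=1e-2, l1=0.1, lr=0.05, n_iter=50):
    """Two-stage alternation: stage 1 solves alpha for fixed beta; stage 2
    takes a 1-norm-regularized gradient step on beta. Illustrative only."""
    n, d = X.shape
    Ks = per_feature_kernels(X, X)
    beta = np.full(d, 1.0 / d)                      # start from a uniform mix
    for _ in range(n_iter):
        K = composite_kernel(Ks, beta)
        alpha = fit_ls_svm(K, y, lam)               # stage 1: fixed beta
        resid = K @ alpha - y
        # gradient of the squared residual w.r.t. each beta_d, alpha fixed
        grad = np.array([alpha @ Ks[j] @ resid for j in range(d)])
        beta = soft_threshold(beta - lr * grad, lr * l1)
        beta = np.minimum(beta, 1.0)                # crude stand-in for a simplex projection
    support = np.flatnonzero(beta)                  # "support features": nonzero beta_d
    return beta, alpha, support
```

Features whose weight is driven to zero by the 1-norm penalty drop out of the composite kernel entirely, which is what makes the weight vector double as a feature selector.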