Journal
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
Volume 11, Issue 7, Pages 1405-1421
Publisher
SPRINGER HEIDELBERG
DOI: 10.1007/s13042-019-01046-w
Keywords
Feature selection; Subspace learning; Linear independence; Basis of features; Microarray datasets
Most existing research on feature selection via matrix factorization techniques has addressed unsupervised learning problems. This paper introduces a new framework for supervised feature selection, called supervised feature selection by constituting a basis for the original space of features and matrix factorization (SFS-BMF). SFS-BMF performs a guided search for a basis of the original feature space; such a basis inherently contains linearly independent features and can replace the original space. To find the best subset of features with respect to the class attribute, information gain guides the basis construction: a basis for the original features is built from the most informative features as measured by information gain. This basis is then decomposed through a matrix factorization in order to select a subset of features. The proposed method guarantees maximum relevance of the selected features to the output by using information gain, while simultaneously securing minimum redundancy among them through the linear independence property. Several experiments on high-dimensional microarray datasets illustrate the efficiency of SFS-BMF. The experimental results show that the proposed SFS-BMF method outperforms several state-of-the-art feature selection methods with respect to both classification performance and computational complexity.
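To make the core idea concrete, the following is a minimal sketch of the relevance-plus-independence selection strategy the abstract describes: rank features by information gain, then greedily keep only features that are linearly independent of those already chosen. This is an illustrative reconstruction, not the authors' exact SFS-BMF algorithm; the function names (`information_gain`, `select_basis`), the binning scheme, and the rank test are assumptions for the sketch, and the matrix-factorization decomposition step of the paper is not reproduced here.

```python
import numpy as np

def information_gain(x, y, bins=4):
    """Information gain of class labels y given feature x, after binning x.

    (Assumed discretization scheme; the paper does not specify one here.)
    """
    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # Discretize the continuous feature into equal-width bins.
    edges = np.histogram_bin_edges(x, bins=bins)
    xb = np.digitize(x, edges[1:-1])
    # IG = H(y) - H(y | binned x)
    cond = sum((xb == v).mean() * entropy(y[xb == v]) for v in np.unique(xb))
    return entropy(y) - cond

def select_basis(X, y, k):
    """Greedily pick up to k informative, linearly independent feature columns."""
    # Rank features by relevance to the class attribute (descending IG).
    order = np.argsort([-information_gain(X[:, j], y) for j in range(X.shape[1])])
    chosen = []
    for j in order:
        candidate = X[:, chosen + [j]]
        # Keep the feature only if it is linearly independent of those
        # already selected (minimum-redundancy criterion via rank test).
        if np.linalg.matrix_rank(candidate) == len(chosen) + 1:
            chosen.append(j)
        if len(chosen) == k:
            break
    return chosen
```

In this sketch, a feature that is an exact linear combination of already-selected features fails the rank test and is skipped, which mirrors the abstract's claim that linear independence enforces minimum redundancy among the selected features.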