Article

Unsupervised feature selection via transformed auto-encoder

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 215

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2021.106748

Keywords

Machine learning; Deep learning; Feature selection; Unsupervised learning; Auto-encoder

Funding

  1. National Natural Science Foundation of China [61502104, 61672159]
  2. Fujian Collaborative Innovation Center for Big Data Application in Governments
  3. Technology Innovation Platform Project of Fujian Province [2014H2005]


Feature selection is crucial in machine learning; the proposed auto-encoder-based unsupervised feature selection scheme approximately solves traditional constrained feature selection problems and accommodates a variety of loss and activation functions.
As one of the fundamental research issues, feature selection plays a critical role in machine learning. By removing irrelevant features, it reduces the computational complexity of downstream tasks, usually accelerating computation and improving performance. This paper proposes an auto-encoder-based scheme for unsupervised feature selection. Owing to their inherent consistency, this framework can approximately solve traditional constrained feature selection problems. Specifically, the proposed model takes non-negativity, orthogonality, and sparsity into account, and their internal characteristics are fully exploited. It can also employ alternative loss functions and flexible activation functions: the former fit a wide range of learning tasks, while the latter can act as regularization terms that impose constraints on the model. The proposed model is then validated on multiple benchmark datasets, where various activation and loss functions are analyzed to find better feature selectors. Finally, extensive experiments demonstrate the superiority of the proposed method over compared state-of-the-art approaches. (C) 2021 Elsevier B.V. All rights reserved.
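To illustrate the general idea, the following is a minimal sketch of auto-encoder-based unsupervised feature selection, not the paper's exact model: it uses a single linear encoder/decoder with a row-sparsity (L2,1-style) penalty on the encoder weights, and ranks features by the norms of the corresponding encoder rows. The function name `select_features` and all hyperparameter values are illustrative assumptions; the paper's model additionally handles non-negativity and orthogonality constraints and flexible activations/losses.

```python
import numpy as np

def select_features(X, n_hidden=4, n_select=3, lam=0.05, lr=0.05,
                    epochs=1000, seed=0):
    """Hypothetical simplification: linear auto-encoder with a row-sparse
    encoder; feature importance = L2 norm of each encoder weight row."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, n_hidden))   # encoder weights
    W2 = rng.normal(scale=0.1, size=(n_hidden, d))   # decoder weights
    for _ in range(epochs):
        H = X @ W1                  # hidden representation (linear activation)
        Xh = H @ W2                 # reconstruction
        E = Xh - X                  # reconstruction error
        # gradients of 0.5 * ||Xh - X||_F^2 / n w.r.t. W2 and W1
        gW2 = H.T @ E / n
        gW1 = X.T @ (E @ W2.T) / n
        # subgradient of lam * sum_i ||W1[i, :]||_2 (row sparsity),
        # which drives rows of uninformative features toward zero
        norms = np.linalg.norm(W1, axis=1, keepdims=True) + 1e-8
        gW1 += lam * W1 / norms
        W1 -= lr * gW1
        W2 -= lr * gW2
    scores = np.linalg.norm(W1, axis=1)              # per-feature importance
    return np.argsort(scores)[::-1][:n_select], scores
```

On synthetic data where a few columns carry a low-dimensional signal and the rest are small noise, the informative columns end up with the largest encoder row norms and are selected first; the sparsity weight `lam` trades reconstruction fidelity against how aggressively noise features are zeroed out.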

Authors

