Article

Unsupervised feature selection via transformed auto-encoder

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 215, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2021.106748

Keywords

Machine learning; Deep learning; Feature selection; Unsupervised learning; Auto-encoder

Funding

  1. National Natural Science Foundation of China [61502104, 61672159]
  2. Fujian Collaborative Innovation Center for Big Data Application in Governments
  3. Technology Innovation Platform Project of Fujian Province [2014H2005]


Feature selection is crucial in machine learning; the proposed auto-encoder-based unsupervised feature selection scheme can approximately solve traditional constrained feature selection problems and accommodates a variety of loss and activation functions.
As one of the fundamental research issues, feature selection plays a critical role in machine learning. By removing irrelevant features, it reduces the computational complexity of downstream tasks, usually yielding both faster computation and better performance. This paper proposes an auto-encoder-based scheme for unsupervised feature selection. Owing to an inherent consistency between the two formulations, this framework can approximately solve traditional constrained feature selection problems. Specifically, the proposed model takes non-negativity, orthogonality, and sparsity into account, and fully exploits their internal characteristics. It can also employ alternative loss functions and flexible activation functions: the former fit a wide range of learning tasks, and the latter can play the role of regularization terms, imposing regularization constraints on the model. The proposed model is then validated on multiple benchmark datasets, where various activation and loss functions are analyzed to find better feature selectors. Finally, extensive experiments demonstrate the superiority of the proposed method over compared state-of-the-art approaches. (C) 2021 Elsevier B.V. All rights reserved.
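The core idea described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's exact model: a linear auto-encoder whose encoder weights carry an L2,1 (row-sparsity) penalty, so that input features whose encoder rows shrink toward zero are deemed irrelevant. The hyperparameters, the plain gradient-descent training loop, and scoring features by encoder row norms are all assumptions made for this sketch.

```python
import numpy as np

def ae_feature_scores(X, k=2, lam=0.01, lr=0.05, iters=3000, seed=0):
    """Score features with a linear auto-encoder plus an L2,1 row-sparsity
    penalty on the encoder weights; a larger row norm means the feature
    contributes more to reconstruction, i.e. is more important."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = 0.1 * rng.standard_normal((d, k))   # encoder weights (one row per feature)
    W2 = 0.1 * rng.standard_normal((k, d))   # decoder weights
    eps = 1e-8
    for _ in range(iters):
        H = X @ W1                # hidden code
        E = H @ W2 - X            # reconstruction residual
        g1 = 2.0 / n * X.T @ E @ W2.T        # gradient of mean squared error w.r.t. W1
        g2 = 2.0 / n * H.T @ E               # gradient w.r.t. W2
        # subgradient of the L2,1 penalty: shrinks whole rows of W1 toward zero
        row_norms = np.linalg.norm(W1, axis=1, keepdims=True)
        g1 += lam * W1 / (row_norms + eps)
        W1 -= lr * g1
        W2 -= lr * g2
    return np.linalg.norm(W1, axis=1)        # one score per input feature

# Toy data: features 0-2 share a 2-factor latent signal, features 3-7 are small noise.
rng = np.random.default_rng(0)
Z = rng.standard_normal((300, 2))
X = np.hstack([Z @ rng.standard_normal((2, 3)),
               0.1 * rng.standard_normal((300, 5))])
scores = ae_feature_scores(X)
top3 = set(np.argsort(scores)[-3:])          # indices of the 3 highest-scoring features
```

In this toy setting the three informative features receive the largest encoder row norms, while the rows for the pure-noise features are driven toward zero by the sparsity penalty, which is the mechanism the paper's sparsity constraint is meant to exploit.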

