Article

A Surrogate-Assisted Evolutionary Feature Selection Algorithm With Parallel Random Grouping for High-Dimensional Classification

Journal

IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION
Volume 26, Issue 5, Pages 1087-1101

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TEVC.2022.3149601

Keywords

Optimization; Computational modeling; Search problems; Feature extraction; Evolutionary computation; Classification algorithms; Training; High-dimensional feature selection (FS); random grouping; sampling strategy; surrogate-assisted EA (SAEA)

Funding

  1. National Natural Science Foundation of China [61976165]


This study proposes a surrogate-assisted evolutionary algorithm (SAEA) for expensive feature selection problems. By employing parallel random grouping and a constraint-based sampling strategy, the algorithm effectively optimizes high-dimensional discrete decision variables. Experimental results demonstrate that the proposed algorithm outperforms traditional and ensemble feature selection methods on multiple datasets.
Various evolutionary algorithms (EAs) have been proposed to address feature selection (FS) problems, in which a large number of fitness evaluations are needed. With the rapid growth of data scales, fitness evaluation becomes time consuming, which makes FS problems expensive optimization problems. Surrogate-assisted EAs (SAEAs) have been widely used to solve expensive optimization problems. However, SAEAs still face difficulties in solving expensive FS problems due to their high-dimensional discrete decision variables. To address this issue, we propose an SAEA with parallel random grouping for expensive FS problems, which consists of three main components. First, a constraint-based sampling strategy is proposed, which considers the influence of the constraint boundary and the number of selected features. Second, a high-dimensional FS problem is randomly divided into several low-dimensional subproblems. Surrogate models are then constructed in these low-dimensional decision spaces. After that, all the subproblems are optimized in parallel. The process of random grouping and parallel optimization continues until the termination condition is met. Finally, a final solution is chosen from the best solution in the historical search and the best solution in the last population using a random-, distance-, or voting-based method. Experimental results show that the proposed algorithm generally outperforms traditional, ensemble, and evolutionary FS methods on 14 datasets with up to 10,000 features, especially when the required number of real fitness evaluations is limited.
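The random-grouping step described in the abstract, where a high-dimensional FS problem is partitioned into several low-dimensional subproblems, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `random_grouping`, its parameters, and the specific group count are assumptions for the example.

```python
import random

def random_grouping(n_features, n_groups, seed=None):
    """Randomly partition feature indices into disjoint low-dimensional groups.

    Illustrative sketch of the paper's random-grouping idea: each group of
    indices defines one low-dimensional subproblem in which a surrogate model
    would be built and optimized. Group sizes differ by at most one.
    """
    rng = random.Random(seed)
    indices = list(range(n_features))
    rng.shuffle(indices)
    # Deal shuffled indices round-robin so each group gets ~n_features/n_groups
    return [indices[g::n_groups] for g in range(n_groups)]

# Example: a 10,000-feature problem split into 20 subproblems of 500 variables
groups = random_grouping(10_000, 20, seed=0)
```

Each re-grouping round would call this again with a fresh shuffle, so variables interact with different partners across rounds before the subproblems are optimized in parallel.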
