4.2 Article

Hybrid Gradient Descent Grey Wolf Optimizer for Optimal Feature Selection

Journal

BIOMED RESEARCH INTERNATIONAL
Volume 2021, Issue -, Pages -

Publisher

HINDAWI LTD
DOI: 10.1155/2021/2555622

Keywords

-

Abstract
Feature selection is the process of decreasing the number of features in a dataset by removing redundant, irrelevant, and randomly class-correlated data features. By applying feature selection to large, high-dimensional datasets, redundant features are removed, reducing the complexity of the data and shortening training time. The objective of this paper was to design an optimizer that combines the well-known metaheuristic population-based grey wolf optimizer with the gradient descent algorithm and to test it on feature selection problems. The proposed algorithm was first compared against the original grey wolf algorithm on 23 continuous test functions. The proposed optimizer was then adapted for feature selection, and 3 binary implementations were developed, with the final implementation compared against two implementations of the binary grey wolf optimizer and the binary grey wolf particle swarm optimizer on 6 medical datasets from the UCI machine learning repository, using metrics such as accuracy, size of the feature subsets, F-measure, precision, and sensitivity. The proposed optimizer outperformed the three other optimizers on 3 of the 6 datasets in average metrics. The proposed optimizer showed promise in its capability to balance the two objectives in feature selection and could be further enhanced.
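The abstract does not spell out how the gradient descent step is coupled to the grey wolf update or how the binary variants are built, so the following is only a minimal sketch under assumptions: the standard grey wolf position update driven by the three best wolves, an additive gradient descent correction after each move, a sigmoid transfer function to binarize positions into feature masks, and a weighted fitness balancing classification error against subset size. The function names and the 0.99/0.01 fitness weights are illustrative, not taken from the paper.

import numpy as np

def gwo_step(wolves, alpha, beta, delta, a, rng):
    """Standard grey wolf position update guided by the alpha, beta, and delta wolves."""
    new = np.empty_like(wolves)
    for i, x in enumerate(wolves):
        pulls = []
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random(x.shape) - a      # coefficient shrinking from exploration to exploitation
            C = 2 * rng.random(x.shape)              # random emphasis on the leader's position
            pulls.append(leader - A * np.abs(C * leader - x))
        new[i] = np.mean(pulls, axis=0)              # average of the three leader-driven moves
    return new

def hybrid_gd_gwo_step(wolves, alpha, beta, delta, a, grad, lr, rng):
    """Illustrative hybrid iteration: a GWO move followed by a gradient descent nudge (assumed coupling)."""
    moved = gwo_step(wolves, alpha, beta, delta, a, rng)
    return moved - lr * np.array([grad(x) for x in moved])

def binarize(position):
    """Sigmoid transfer function commonly used to map continuous positions to 0/1 feature masks."""
    return (1.0 / (1.0 + np.exp(-position))) > 0.5

def feature_selection_fitness(mask, error_rate, w_err=0.99, w_size=0.01):
    """Weighted wrapper objective: classification error plus a small penalty on subset size (weights assumed)."""
    if mask.sum() == 0:
        return 1.0                                   # penalize empty feature subsets
    return w_err * error_rate + w_size * mask.sum() / mask.size

In a full wrapper, error_rate would come from a classifier evaluated on the features kept by the mask, and the paper's actual binary implementations and gradient coupling may differ from the additive step assumed here.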

Authors

