Article

An Improved Equilibrium Optimizer Algorithm for Features Selection: Methods and Analysis

Journal

IEEE ACCESS
Volume 9, Issue -, Pages 120309-120327

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2021.3108097

Keywords

Feature extraction; Genetic algorithms; Convergence; Support vector machines; Transfer functions; Search problems; Particle swarm optimization; Equilibrium optimizer (EO); feature selection; optimization; machine learning (ML); opposition based learning (OBL); classification

Abstract

In the last decade, high-dimensional datasets and the rapid growth of data volume have raised challenges in fields such as data mining and data science. The study proposes an improved binary equilibrium optimizer algorithm (IBEO), with enhancements to population diversity and exploitation, to address the feature selection problem. Comparative tests against well-known algorithms show the effectiveness of IBEO in handling high-dimensional datasets and large data volumes.
In the last decade, data generated by different digital devices has posed a remarkable challenge for data representation and analysis. Because of high-dimensional datasets and the rapid growth of data volume, many challenges have arisen in fields such as data mining and data science. Conventional machine learning classifiers have limited ability to cope with high dimensionality, which brings memory limitations, high computational cost, and low classification accuracy. Consequently, there is a need to reduce the dimension of datasets by choosing the most significant features that represent the data efficiently with minimum volume. This study proposes an improved binary version of the equilibrium optimizer algorithm (IBEO) to address the feature selection problem. Two main enhancements are added to the original equilibrium optimizer (EO) to strengthen its performance. Opposition-based learning (OBL) is the first enhancement, applied at the initialization stage of EO to increase the diversity of the population in the search space. A local search algorithm is the second enhancement, added to improve the exploitation ability of EO. Because wrapper approaches can deliver high-quality solutions, the k-nearest neighbour (k-NN) and support vector machine (SVM) classifiers, the most popular wrapper evaluators, are used. Moreover, to address the problem of over-fitting, k-fold cross-validation is applied to split each dataset into training and testing data. Comparative tests are carried out against well-known algorithms such as grey wolf optimization, grasshopper optimization, particle swarm optimization, whale optimization, the dragonfly algorithm, and the improved salp swarm algorithm. The proposed algorithm is applied to the datasets most commonly used in the field to validate its performance. Statistical analyses demonstrate the effectiveness of the IBEO.
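To make the two building blocks named in the abstract concrete, the following is a minimal Python sketch, not the authors' code: opposition-based learning applied to a binary initial population, and a wrapper-style fitness that scores a feature subset with a k-NN classifier under k-fold cross-validation. The function names, the neighbour count, and the accuracy/subset-size weight `alpha` are illustrative assumptions rather than values reported in the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def obl_binary_init(pop_size, n_features):
    """Opposition-based learning (OBL) initialization for a binary population.

    For a binary vector the 'opposite' solution flips every bit; the original
    and opposite candidates are pooled so the fitter half can be retained.
    """
    population = rng.integers(0, 2, size=(pop_size, n_features))
    opposite = 1 - population          # OBL counterpart of each candidate
    return np.vstack([population, opposite])

def wrapper_fitness(mask, X, y, k_folds=5, alpha=0.99):
    """Wrapper fitness of a binary feature mask: k-NN accuracy under k-fold
    cross-validation, traded off lightly against the subset size."""
    selected = mask.astype(bool)
    if not selected.any():             # an empty subset cannot be evaluated
        return 0.0
    accuracy = cross_val_score(
        KNeighborsClassifier(n_neighbors=5), X[:, selected], y, cv=k_folds
    ).mean()
    # alpha weights classification accuracy against the fraction of features kept.
    return alpha * accuracy + (1.0 - alpha) * (1.0 - selected.sum() / X.shape[1])

# Example: rank the OBL-augmented initial population on a toy dataset and keep
# the best half as a starting point for the EO update and local-search steps.
if __name__ == "__main__":
    from sklearn.datasets import load_breast_cancer

    X, y = load_breast_cancer(return_X_y=True)
    candidates = obl_binary_init(pop_size=10, n_features=X.shape[1])
    scores = np.array([wrapper_fitness(m, X, y) for m in candidates])
    best_half = candidates[np.argsort(scores)[::-1][:10]]
    print("best fitness in initial population:", scores.max())
```

The same fitness function could equally wrap an SVM classifier, the paper's other wrapper evaluator; the EO position update and the local-search refinement themselves are not shown here.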

