Article

Memory, evolutionary operator, and local search based improved Grey Wolf Optimizer with linear population size reduction technique

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 264

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2023.110297

Keywords

Metaheuristics; Swarm intelligence; Grey Wolf Optimizer; Memory; Evolutionary operators; Stochastic local search; Linear population size reduction; Optimization; Algorithm


This work proposes an improved grey wolf optimization algorithm that addresses three limitations of the basic method: limited population diversity, premature convergence, and a poor balance between exploration and exploitation. The proposed algorithm incorporates memory, evolutionary operators, stochastic local search, and a linear population size reduction technique. Experimental results show that it outperforms other popular metaheuristics on a range of benchmark functions and engineering case studies.
Optimization of multi-modal functions is challenging even for evolutionary and swarm-based algorithms, as it requires efficient exploration to find the promising region of the search space and effective exploitation to precisely locate the global optimum. The Grey Wolf Optimizer (GWO) is a recently developed nature-inspired metaheuristic with a relatively small number of tuning parameters. However, GWO and most of its variants may suffer from a lack of population diversity, premature convergence, and an inability to preserve a good balance between exploratory and exploitative behaviors. To address these limitations, this work proposes a new variant of GWO incorporating memory, evolutionary operators, and a stochastic local search technique. It further integrates the Linear Population Size Reduction (LPSR) technique. The proposed algorithm is comprehensively tested on 23 numerical benchmark functions, high-dimensional benchmark functions, 13 engineering case studies, four data classification problems, and three function approximation problems. The benchmark functions are mostly taken from the CEC 2005 and CEC 2010 special sessions and include rotated and shifted functions. The engineering case studies are drawn from the CEC 2020 real-world non-convex constrained optimization problems. The performance of the proposed GWO is compared with popular metaheuristics, namely particle swarm optimization (PSO), the gravitational search algorithm (GSA), the salp swarm algorithm (SSA), differential evolution (DE), self-adaptive differential evolution (SADE), basic GWO, and three of its recently improved variants. Statistical analysis and Friedman tests have been conducted to thoroughly compare their performance. The obtained results demonstrate that the proposed GWO outperforms the compared algorithms on the benchmark functions and engineering case studies tested. (c) 2023 Elsevier B.V. All rights reserved.
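The abstract combines the canonical GWO search with LPSR. As a minimal sketch of just those two ingredients (omitting the paper's memory, evolutionary-operator, and stochastic local search components, which are not specified here), the standard GWO update pulls each wolf toward the three best solutions (alpha, beta, delta) with a coefficient `a` decaying linearly from 2 to 0, while LPSR linearly shrinks the population from an initial to a minimum size, discarding the worst wolves. All function names, parameter values, and the toy objective below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Toy objective with global minimum 0 at the origin (assumed for illustration)."""
    return float(np.sum(x ** 2))

def gwo_lpsr(obj, dim=10, n_init=30, n_min=6, max_iter=200, lb=-5.0, ub=5.0):
    """Basic GWO update plus linear population size reduction (sketch only)."""
    pop = rng.uniform(lb, ub, (n_init, dim))
    fit = np.array([obj(x) for x in pop])
    i = int(np.argmin(fit))
    best_x, best_f = pop[i].copy(), float(fit[i])
    for t in range(max_iter):
        # a decays linearly 2 -> 0: exploration early, exploitation late
        a = 2.0 * (1.0 - t / max_iter)
        alpha, beta, delta = pop[np.argsort(fit)[:3]]
        new_pop = np.empty_like(pop)
        for i in range(len(pop)):
            step = np.zeros(dim)
            for leader in (alpha, beta, delta):
                A = 2.0 * a * rng.random(dim) - a   # |A| > 1 pushes away (explore)
                C = 2.0 * rng.random(dim)
                step += leader - A * np.abs(C * leader - pop[i])
            new_pop[i] = np.clip(step / 3.0, lb, ub)  # average of the three pulls
        pop = new_pop
        fit = np.array([obj(x) for x in pop])
        i = int(np.argmin(fit))
        if fit[i] < best_f:
            best_x, best_f = pop[i].copy(), float(fit[i])
        # LPSR: linearly shrink the population, dropping the worst wolves
        n_target = round(n_init + (n_min - n_init) * (t + 1) / max_iter)
        if n_target < len(pop):
            keep = np.argsort(fit)[:n_target]
            pop, fit = pop[keep], fit[keep]
    return best_x, best_f

x_best, f_best = gwo_lpsr(sphere)
```

LPSR (popularized by L-SHADE in differential evolution) concentrates the remaining evaluation budget on fewer, better wolves late in the run, which complements the exploitation phase that the decaying `a` already induces.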

