Article

An efficient surrogate-assisted hybrid optimization algorithm for expensive optimization problems

Journal

INFORMATION SCIENCES
Volume 561, Pages 304-325

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.11.056

Keywords

Surrogate-assisted; Hybrid optimization algorithm; Teaching-learning-based optimization; Differential evolution; Expensive problems

Funding

  1. National Natural Science Foundation of China [61872085, 61702101]
  2. Fujian Provincial Department of Science and Technology [2018Y3001]
  3. Natural Science Foundation of Fujian Province [2018J01638]

Summary

SAHO integrates the TLBO and DE algorithms, alternating between global exploration and local exploitation whenever no better solution can be found; a new prescreening criterion selects promising candidates for expensive function evaluations, and a local RBF surrogate model is used to approximate the landscape of the target function.

Abstract

Surrogate-assisted evolutionary algorithms (SAEAs) are promising approaches for solving computationally expensive optimization problems. The central idea of SAEAs is to combine the powerful search capabilities of evolutionary algorithms with the predictive capabilities of surrogate models. In this study, an efficient surrogate-assisted hybrid optimization (SAHO) algorithm is proposed by combining two well-known algorithms, namely teaching-learning-based optimization (TLBO) and differential evolution (DE). TLBO focuses on global exploration, while DE concentrates on local exploitation; the two algorithms are applied alternately when no better candidate solution can be found. Meanwhile, a new prescreening criterion based on the best and top-collection information is introduced to choose promising candidates for real function evaluations. In addition, two evolution control strategies (generation-based and individual-based) and a top-ranked restart strategy are integrated into SAHO. Moreover, a local radial basis function (RBF) surrogate, which requires relatively few training samples, is employed to model the landscape of the target function. Sixteen benchmark functions and the tension/compression spring design problem are used to compare the proposed SAHO with other state-of-the-art approaches. Extensive comparison results demonstrate that SAHO achieves superior performance in solving expensive optimization problems. (C) 2020 Elsevier Inc. All rights reserved.
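
The abstract describes an iterative loop in which surrogate-screened candidates from either a TLBO-style global step or a DE-style local step are evaluated exactly, and the search mode switches whenever no improvement is found. The Python sketch below illustrates one way such a loop could be organised; it is a minimal, hedged reconstruction, not the paper's exact method. Every name, operator, and parameter here (saho_sketch, tlbo_step, de_step, the improvement test, the switching rule, the cubic RBF kernel) is an illustrative assumption, and scipy's generic RBFInterpolator stands in for the paper's local RBF surrogate.

```python
# Hypothetical SAHO-style loop: all names, operators, and parameter values are
# illustrative assumptions, not the paper's exact implementation.
import numpy as np
from scipy.interpolate import RBFInterpolator


def expensive_fn(x):
    # Stand-in for the expensive objective (here: the cheap sphere function).
    return float(np.sum(x ** 2))


def tlbo_step(pop, fit, rng):
    # Simplified TLBO teacher phase: move learners toward the best solution.
    teacher = pop[np.argmin(fit)]
    tf = rng.integers(1, 3)                       # teaching factor in {1, 2}
    return pop + rng.random(pop.shape) * (teacher - tf * pop.mean(axis=0))


def de_step(pop, fit, rng, f=0.5, cr=0.9):
    # Simplified DE/best/1 mutation with binomial crossover for local search.
    best = pop[np.argmin(fit)]
    n, d = pop.shape
    r0 = rng.integers(0, n, size=n)
    r1 = (r0 + rng.integers(1, n, size=n)) % n    # distinct second index
    mutant = best + f * (pop[r0] - pop[r1])
    cross = rng.random((n, d)) < cr
    return np.where(cross, mutant, pop)


def saho_sketch(dim=10, pop_size=20, budget=200, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = -5.0, 5.0
    # The population doubles as the surrogate's training archive in this sketch.
    pop = rng.uniform(lb, ub, size=(pop_size, dim))
    fit = np.array([expensive_fn(x) for x in pop])
    evals, use_global = pop_size, True

    while evals < budget:
        # Fit an RBF surrogate on the archive of exactly evaluated points.
        surrogate = RBFInterpolator(pop, fit, kernel="cubic")

        # Generate candidates with the current search mode (TLBO or DE).
        cand = tlbo_step(pop, fit, rng) if use_global else de_step(pop, fit, rng)
        cand = np.clip(cand, lb, ub)

        # Prescreening: spend one real evaluation on the surrogate-best candidate.
        best_cand = cand[np.argmin(surrogate(cand))]
        y = expensive_fn(best_cand)
        evals += 1

        if y < fit.max():
            # Improvement over the worst archived point: accept the candidate.
            worst = np.argmax(fit)
            pop[worst], fit[worst] = best_cand, y
        else:
            # No improvement: switch between global (TLBO) and local (DE) search.
            use_global = not use_global

    best = np.argmin(fit)
    return pop[best], fit[best]


if __name__ == "__main__":
    x_best, f_best = saho_sketch()
    print("best objective value found:", f_best)
```

The point the sketch tries to capture is the budget discipline described in the abstract: each iteration spends only one expensive evaluation, on the candidate the cheap surrogate ranks best, while the surrogate itself is refitted from the archive of truly evaluated points.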
