4.7 Article

Integrating Conjugate Gradients Into Evolutionary Algorithms for Large-Scale Continuous Multi-Objective Optimization

Journal

IEEE/CAA Journal of Automatica Sinica
Volume 9, Issue 10, Pages 1801-1817

Publisher

IEEE - Institute of Electrical and Electronics Engineers, Inc.
DOI: 10.1109/JAS.2022.105875

Keywords

Conjugate gradient; differential evolution; evolutionary computation; large-scale multi-objective optimization; mathematical programming

Funding

  1. National Key Research and Development Program of China [2018AAA0100100]
  2. National Natural Science Foundation of China [61906001, 62136008, U21A20512]
  3. Key Program of Natural Science Project of Educational Commission of Anhui Province [KJ2020A0036]
  4. Alexander von Humboldt Professorship for Artificial Intelligence - Federal Ministry of Education and Research, Germany


In this paper, a hybrid algorithm is proposed to solve large-scale multi-objective optimization problems (LSMOPs) by combining differential evolution with a conjugate gradient method. The proposed algorithm exhibits better convergence and diversity than existing evolutionary algorithms, mathematical programming methods, and hybrid algorithms on a variety of benchmark and real-world LSMOPs.
Large-scale multi-objective optimization problems (LSMOPs) pose challenges to existing optimizers, since a set of well-converged and diverse solutions must be found in huge search spaces. While evolutionary algorithms are good at solving small-scale multi-objective optimization problems, they are criticized for low efficiency in converging to the optima of LSMOPs. By contrast, mathematical programming methods offer fast convergence on large-scale single-objective optimization problems, but they have difficulty finding diverse solutions for LSMOPs. Currently, how to integrate evolutionary algorithms with mathematical programming methods to solve LSMOPs remains unexplored. In this paper, a hybrid algorithm is tailored for LSMOPs by coupling differential evolution with a conjugate gradient method. On the one hand, conjugate gradients and differential evolution are used to update different decision variables of a set of solutions, where the former drives the solutions to converge quickly towards the Pareto front and the latter promotes the diversity of the solutions so that they cover the whole Pareto front. On the other hand, the objective decomposition strategy of evolutionary multi-objective optimization is used to differentiate the conjugate gradients of the solutions, and the line search strategy of mathematical programming is used to ensure that each offspring is of higher quality than its parent. In comparison with state-of-the-art evolutionary algorithms, mathematical programming methods, and hybrid algorithms, the proposed algorithm exhibits better convergence and diversity performance on a variety of benchmark and real-world LSMOPs.
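As an illustration of the hybrid update described in the abstract, the Python sketch below combines a conjugate-gradient step on a decomposed (scalarized) subproblem with a differential-evolution trial vector, accepting a candidate only if it improves on its parent. The weighted-sum scalarization, the Fletcher-Reeves update, the backtracking line search, the DE/rand/1/bin operator, the toy bi-objective problem, and all parameter values are generic textbook components assumed for this sketch; they are not the specific operators or settings used in the paper.

```python
import numpy as np

def weighted_sum(fs, w):
    """Scalarize a vector of objective values with a weight vector (decomposition)."""
    return float(np.dot(w, fs))

def cg_step(x, grad_fn, d_prev, g_prev, f_scalar, alpha0=1.0, shrink=0.5, max_trials=20):
    """One nonlinear conjugate-gradient step (Fletcher-Reeves) with a backtracking
    line search; a step is accepted only if it improves the scalarized objective."""
    g = grad_fn(x)
    if d_prev is None:
        d = -g  # first step (or restart): steepest descent
    else:
        beta = np.dot(g, g) / (np.dot(g_prev, g_prev) + 1e-12)  # Fletcher-Reeves coefficient
        d = -g + beta * d_prev
    alpha, f0 = alpha0, f_scalar(x)
    for _ in range(max_trials):
        x_new = x + alpha * d
        if f_scalar(x_new) < f0:
            return x_new, d, g
        alpha *= shrink
    return x, d, g  # no improving step found: keep the parent

def de_rand_1(pop, i, rng, F=0.5, CR=0.9):
    """DE/rand/1/bin variation: perturb individual i using three distinct random individuals."""
    a, b, c = pop[rng.choice([j for j in range(len(pop)) if j != i], size=3, replace=False)]
    cross = rng.random(pop.shape[1]) < CR
    cross[rng.integers(pop.shape[1])] = True  # ensure at least one dimension crosses over
    return np.where(cross, a + F * (b - c), pop[i])

# Toy bi-objective problem (assumed for this sketch): f1 = ||x||^2, f2 = ||x - 1||^2.
f = lambda x: np.array([np.sum(x ** 2), np.sum((x - 1.0) ** 2)])
grad_ws = lambda x, w: 2.0 * w[0] * x + 2.0 * w[1] * (x - 1.0)  # gradient of the weighted sum

rng = np.random.default_rng(0)
n_pop, n_var = 10, 1000
pop = rng.uniform(-1.0, 2.0, size=(n_pop, n_var))
weights = [np.array([t, 1.0 - t]) for t in np.linspace(0.05, 0.95, n_pop)]  # one subproblem per solution
dirs, grads = [None] * n_pop, [None] * n_pop  # per-solution conjugate-gradient memory

for gen in range(50):
    for i in range(n_pop):
        w = weights[i]
        scalar = lambda x, w=w: weighted_sum(f(x), w)
        # Convergence: a conjugate-gradient step on this solution's scalarized subproblem.
        x_new, dirs[i], grads[i] = cg_step(pop[i], lambda x, w=w: grad_ws(x, w),
                                           dirs[i], grads[i], scalar)
        # Diversity: a differential-evolution trial, kept only if it helps the subproblem.
        trial = de_rand_1(pop, i, rng)
        if scalar(trial) < scalar(x_new):
            x_new, dirs[i], grads[i] = trial, None, None  # restart CG memory after a DE move
        pop[i] = x_new

print([round(weighted_sum(f(x), w), 3) for x, w in zip(pop, weights)])
```

Note that in the paper's scheme the conjugate gradients and differential evolution update different decision variables of each solution; in this sketch, for brevity, both operators act on the full variable vector and simply compete for the same slot under the scalarized subproblem.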

