Article

Multi-stage dimension reduction for expensive sparse multi-objective optimization problems

Journal

NEUROCOMPUTING
Volume 440, Pages 159-174

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2021.01.115

Keywords

Surrogate-assisted evolutionary algorithms; Dimension reduction; Sparse multi-objective optimization problems

Funding

  1. National Natural Science Foundation of China [61976165]


The proposed multi-stage dimension reduction method effectively reduces the dimension of sparse multi-objective optimization problems through non-dominated-sorting-based feature selection and an adaptive evolutionary process, enabling surrogate-assisted evolutionary algorithms to handle such problems.
A number of sparse multi-objective optimization problems (SMOPs) exist in the real world. The decision variables of their Pareto optimal solutions are not only large-scale but also very sparse: most decision variables are zero, which makes the optimization difficult. Existing multi-objective evolutionary algorithms need many function evaluations to handle the large number of decision variables, which is not affordable for real-world problems with expensive function evaluations. However, applying surrogate-assisted evolutionary algorithms (SAEAs), which were proposed for expensive problems, to SMOPs often runs into the curse of dimensionality as well. To resolve this dilemma, we propose a multi-stage dimension reduction method that makes expensive SMOPs tractable for SAEAs. In the first stage, a non-dominated-sorting-based feature selection is executed by assessing each decision variable independently. The sparsity and the specific non-zero decision variables are then adaptively determined in an evolutionary process, and the dimension of the problem is further reduced accordingly. Next, the number of dimension-reduced subproblems is determined by estimating the potential computation cost from the determined sparsity and non-zero decision variables, and an SAEA is adopted to solve these dimension-reduced subproblems. Each optimal solution obtained is padded with zeros so that its dimension is consistent with the original problem. Because the number of function evaluations required by each problem depends on how the decision variables vary during the dimension reduction process, the evaluation budget of the proposed algorithm is determined adaptively for different problems. Experimental results on a test suite and one application problem show that the proposed algorithm achieves good performance on SMOPs under a limited computation budget. (c) 2021 Elsevier B.V. All rights reserved.
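To make the pipeline described in the abstract concrete, below is a minimal, hypothetical sketch, not the authors' implementation: it scores each decision variable independently, keeps only the most promising variables, optimizes in the reduced subspace, and pads the result with zeros back to the original dimension. The single-objective toy problem sparse_sphere, the plain random search in optimize_reduced, the fixed sparsity level of 5, and all function names are assumptions made for illustration; the paper itself uses non-dominated sorting over multiple objectives and a surrogate-assisted optimizer in the reduced space.

```python
# Hypothetical sketch of the reduce-then-optimize-then-pad idea (not the
# authors' code). Single-objective toy stand-in for the multi-objective case.
import numpy as np

def score_variables(objective, dim, n_samples=20, rng=None):
    """Score each decision variable by how much activating it alone improves
    the objective over the all-zero solution (a crude stand-in for the
    paper's non-dominated-sorting-based feature selection)."""
    rng = np.random.default_rng(rng)
    baseline = objective(np.zeros(dim))
    scores = np.zeros(dim)
    for j in range(dim):
        best = baseline
        for v in rng.uniform(-1.0, 1.0, size=n_samples):
            x = np.zeros(dim)
            x[j] = v
            best = min(best, objective(x))
        scores[j] = baseline - best  # larger score = more useful variable
    return scores

def optimize_reduced(objective, dim, keep, budget=500, rng=None):
    """Random search over the reduced subspace; an SAEA would be used here
    in practice. Candidates are zero everywhere except the kept variables,
    so the returned solution already has the original dimension."""
    rng = np.random.default_rng(rng)
    best_x, best_f = None, np.inf
    for _ in range(budget):
        x = np.zeros(dim)
        x[keep] = rng.uniform(-1.0, 1.0, size=len(keep))
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy sparse objective: only the first 5 of 100 variables matter.
def sparse_sphere(x):
    return float(np.sum((x[:5] - 0.5) ** 2) + np.sum(x[5:] ** 2))

dim = 100
scores = score_variables(sparse_sphere, dim, rng=0)
keep = np.argsort(scores)[-5:]          # assumed sparsity level of 5
x_star, f_star = optimize_reduced(sparse_sphere, dim, keep, rng=0)
print(sorted(keep.tolist()), f_star)    # x_star is zero-padded to full dimension
```

In this sketch the sparsity level is fixed in advance; in the paper it is determined adaptively during the evolutionary process, and the subproblem count follows from an estimate of the remaining evaluation budget.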
