Article

Knowledge guided Bayesian classification for dynamic multi-objective optimization

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 250, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2022.109173

Keywords

Dynamic multi-objective optimization; Naive Bayesian classification; Evolutionary algorithm; Cluster algorithm

Funding

  1. National Natural Science Foundation of China (NSFC) [61876110, JCYJ20190808164211203]
  2. NSFC [U1713212]
  3. Guangdong Pearl River Talent Recruitment Program [U1713212]
  4. Shenzhen Science and Technology Innovation Commission [2019ZT08X603, R2020A045]


This paper proposes a knowledge-guided Bayesian classification method for DMOEA, which achieves robust prediction by fully exploiting the information from all historical environments. The experimental results on multiple DMOP test suites demonstrate that KGB-DMOEA is superior to several state-of-the-art DMOEAs.
Dynamic multi-objective optimization problems (DMOPs) typically contain multiple conflicting objectives that vary over time, requiring optimization algorithms to quickly track the changing Pareto-optimal solutions (POS). Recently, prediction-based dynamic multi-objective evolutionary algorithms (DMOEAs) using transfer learning ideas have been considered promising, as they can accelerate the search towards the POS by transferring information from previous environments to estimate the location of the POS in the next environment. However, most existing methods only transfer the search experiences from one or two previous environments to construct the prediction model for the next moment, which may ignore effective information from the earlier search process and thereby degrade prediction accuracy in some cases. In this paper, a knowledge-guided Bayesian classification for DMOEA, called KGB-DMOEA, is proposed, which aims to achieve a robust prediction by fully exploiting the information from all historical environments. In particular, when an environmental change is detected, a knowledge reconstruction-examination strategy is designed to divide all historical optimal solutions into useful and useless ones, which are used as positive and negative samples, respectively, for training the prediction model in subsequent steps. Then, a non-linear probabilistic classifier, i.e., a naive Bayesian classifier, is constructed using the above training samples, which can fully mine the effective knowledge from all historical environments and predict a high-quality initial population for the new environment. Experimental results on multiple DMOP test suites demonstrate that KGB-DMOEA is superior to several state-of-the-art DMOEAs in solving various DMOPs. (C) 2022 Elsevier B.V. All rights reserved.
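The classification step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes Gaussian likelihoods per feature, toy 2-D decision vectors, and hypothetical function names (`fit_gnb`, `predict_useful`). Historical optimal solutions labeled useful serve as positive samples, useless ones as negative samples, and the trained naive Bayes classifier filters random candidates to seed the initial population for the new environment.

```python
import math
import random

def fit_gnb(X, y):
    """Fit per-class feature means, variances, and priors (Gaussian naive Bayes)."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n, d = len(rows), len(rows[0])
        means = [sum(r[j] for r in rows) / n for j in range(d)]
        # Small constant avoids zero variance on degenerate samples.
        vars_ = [sum((r[j] - means[j]) ** 2 for r in rows) / n + 1e-9
                 for j in range(d)]
        model[c] = (means, vars_, n / len(X))
    return model

def log_posterior(model, x, c):
    """Unnormalized log posterior: log prior + sum of Gaussian log-likelihoods."""
    means, vars_, prior = model[c]
    ll = math.log(prior)
    for xj, m, v in zip(x, means, vars_):
        ll += -0.5 * math.log(2 * math.pi * v) - (xj - m) ** 2 / (2 * v)
    return ll

def predict_useful(model, x):
    """Return the class (1 = useful, 0 = useless) with the higher posterior."""
    return max(model, key=lambda c: log_posterior(model, x, c))

# Toy training data: 2-D decision vectors from past environments.
pos = [[0.10, 0.20], [0.15, 0.25], [0.20, 0.10]]  # "useful" historical solutions
neg = [[0.90, 0.80], [0.85, 0.90], [0.95, 0.85]]  # "useless" historical solutions
X = pos + neg
y = [1] * len(pos) + [0] * len(neg)

gnb = fit_gnb(X, y)

# Seed the next environment: keep only candidates classified as useful.
random.seed(0)
candidates = [[random.random(), random.random()] for _ in range(200)]
seed_pop = [x for x in candidates if predict_useful(gnb, x) == 1]
```

In the actual algorithm the positive/negative labels come from the knowledge reconstruction-examination strategy rather than being given a priori, and the filtered candidates form the initial population handed to the evolutionary search.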
