Article

Average convergence rate of evolutionary algorithms in continuous optimization

Journal

INFORMATION SCIENCES
Volume 562, Pages 200-219

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.12.076

Keywords

Evolutionary algorithm; Continuous optimization; Convergence rate; Markov chain; Approximation error

Funding

  1. National Science Foundation of China [61303028]
  2. Fundamental Research Funds for the Central Universities [WUT: 2020IB006]
  3. EPSRC [EP/I009809/1] (Funding Source: UKRI)

Abstract

This paper presents a theoretical analysis of the average convergence rate (ACR) in continuous optimization, showing that the ACR falls into two categories by its limit behaviour (linear and sublinear) and into two further categories by its dependence on the decision space dimension (polynomial and exponential).
The average convergence rate (ACR) measures how fast the approximation error of an evolutionary algorithm converges to zero per generation. It is defined as the geometric average of the reduction rate of the approximation error over consecutive generations. This paper presents a theoretical analysis of the ACR in continuous optimization. The results are summarized as follows. According to its limit property, the ACR is classified into two categories: (1) linear ACR, whose limit inferior is larger than a positive constant, and (2) sublinear ACR, whose value converges to zero. It is then proven that the ACR is linear for evolutionary programming using positive landscape-adaptive mutation, but sublinear for evolutionary programming using landscape-invariant or zero landscape-adaptive mutation. The relationship between the ACR and the decision space dimension is also classified into two categories: (1) polynomial ACR, whose value is larger than the reciprocal of a polynomial function of the dimension for any generation, and (2) exponential ACR, whose value is less than the reciprocal of an exponential function of the dimension over an exponentially long period. It is proven that for easy problems such as linear functions, the ACR of the (1 + 1) adaptive random univariate search is polynomial, but for hard problems such as the deceptive function, the ACR of both the (1 + 1) adaptive random univariate search and evolutionary programming is exponential. (c) 2021 Elsevier Inc. All rights reserved.
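
The following is a minimal sketch (not from the paper) of how the ACR can be measured empirically, assuming the error sequence e_0, e_1, ..., e_t is recorded once per generation and using the formula ACR_t = 1 - (e_t / e_0)^(1/t), which is one way of writing the geometric average of the per-generation reduction ratios e_k / e_{k-1} described in the abstract. The helper name average_convergence_rate and the toy (1+1) elitist search with a fixed Gaussian mutation step on the sphere function are illustrative assumptions, not the algorithms analysed in the paper.

```python
import numpy as np

def average_convergence_rate(errors):
    """One minus the geometric mean of the per-generation error
    reduction ratios e_k / e_{k-1}, i.e. 1 - (e_t / e_0)**(1/t).

    errors: sequence [e_0, e_1, ..., e_t] of approximation errors,
            e_k = f(x_k) - f_opt for a minimization problem.
    """
    errors = np.asarray(errors, dtype=float)
    t = len(errors) - 1
    if t == 0 or errors[0] == 0.0:
        return 0.0                  # no progress can be measured
    if errors[-1] == 0.0:
        return 1.0                  # exact optimum reached
    return 1.0 - (errors[-1] / errors[0]) ** (1.0 / t)

# Toy usage: a (1+1) elitist search on the sphere function f(x) = ||x||^2
# with a fixed (landscape-invariant) Gaussian mutation step.
rng = np.random.default_rng(0)
dim, sigma, generations = 5, 0.1, 200
x = rng.normal(size=dim)
errors = [float(x @ x)]             # f_opt = 0, so the error equals f(x)
for _ in range(generations):
    y = x + sigma * rng.normal(size=dim)
    if y @ y <= x @ x:              # accept only non-worsening offspring
        x = y
    errors.append(float(x @ x))

print(f"ACR over {generations} generations: "
      f"{average_convergence_rate(errors):.4f}")
```

Because the mutation step above is landscape-invariant, the measured ACR should shrink as the error approaches zero, which is consistent with the abstract's claim that such mutation yields a sublinear ACR; a positive landscape-adaptive step size would be needed to illustrate the linear case.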
