Article

Systematic cost analysis of gradient- and anisotropy-enhanced Bayesian design optimization

Publisher

SPRINGER
DOI: 10.1007/s00158-022-03324-8

Keywords

Bayesian optimization; Gaussian process; Kriging; Topology optimization; Gradient-enhanced Bayesian optimization; Automatic relevance determination


Predicting global optima of non-convex and expensive objective functions is a challenge in many engineering applications. Bayesian optimization is a powerful method for solving such problems, but selecting the surrogate model and hyperparameters that maximize convergence speed is difficult. This study systematically breaks down the computational costs of Bayesian optimization and evaluates two modifications, gradient enhancement and anisotropy-enhanced automatic relevance determination, for improved performance. The results quantify the trade-offs and cost distributions of the two modifications and provide guidelines for applying them to new problems.
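As a point of reference for the unmodified BO baseline discussed above and in the abstract below, here is a minimal sketch of a Bayesian optimization loop with a Gaussian process surrogate and expected-improvement acquisition, written with scikit-learn. It is illustrative only and not the authors' implementation; the one-dimensional objective `f`, its bounds, and the iteration budget are hypothetical stand-ins for an expensive simulation.

```python
# Minimal BO loop sketch: GP surrogate + expected improvement (minimization).
# Not the paper's implementation; f is a hypothetical cheap stand-in.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    # Hypothetical non-convex objective to be minimized.
    return np.sin(3.0 * x) + 0.5 * x**2

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(5, 1))   # initial design points
y = f(X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)

def expected_improvement(Xc, gp, y_best):
    # EI for minimization: E[max(y_best - Y, 0)] under the GP posterior.
    mu, sigma = gp.predict(Xc, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

for it in range(20):
    gp.fit(X, y)                                       # refit surrogate
    Xc = np.linspace(-2.0, 2.0, 1001).reshape(-1, 1)   # candidate grid
    ei = expected_improvement(Xc, gp, y.min())
    x_next = Xc[np.argmax(ei)].reshape(1, -1)          # maximize acquisition
    X = np.vstack([X, x_next])                         # evaluate and augment data
    y = np.append(y, f(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best f:", y.min())
```

Gradient enhancement (GEBO) would additionally condition the surrogate on observed derivatives of the objective, improving the posterior and the acquisition choice at extra cost per sample; that machinery is omitted from this sketch.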
The prediction of global optima for non-convex and expensive objective functions is a pervasive challenge across many engineering applications and research areas. Bayesian optimization (BO) is a powerful method for solving optimization problems of this type, as it replaces expensive evaluations of the objective function with a less expensive Gaussian process or alternative surrogate model. However, selecting the form and hyperparameters of this surrogate model so that it optimally represents the design space and maximizes the convergence rate is a difficult and non-intuitive task. In this work, we conduct a systematic breakdown of the computational costs of the BO framework to reveal how these choices of surrogate formulation and hyperparameters influence overall convergence and prediction quality. We evaluate two qualitatively different modifications of BO for improved performance: gradient-enhanced BO (GEBO) and anisotropy-enhanced automatic relevance determination (ARD). GEBO uses available gradient information about the objective function to improve the quality of the surrogate representation and the selection of the next evaluation point, at the cost of additional expense per sample. In contrast, ARD uses an anisotropic Gaussian process surrogate and a relevancy criterion to shrink the search space of the surrogate model, improving convergence by solving a smaller problem. After a systematic analysis of the hyperparameters of both strategies, the methods were benchmarked on a fluid-mechanics airfoil shape optimization problem and a structural-mechanics origami actuator problem, involving 38 to 84 design variables. GEBO exhibited roughly a 3x speedup on all benchmark problems relative to unmodified BO, while ARD-enriched BO exhibited a 1.55x speedup on select problems. Manifold analysis of the design space revealed that ARD performed best on problems with a contiguous reduced dimension. Collectively, these results highlight the differing trade-offs and cost distributions of the GE and ARD modifications of BO and provide guidelines for implementation in new problems.
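To make the ARD mechanism concrete, the following is a minimal sketch (again, not the authors' implementation) of automatic relevance determination with an anisotropic Gaussian process in scikit-learn: each design variable gets its own length scale, and dimensions whose learned length scales grow large are treated as irrelevant. The toy objective, dimension count, noise level, and relevance threshold are all illustrative assumptions.

```python
# ARD sketch: anisotropic GP with one length scale per design variable.
# Dimensions whose fitted length scales grow large contribute little to
# the objective and can be pruned from the surrogate's search space.
# Toy problem and threshold are assumptions, not values from the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
d = 6
X = rng.uniform(-1.0, 1.0, size=(80, d))
# Only the first two dimensions actually matter in this toy objective.
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2 + 0.01 * rng.standard_normal(80)

# Anisotropic RBF: a separate length scale for each of the d dimensions.
kernel = RBF(length_scale=np.ones(d), length_scale_bounds=(1e-2, 1e3))
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-2,  # alpha absorbs noise
                              normalize_y=True).fit(X, y)

scales = gp.kernel_.length_scale
print("learned length scales:", np.round(scales, 2))

# Hypothetical relevance rule: keep dimensions with short length scales.
relevant = np.where(scales < 10.0)[0]
print("relevant design variables:", relevant)
```

Maximizing the marginal likelihood during fitting pushes the length scales of inert dimensions toward their upper bound; reading relevance off the fitted length scales is what lets an ARD-enriched surrogate solve a smaller, lower-dimensional problem.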
