Article

Accelerated topology optimization by means of deep learning

Journal

STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION
卷 62, 期 3, 页码 1185-1212

Publisher

SPRINGER
DOI: 10.1007/s00158-020-02545-z

Keywords

Topology optimization; Deep learning; Deep belief networks; Restricted Boltzmann machines; Pattern recognition; SIMP

Funding

  1. OptArch project: Optimization Driven Architectural Design of Structures [689983]

This study focuses on enhancing the computational efficiency of the solid isotropic material with penalization (SIMP) approach for solving topology optimization problems. Solving such problems can become extremely time-consuming; to address this, machine learning (ML), and specifically deep neural computing, is integrated in order to accelerate the optimization procedure. The capability of ML-based computational models to extract multiple levels of representation from non-linear input data has been applied successfully to various problems, ranging from time series prediction to pattern recognition. The latter motivated the methodology proposed in the current study, which is based on deep belief networks (DBNs). More specifically, a DBN is calibrated to transform the input data into a new higher-level representation. The input data contain the density fluctuation pattern of the finite element discretization provided by the initial steps of the SIMP approach, and the output data correspond to the resulting distribution of density values over the domain as obtained by SIMP. The representation capabilities and the computational advantages offered by the proposed DBN-based methodology coupled with the SIMP approach are investigated in several benchmark topology optimization test examples, where a reduction of more than one order of magnitude in the iterations originally required by SIMP is observed; the advantages become more pronounced for large-scale problems.
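The building block of the DBNs mentioned in the abstract is the restricted Boltzmann machine (RBM), trained layer-wise with contrastive divergence. The sketch below is not the authors' implementation; it is a minimal, self-contained illustration of a binary RBM trained with one-step contrastive divergence (CD-1), the standard pre-training step for a DBN layer. All names (`RBM`, `cd1_step`, layer sizes, learning rate) are illustrative assumptions; in the paper's setting the visible units would encode element-density patterns from early SIMP iterations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal binary restricted Boltzmann machine trained with CD-1.

    One such layer per level of a DBN; layers are stacked by feeding the
    hidden activations of one RBM as visible data to the next.
    """

    def __init__(self, n_visible, n_hidden, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = 0.01 * self.rng.standard_normal((n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0, lr=0.1):
        """One CD-1 update over a batch; returns reconstruction error."""
        # Positive phase: hidden probabilities given the data.
        h0 = self.hidden_probs(v0)
        h0_sample = (self.rng.random(h0.shape) < h0).astype(float)
        # Negative phase: one Gibbs step back to the visible layer.
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)
        # Approximate gradient of the log-likelihood.
        n = v0.shape[0]
        self.W += lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b += lr * (v0 - v1).mean(axis=0)
        self.c += lr * (h0 - h1).mean(axis=0)
        return float(np.mean((v0 - v1) ** 2))
```

As a usage sketch: binarized density patterns from the first few SIMP iterations would be flattened into the visible vectors, each `cd1_step` call lowers the reconstruction error on that batch, and the learned hidden representation feeds the next DBN layer, which is ultimately used to map early-iteration patterns to the converged density distribution.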
