Journal
STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION
Volume 64, Issue 6, Pages 4355-4365
Publisher
SPRINGER
DOI: 10.1007/s00158-021-03025-8
Keywords
Automatic differentiation; Topology optimization; JAX; Sensitivity analysis; Machine learning; Educational code
Funding
- National Science Foundation [CMMI 1561899]
A critical step in topology optimization (TO) is computing sensitivities. Manual derivation and implementation of sensitivities can be laborious and error-prone, especially for non-trivial objectives, constraints, and material models. An alternative approach is to utilize automatic differentiation (AD). While AD has been around for decades, and has also been applied in TO, its wider adoption has largely been absent. In this educational paper, we aim to reintroduce AD for TO, making it easily accessible through illustrative codes. In particular, we employ JAX, a high-performance Python library, for automatically computing sensitivities from a user-defined TO problem. The resulting framework, referred to here as AuTO, is illustrated through several examples in compliance minimization, compliant mechanism design, and microstructural design.
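To make the idea concrete, the sketch below illustrates the AD principle the abstract refers to: derivatives are propagated through ordinary program operations rather than derived by hand. This is a minimal, self-contained forward-mode dual-number implementation in plain Python, not the AuTO code itself (AuTO uses JAX's `jax.grad` over the full TO pipeline); the SIMP-style stiffness interpolation `stiffness` is an illustrative example function, not taken from the paper.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Each Dual carries a value and a derivative ("dot") that is updated
# alongside the value by the chain rule, operation by operation.

class Dual:
    """Represents a + b*eps with eps**2 == 0; `dot` carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

    def __pow__(self, n):  # integer powers only, for simplicity
        return Dual(self.val ** n, n * self.val ** (n - 1) * self.dot)


def grad(f, x):
    """Derivative of a scalar function f at x via one forward-mode sweep."""
    return f(Dual(x, 1.0)).dot


# Illustrative SIMP-style stiffness interpolation, common in TO:
# E(rho) = Emin + rho**p * (E0 - Emin)
def stiffness(rho, E0=1.0, Emin=1e-9, p=3):
    return Emin + rho ** p * (E0 - Emin)

# dE/drho at rho = 0.5 is p * rho**(p-1) * (E0 - Emin), approximately 0.75
print(grad(stiffness, 0.5))
```

The same sensitivity information falls out of the forward sweep with no hand-derived adjoint; JAX generalizes this to reverse mode, which is what makes gradients of full TO objectives cheap for many design variables.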