Journal
JOURNAL OF THE MECHANICS AND PHYSICS OF SOLIDS
Volume 158, Issue -, Pages -
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.jmps.2021.104697
Keywords
Recurrent neural networks; Surrogate models; Finite elements; Mechanical modeling; Homogenization; Deep learning
This study explores recurrent neural networks as surrogate material models, introducing a new architecture, the Linearized Minimal State Cell (LMSC), that addresses the limitations of existing approaches and performs best on long sequences of small increments.
Recurrent neural networks could serve as surrogate material models, closing the gap between component-level finite element simulations and numerically costly microscale models. Recent efforts relied on gated recurrent neural networks. We show the limits of that approach: these networks are not self-consistent, i.e. their response depends on the increment size. We propose a recurrent neural network architecture that integrates self-consistency in its definition: the Linearized Minimal State Cell (LMSC). While LMSCs can be trained on short sequences, they perform best when applied to long sequences of small increments. We consider an elastoplastic example and train small models with fewer than 5000 parameters that precisely replicate the deviatoric elastoplastic behavior, with an optimal number of state variables. We integrate these models into an explicit finite element framework and demonstrate their performance on component-level simulations with tens of thousands of elements and millions of increments.
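A minimal sketch of the self-consistency property the abstract describes (this is a toy scalar illustration, not the paper's LMSC implementation): a linearized update based on the exact exponential of the evolution rate composes across increments, so two half-size increments reproduce one full increment, whereas a naive incremental (Euler-style) update, as in a standard gated RNN step, does not. The function names and the scalar rate `a` are hypothetical.

```python
import math

def euler_step(x, a, dt):
    # Naive incremental update: the response depends on the increment size dt
    return x * (1.0 + a * dt)

def exact_step(x, a, dt):
    # Linearized update via the exact exponential integrator of dx/dt = a*x:
    # composing two half steps equals one full step (self-consistent)
    return x * math.exp(a * dt)

x0, a, dt = 1.0, -2.0, 0.1

# Self-consistent update: one full increment == two half increments
full = exact_step(x0, a, dt)
halves = exact_step(exact_step(x0, a, dt / 2), a, dt / 2)
print(abs(full - halves))        # ~0: independent of increment size

# Euler-style update: the result changes when the increment is split
e_full = euler_step(x0, a, dt)
e_halves = euler_step(euler_step(x0, a, dt / 2), a, dt / 2)
print(abs(e_full - e_halves))    # nonzero: depends on increment size
```

This is the discrepancy that accumulates over millions of small increments in an explicit finite element run, which is why an architecture that is self-consistent by construction matters there.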