Article

On the importance of self-consistency in recurrent neural network models representing elasto-plastic solids

Journal

Journal of the Mechanics and Physics of Solids

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.jmps.2021.104697

Keywords

Recurrent neural networks; Surrogate models; Finite elements; Mechanical modeling; Homogenization; Deep learning


This study explores recurrent neural networks as surrogate material models and introduces a new architecture, the Linearized Minimal State Cell (LMSC), to address the limitations of existing approaches, demonstrating its performance on long sequences.
Recurrent neural networks could serve as surrogate material models, bridging the gap between component-level finite element simulations and numerically costly microscale models. Recent efforts relied on gated recurrent neural networks. We show the limits of that approach: these networks are not self-consistent, i.e., their response depends on the increment size. We propose a recurrent neural network architecture that integrates self-consistency into its definition: the Linearized Minimal State Cell (LMSC). While LMSCs can be trained on short sequences, they perform best when applied to long sequences of small increments. We consider an elastoplastic example and train small models with fewer than 5000 parameters that precisely replicate the deviatoric elastoplastic behavior with an optimal number of state variables. We integrate these models into an explicit finite element framework and demonstrate their performance on component-level simulations with tens of thousands of elements and millions of increments.
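The self-consistency property the abstract refers to can be checked numerically: a self-consistent incremental model returns (nearly) the same stress for the same strain path regardless of how finely that path is discretized. The sketch below is not the LMSC from the paper; it uses a textbook 1D linear-hardening elastoplastic return-mapping model (all material parameters are illustrative assumptions) purely to show what such an increment-size-insensitivity test looks like. A gated RNN surrogate substituted for `return_map_1d` would generally fail this check.

```python
import math

def return_map_1d(strain_path, E=200e3, H=10e3, sy=250.0):
    """1D elastoplastic return mapping with linear isotropic hardening.

    strain_path: sequence of total strain values (the driving signal).
    E: Young's modulus, H: hardening modulus, sy: initial yield stress
    (illustrative values, MPa). Returns the final stress.
    """
    eps_p, alpha = 0.0, 0.0   # state: plastic strain, accumulated plastic strain
    sigma = 0.0
    for eps in strain_path:
        sigma_trial = E * (eps - eps_p)             # elastic predictor
        f = abs(sigma_trial) - (sy + H * alpha)     # yield function
        if f > 0.0:                                  # plastic corrector
            dgamma = f / (E + H)
            eps_p += math.copysign(dgamma, sigma_trial)
            alpha += dgamma
        sigma = E * (eps - eps_p)
    return sigma

def ramp(n, total=0.01):
    """Monotonic strain ramp to `total`, split into n increments."""
    return [total * (k + 1) / n for k in range(n)]

# Same strain path, coarse vs. fine discretization
s_coarse = return_map_1d(ramp(10))
s_fine = return_map_1d(ramp(1000))

# Self-consistency: final stress is (near-)independent of the increment size
print(abs(s_coarse - s_fine))
```

For monotonic loading this return map is exact per increment, so the coarse and fine runs agree to floating-point precision; that independence from the time discretization is exactly what the proposed architecture is designed to preserve and what standard gated cells lack.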
