4.7 Article

MeLM, a generative pretrained language modeling framework that solves forward and inverse mechanics problems

Journal

Journal of the Mechanics and Physics of Solids

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.jmps.2023.105454

Keywords

Mechanics; Attention; Transformer; Language model; Forward; Inverse; Design; Modeling; Multiscale; Atomistic; Encoding; Representation; Causal; Emergent; Collective; Graph neural network; GPT; Human-machine interaction


We introduce a flexible multi-modal mechanics language model, MeLM, that can solve various nonlinear forward and inverse problems in different fields, such as bio-inspired design, nanotube mechanics, and protein unfolding. MeLM effectively represents large-scale systems and performs well in complex mechanics design problems, offering potential for materials discovery and analysis.
We report a flexible multi-modal mechanics language model, MeLM, applied to solve various nonlinear forward and inverse problems, which can process a set of instructions, numbers, and microstructure data. The framework is applied to various examples including bio-inspired hierarchical honeycomb design, carbon nanotube mechanics, and protein unfolding. In spite of the adaptable nature of the model, which allows us to easily incorporate diverse materials, scales, and mechanical features, it performs well across disparate forward and inverse tasks. Based on an autoregressive attention model, MeLM effectively represents a large multi-particle system consisting of hundreds of millions of neurons, where the interaction potentials are discovered through graph-forming self-attention mechanisms that are then used to identify relationships from emergent structures, while taking advantage of synergies discovered in the training data. We show that the model can solve complex degenerate mechanics design problems and determine novel material architectures across a range of hierarchical levels, providing an avenue for materials discovery and analysis. To illustrate broader possibilities, we outline a human-machine interactive MechGPT model, here trained on a set of 1,103 Wikipedia articles related to mechanics, showing how the general framework can be used not only to solve forward and inverse problems but also for complex language tasks such as summarization, generation of new research concepts, and knowledge extraction. Looking beyond the demonstrations reported in this paper, we discuss other opportunities in applied mechanics and general considerations about the use of large language models in modeling, design, and analysis that can span a broad spectrum of material properties, from mechanical and thermal to optical and electronic.
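The abstract describes MeLM as an autoregressive attention model in which graph-forming self-attention discovers interactions among tokens that encode instructions, numbers, and microstructure data. As an illustration only, not the authors' implementation, the minimal PyTorch sketch below shows the kind of causal, autoregressive self-attention decoder such a framework could be built from; the class names, hyperparameters, and tokenization assumptions here are hypothetical.

```python
# Minimal sketch (not the MeLM code): a causal self-attention decoder of the
# kind an autoregressive mechanics language model could be built from.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalSelfAttention(nn.Module):
    """Multi-head self-attention with a causal (lower-triangular) mask."""

    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.n_heads = n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)  # joint query/key/value projection
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into heads: (B, n_heads, T, head_dim)
        q, k, v = (t.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2) for t in (q, k, v))
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        # Causal mask: each token attends only to itself and earlier tokens.
        mask = torch.tril(torch.ones(T, T, dtype=torch.bool, device=x.device))
        att = att.masked_fill(~mask, float("-inf"))
        # The attention weights define a directed graph over tokens, loosely
        # analogous to the "graph-forming self-attention" wording in the abstract.
        att = F.softmax(att, dim=-1)
        y = (att @ v).transpose(1, 2).reshape(B, T, C)
        return self.proj(y)


class TinyMechanicsLM(nn.Module):
    """Toy autoregressive decoder: embed tokens, apply attention blocks with
    residual connections, and predict the next token of the sequence."""

    def __init__(self, vocab_size: int = 256, d_model: int = 128,
                 n_layers: int = 2, max_len: int = 512):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        self.blocks = nn.ModuleList([CausalSelfAttention(d_model) for _ in range(n_layers)])
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        B, T = idx.shape
        x = self.tok(idx) + self.pos(torch.arange(T, device=idx.device))
        for blk in self.blocks:
            x = x + blk(x)  # residual connection
        return self.head(x)  # next-token logits


if __name__ == "__main__":
    model = TinyMechanicsLM()
    prompt = torch.randint(0, 256, (1, 16))  # dummy tokenized prompt
    logits = model(prompt)
    print(logits.shape)  # torch.Size([1, 16, 256])
```

In the spirit of the abstract, such a decoder could in principle be trained to emit either forward predictions (e.g., a serialized stress-strain response) or inverse designs (e.g., microstructure tokens), depending only on how prompts and targets are encoded as token sequences.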

Authors

Markus J. Buehler
