Article

Transformers for modeling physical systems

Journal

NEURAL NETWORKS
Volume 146, Pages 272-289

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.11.022

Keywords

Transformers; Deep learning; Self-attention; Physics; Koopman; Surrogate modeling

Funding

  1. Defense Advanced Research Projects Agency (DARPA) under the Physics of Artificial Intelligence (PAI) program [HR00111890034]
  2. National Science Foundation (NSF), USA [DGE-1313583]

Abstract

Transformers are widely used in natural language processing due to their ability to model longer-term dependencies in text. Although these models achieve state-of-the-art performance for many language-related tasks, their applicability outside of the natural language processing field has been minimal. In this work, we propose the use of transformer models for the prediction of dynamical systems representative of physical phenomena. The use of Koopman-based embeddings provides a unique and powerful method for projecting any dynamical system into a vector representation, which can then be predicted by a transformer. The proposed model is able to accurately predict various dynamical systems and outperform classical methods that are commonly used in the scientific machine learning literature. © 2021 Elsevier Ltd. All rights reserved.
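To make the pipeline described in the abstract concrete: in Koopman theory, a nonlinear system x_{t+1} = F(x_t) becomes linear in a space of observables g, where the Koopman operator K satisfies g(x_{t+1}) = K g(x_t). The general idea is to learn an encoder that plays the role of g and let a causal transformer advance the embedded state. The sketch below is a minimal illustration of that structure, not the authors' implementation; the module names (KoopmanEmbedding, PhysicsTransformer), layer sizes, and the toy training loss are all assumptions, and positional encoding is omitted for brevity.

```python
# Minimal sketch (not the paper's code): a learned embedding network stands in
# for Koopman observables g(x), and a standard causal transformer predicts the
# embedded state sequence one step ahead. All names and sizes are illustrative.
import torch
import torch.nn as nn

class KoopmanEmbedding(nn.Module):
    """Encode physical states x_t into a latent 'observable' space and back."""
    def __init__(self, state_dim: int, embed_dim: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(), nn.Linear(128, embed_dim))
        self.decoder = nn.Sequential(
            nn.Linear(embed_dim, 128), nn.ReLU(), nn.Linear(128, state_dim))

    def forward(self, x):                   # x: (batch, time, state_dim)
        z = self.encoder(x)                 # latent observables g(x_t)
        return z, self.decoder(z)           # reconstruction for training loss

class PhysicsTransformer(nn.Module):
    """Predict the next latent state from the embedded history (causal mask)."""
    def __init__(self, embed_dim: int = 32, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=embed_dim, nhead=n_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, z):                   # z: (batch, time, embed_dim)
        t = z.size(1)
        mask = nn.Transformer.generate_square_subsequent_mask(t).to(z.device)
        return self.transformer(z, mask=mask)  # output at step t predicts z_{t+1}

# Toy usage: embed a trajectory, predict one step ahead in latent space.
embed = KoopmanEmbedding(state_dim=3, embed_dim=32)   # e.g. a Lorenz-like system
model = PhysicsTransformer(embed_dim=32)
x = torch.randn(8, 16, 3)                             # (batch, time, state)
z, x_recon = embed(x)
z_pred = model(z)                                     # z_pred[:, t] ~ z[:, t+1]
loss = nn.functional.mse_loss(z_pred[:, :-1], z[:, 1:]) \
     + nn.functional.mse_loss(x_recon, x)             # prediction + reconstruction
```

In this framing, rollout at test time would feed the transformer's latent prediction back as input and decode it to a physical state at each step; the reconstruction term keeps the latent space decodable, a common design choice in Koopman-style learned embeddings.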
