Journal
NEURAL NETWORKS
Volume 146, Pages 272-289
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2021.11.022
Keywords
Transformers; Deep learning; Self-attention; Physics; Koopman; Surrogate modeling
Funding
- Defense Advanced Research Projects Agency (DARPA) under the Physics of Artificial Intelligence (PAI) program [HR00111890034]
- National Science Foundation (NSF), USA [DGE-1313583]
Transformers are widely used in natural language processing due to their ability to model longer-term dependencies in text. Although these models achieve state-of-the-art performance on many language-related tasks, their applicability outside of the natural language processing field has been minimal. In this work, we propose the use of transformer models for the prediction of dynamical systems representative of physical phenomena. Koopman-based embeddings provide a unique and powerful method for projecting any dynamical system into a vector representation, which can then be predicted by a transformer. The proposed model accurately predicts various dynamical systems and outperforms classical methods that are commonly used in the scientific machine learning literature. (C) 2021 Elsevier Ltd. All rights reserved.
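The pipeline the abstract describes — lift the system state into an embedding space via a Koopman-style observable map, then let a causally masked self-attention model predict the embedded trajectory — can be sketched minimally as follows. This is an illustrative toy with random, untrained weights, not the authors' implementation; the function names (`koopman_embed`, `causal_self_attention`) and all dimensions are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def koopman_embed(x, W):
    # Koopman-style lift: map raw states into a higher-dimensional
    # observable (embedding) space via a learned nonlinear map.
    return np.tanh(x @ W)

def causal_self_attention(Z, Wq, Wk, Wv):
    # Single-head self-attention over time steps with a causal mask,
    # so each predicted embedding depends only on past/present states.
    Q, K, V = Z @ Wq, Z @ Wk, Z @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    T = scores.shape[0]
    scores[np.triu(np.ones((T, T), dtype=bool), k=1)] = -np.inf  # mask future
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

state_dim, embed_dim, T = 2, 16, 10
trajectory = rng.standard_normal((T, state_dim))      # toy dynamical-system states
W_embed = rng.standard_normal((state_dim, embed_dim))
Wq = rng.standard_normal((embed_dim, embed_dim))
Wk = rng.standard_normal((embed_dim, embed_dim))
Wv = rng.standard_normal((embed_dim, embed_dim))
W_dec = rng.standard_normal((embed_dim, state_dim))   # decode embedding -> state

Z = koopman_embed(trajectory, W_embed)                # (T, embed_dim)
Z_pred = causal_self_attention(Z, Wq, Wk, Wv)         # predicted embeddings
x_pred = Z_pred @ W_dec                               # back to state space, (T, state_dim)
print(x_pred.shape)                                   # (10, 2)
```

In the paper's actual setup the embedding network and the transformer are trained (the transformer in full multi-head, multi-layer form); this sketch only shows the data flow: state trajectory → Koopman embedding → causally masked attention → decoded state prediction.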