Journal
PHYSICA D-NONLINEAR PHENOMENA
Volume: 409, Issue: -, Pages: -
Publisher
ELSEVIER
DOI: 10.1016/j.physd.2020.132471
Keywords
Hybrid analysis and modeling; Supervised machine learning; Long short-term memory; Model reduction; Galerkin projection; Grassmann manifold
Funding
- U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research [DE-SC0019290]
- National Science Foundation [DMS1821145]
Abstract
In this paper, we introduce an uplifted reduced order modeling (UROM) approach through the integration of standard projection-based methods with long short-term memory (LSTM) embedding. Our approach has three modeling layers or components. In the first layer, we utilize an intrusive projection approach to model the dynamics represented by the largest modes. The second layer consists of an LSTM model to account for residuals beyond this truncation. This closure layer incorporates the residual effect of the discarded modes into the dynamics of the largest scales. However, the feasibility of generating a low-rank approximation deteriorates for systems with higher Kolmogorov n-width due to the underlying nonlinear processes. The third, uplifting layer, called super-resolution, addresses this limited representation issue by expanding the span into a larger number of modes utilizing the versatility of LSTM. Therefore, our model integrates a physics-based projection model with a memory-embedded LSTM closure and an LSTM-based super-resolution model. In several applications, we exploit the Grassmann manifold to construct UROM for unseen conditions. We perform numerical experiments using the Burgers and Navier-Stokes equations with quadratic nonlinearity. Our results show the robustness of the proposed approach in building reduced order models for parameterized systems and confirm the improved trade-off between accuracy and efficiency. (C) 2020 Elsevier B.V. All rights reserved.
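The three-layer pipeline described in the abstract can be sketched as follows. This is a minimal illustration on synthetic snapshot data, not the paper's implementation: the LSTM closure and super-resolution layers are replaced by a linear least-squares stand-in, and all variable names (`a_r`, `a_q`, `W`, etc.) are hypothetical.

```python
import numpy as np

# Synthetic snapshot matrix: 200 spatial points, 50 time snapshots.
x = np.linspace(0.0, 1.0, 200)
times = np.linspace(0.0, 1.0, 50)
snapshots = np.array(
    [np.sin(2 * np.pi * (x - 0.2 * tk)) * np.exp(-tk)
     + 0.1 * np.sin(8 * np.pi * (x - tk)) for tk in times]
).T  # shape (200, 50)

# Layer 1 (projection): POD basis from the SVD; keep the r largest modes.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
r, q = 4, 12                         # r resolved modes, uplifted to q > r
a_r = U[:, :r].T @ snapshots         # resolved modal coefficients, (r, 50)
a_q = U[:, :q].T @ snapshots         # reference coefficients on the larger span

# Truncated (r-mode) reconstruction and its relative error.
recon_r = U[:, :r] @ a_r
err_r = np.linalg.norm(recon_r - snapshots) / np.linalg.norm(snapshots)

# Layers 2-3 stand-in (closure + super-resolution): learn a map from the r
# resolved coefficients to all q coefficients. The paper uses LSTMs for this
# step; a linear least-squares fit serves here purely as an illustration.
W, *_ = np.linalg.lstsq(a_r.T, a_q.T, rcond=None)
a_q_pred = (a_r.T @ W).T             # predicted q-mode coefficients, (q, 50)

# Uplifted reconstruction on the larger q-mode span.
recon_q = U[:, :q] @ a_q_pred
err_uplift = np.linalg.norm(recon_q - snapshots) / np.linalg.norm(snapshots)
print(f"r-mode error: {err_r:.3f}, uplifted error: {err_uplift:.3f}")
```

Because the least-squares map can only improve on plain truncation (its first r rows reproduce the resolved coefficients exactly, and the higher rows recover whatever part of the discarded modes correlates with the resolved ones), the uplifted error is never worse than the r-mode error; the paper's LSTM layers play the same role with a nonlinear, memory-embedded map.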