4.6 Article

Time-series learning of latent-space dynamics for reduced-order model closure

Journal

PHYSICA D-NONLINEAR PHENOMENA
Volume 405, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.physd.2020.132368

Keywords

ROMs; LSTMs; Neural ODEs; Closures

Funding

  1. U.S. Department of Energy (DOE), Office of Science, Office of Advanced Scientific Computing Research, USA [DE-AC02-06CH11357]
  2. DOE Office of Science User Facility, USA [DE-AC02-06CH11357]
  3. Los Alamos National Laboratory, USA, 2019 LDRD grant "Machine Learning for Turbulence"
  4. National Nuclear Security Administration of the U.S. Department of Energy [89233218CNA000001]

Abstract

We study the performance of long short-term memory networks (LSTMs) and neural ordinary differential equations (NODEs) in learning latent-space representations of dynamical equations for an advection-dominated problem given by the viscous Burgers equation. Our formulation is devised in a nonintrusive manner with an equation-free evolution of dynamics in a reduced space with the latter being obtained through a proper orthogonal decomposition. In addition, we leverage the sequential nature of learning for both LSTMs and NODEs to demonstrate their capability for closure in systems that are not completely resolved in the reduced space. We assess our hypothesis for two advection-dominated problems given by the viscous Burgers equation. We observe that both LSTMs and NODEs are able to reproduce the effects of the absent scales for our test cases more effectively than does intrusive dynamics evolution through a Galerkin projection. This result empirically suggests that time-series learning techniques implicitly leverage a memory kernel for coarse-grained system closure as is suggested through the Mori-Zwanzig formalism. (c) 2020 Elsevier B.V. All rights reserved.
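
To make the workflow concrete, below is a minimal sketch (in Python, not the authors' code) of the nonintrusive pipeline the abstract outlines: a POD basis is extracted from solution snapshots via the SVD, the snapshots are projected onto the retained modes, and an LSTM is trained to advance the resulting latent coefficients in time. All variable names, dimensions, and network sizes are illustrative assumptions.

    import numpy as np
    import torch
    import torch.nn as nn

    # Hypothetical snapshot matrix: rows = grid points, columns = time steps.
    n_x, n_t, n_modes, seq_len = 256, 500, 8, 10
    snapshots = np.random.rand(n_x, n_t)      # stand-in for viscous Burgers solutions

    # POD: left singular vectors of the mean-subtracted snapshots give the reduced basis.
    mean = snapshots.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    basis = U[:, :n_modes]                    # retained POD modes
    coeffs = basis.T @ (snapshots - mean)     # latent trajectory, shape (n_modes, n_t)

    class LatentLSTM(nn.Module):
        """Maps a short window of past POD coefficients to the next coefficient vector."""
        def __init__(self, n_modes, hidden=64):
            super().__init__()
            self.lstm = nn.LSTM(n_modes, hidden, batch_first=True)
            self.head = nn.Linear(hidden, n_modes)

        def forward(self, window):            # window: (batch, seq_len, n_modes)
            out, _ = self.lstm(window)
            return self.head(out[:, -1, :])

    # Build (window -> next step) training pairs from the latent time series.
    a = torch.tensor(coeffs.T, dtype=torch.float32)    # (n_t, n_modes)
    X = torch.stack([a[i:i + seq_len] for i in range(n_t - seq_len)])
    y = a[seq_len:]

    model = LatentLSTM(n_modes)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(200):                      # illustrative training loop
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()

At prediction time, such a network would be stepped recursively from an initial window of projected data, with full-field reconstructions recovered as the basis times the predicted coefficients plus the mean; a neural ODE variant would instead learn a right-hand side for the latent state and integrate it with an ODE solver.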
