Article

Learning nonlinear turbulent dynamics from partial observations via analytically solvable conditional statistics

Journal

JOURNAL OF COMPUTATIONAL PHYSICS
Volume 418, Article 109635

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jcp.2020.109635

Keywords

Expectation-maximization approach; Nonlinear optimal smoother; Short training data; Physics constraint; Block decomposition; Sparse identification

Funding

  1. Office of Vice Chancellor for Research and Graduate Education (VCRGE) at University of Wisconsin-Madison
  2. Office of Naval Research (ONR) MURI [N00014-19-1-2421]

Abstract

Learning nonlinear turbulent dynamics from partial observations is an important and challenging topic. In this article, an efficient learning algorithm based on the expectation-maximization approach is developed for a rich class of complex nonlinear turbulent dynamics using short training data. Despite the significant nonlinear and non-Gaussian features of these models, their analytically solvable conditional statistics allow the development of an exact and accurate nonlinear optimal smoother for recovering the hidden variables, which facilitates efficient learning of these fully nonlinear models with extreme events. Three additional ingredients are then incorporated into the basic algorithm to improve the learning process. First, the physics constraint requiring conservation of energy in the quadratic nonlinear terms is taken into account. It plays an important role in preventing finite-time blowup of the solution and other pathological behavior of the recovered model. Second, a judicious block decomposition is applied to many large-dimensional nonlinear systems. It greatly accelerates the calculation of the high-dimensional conditional covariance matrices and allows extremely cheap parallel computation of the model parameters. Third, sparse identification of the complex turbulent models is combined with the learning algorithm, leading to parsimonious models. Numerical tests show the skill of the algorithm in learning the nonlinear dynamics and the non-Gaussian statistics with extreme events in both perfect-model and model-error scenarios. It is also shown that, in the presence of noise and partial observations, the model is not uniquely identified: different nonlinear models can all capture the key non-Gaussian features and obtain the same ensemble forecast skill for the observed variables as the perfect model, yet they may have distinct responses to external perturbations.
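The "analytically solvable conditional statistics" in the abstract refer to a conditional Gaussian structure: conditioned on the trajectory of the observed variables, the hidden variables remain Gaussian, and their conditional mean and covariance satisfy closed-form equations. The sketch below is not the authors' code; the function names, the constant-noise assumption, and the generic model form du1 = (A0(u1) + A1(u1) u2) dt + S1 dW1, du2 = (a0(u1) + a1(u1) u2) dt + S2 dW2 are illustrative assumptions, and only the forward filter is shown (the nonlinear optimal smoother used in the E-step adds a backward pass built on these same statistics).

```python
import numpy as np

def conditional_gaussian_filter(u1, dt, A0, A1, a0, a1, S1, S2, mu0, R0):
    """Forward filter for the hidden variables u2 given a trajectory of the
    observed variables u1.  Returns the conditional means and covariances."""
    n_steps = u1.shape[0]
    k = mu0.size
    mus = np.zeros((n_steps, k))
    Rs = np.zeros((n_steps, k, k))
    mus[0], Rs[0] = mu0, R0
    inv_S1S1 = np.linalg.inv(S1 @ S1.T)        # Gramian of the observed-variable noise
    for n in range(n_steps - 1):
        x, mu, R = u1[n], mus[n], Rs[n]
        # Innovation: observed increment minus its conditional prediction.
        innov = u1[n + 1] - x - (A0(x) + A1(x) @ mu) * dt
        gain = R @ A1(x).T @ inv_S1S1
        # Closed-form (Euler-discretized) updates of the conditional mean and covariance.
        mus[n + 1] = mu + (a0(x) + a1(x) @ mu) * dt + gain @ innov
        Rs[n + 1] = R + (a1(x) @ R + R @ a1(x).T + S2 @ S2.T
                         - gain @ A1(x) @ R) * dt
        Rs[n + 1] = 0.5 * (Rs[n + 1] + Rs[n + 1].T)   # keep the covariance symmetric
    return mus, Rs
```

Within the EM iteration described above, the smoothed conditional moments of the hidden variables would then supply the statistics needed for the M-step parameter update, with the energy-conserving physics constraint imposed on the quadratic coefficients.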

