Journal
PHYSICA D-NONLINEAR PHENOMENA
Volume 416
Publisher
ELSEVIER
DOI: 10.1016/j.physd.2020.132797
Keywords
Reduced-order models; Deep learning; Gaussian process regression
Funding
- Margaret Butler Fellowship at the Argonne Leadership Computing Facility, United States
- Wave 1 of The UKRI Strategic Priorities Fund under the EPSRC [EP/T001569/1]
- Digital Twins for Complex Engineering Systems
- Alan Turing Institute
- Imperial College Research Fellowship scheme, United Kingdom
- U.S. Department of Energy (DOE), Office of Science, Office of Advanced Scientific Computing Research [DE-AC02-06CH11357]
- DOE Office of Science User Facility, United States [DE-AC02-06CH11357]
- EPSRC [EP/T001569/1, EP/T000414/1] Funding Source: UKRI
Non-intrusive reduced-order models (ROMs) provide a low-dimensional emulation framework for high-dimensional systems through a purely data-driven construction algorithm. The use of a novel latent-space interpolation algorithm based on Gaussian process regression allows for interpolation in both space and time, offering information and uncertainty evaluation for full-state evolution.
Non-intrusive reduced-order models (ROMs) have recently generated considerable interest for constructing computationally efficient counterparts of nonlinear dynamical systems emerging from various domain sciences. They provide a low-dimensional emulation framework for systems that may be intrinsically high-dimensional. This is accomplished by utilizing a construction algorithm that is purely data-driven. It is no surprise, therefore, that the algorithmic advances of machine learning have led to non-intrusive ROMs with greater accuracy and computational gains. However, in bypassing an equation-based evolution, the interpretability of the ROM framework often suffers. This becomes more problematic when black-box deep learning methods are used, as these are notorious for lacking robustness outside the physical regime of the observed data. In this article, we propose the use of a novel latent-space interpolation algorithm based on Gaussian process regression. Notably, this reduced-order evolution of the system is parameterized by control parameters to allow for interpolation in space. The procedure also treats time as a continuous variable, enabling temporal interpolation. The latter aspect provides information, with quantified uncertainty, about full-state evolution at a finer resolution than that utilized for training the ROMs. We assess the viability of this algorithm for an advection-dominated system given by the inviscid shallow water equations. (C) 2020 Elsevier B.V. All rights reserved.
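The latent-space approach described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: it assumes a standard POD (SVD) compression of snapshot data and uses scikit-learn's `GaussianProcessRegressor` to map a (control parameter, time) pair to the latent coefficients, yielding a predictive mean and uncertainty for the reconstructed full state. All variable names and the toy advecting-wave data are illustrative.

```python
# Sketch of a non-intrusive POD-GPR ROM (illustrative, not the paper's code):
# compress snapshots with a POD basis, then regress latent coefficients on
# (parameter, time) with a Gaussian process for space-time interpolation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Toy snapshot data: a wave advecting at speed mu, sampled on a (mu, t) grid.
mus = np.linspace(0.5, 1.5, 5)           # control parameter values
ts = np.linspace(0.0, 1.0, 20)           # snapshot times
x = np.linspace(0.0, 1.0, 200)           # spatial grid (state dimension 200)
snapshots = np.stack([np.sin(2 * np.pi * (x - mu * t))
                      for mu in mus for t in ts])        # shape (100, 200)
mean_state = snapshots.mean(axis=0)

# POD via thin SVD of mean-subtracted snapshots; retain r modes.
r = 4
_, _, Vt = np.linalg.svd(snapshots - mean_state, full_matrices=False)
basis = Vt[:r].T                          # (200, r) spatial modes
coeffs = (snapshots - mean_state) @ basis  # (100, r) latent coefficients

# GPR in the latent space over inputs (mu, t): interpolation in both
# the parameter ("space") and time directions.
inputs = np.array([[mu, t] for mu in mus for t in ts])
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF([0.2, 0.2]),
                              normalize_y=True)
gp.fit(inputs, coeffs)

# Query an unseen parameter at a time finer than the training resolution;
# the GP returns a mean and a standard deviation (quantified uncertainty).
query = np.array([[1.05, 0.37]])
latent_mean, latent_std = gp.predict(query, return_std=True)
state = mean_state + latent_mean @ basis.T  # reconstructed full state, (1, 200)
print(state.shape)
```

The predictive standard deviation in the latent space can be propagated through the (linear) POD reconstruction, which is how a framework of this kind attaches uncertainty to full-state predictions between and beyond the training snapshots.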