Article

Deep learning of thermodynamics-aware reduced-order models from data

Journal

Computer Methods in Applied Mechanics and Engineering

Publisher

ELSEVIER SCIENCE SA
DOI: 10.1016/j.cma.2021.113763

Keywords

Deep learning; Sparse autoencoders; Thermodynamics; Model reduction; Structure preserving

Funding

  1. ESI Group through the ESI Chair at ENSAM Arts et Metiers Institute of Technology
  2. University of Zaragoza [2019-0060]
  3. Spanish Ministry of Economy and Competitiveness [CICYT-DPI2017-85139-C2-1-R]
  4. Regional Government of Aragon
  5. European Social Fund


Abstract

We present an algorithm to learn the relevant latent variables of a large-scale discretized physical system and predict its time evolution using thermodynamically consistent deep neural networks. Our method relies on sparse autoencoders, which reduce the dimensionality of the full-order model to a set of sparse latent variables with no prior knowledge of the coded space dimensionality. A second neural network is then trained to learn the metriplectic structure of those reduced physical variables and predict their time evolution with a so-called structure-preserving neural network. This data-based integrator is guaranteed to conserve the total energy of the system and to satisfy the entropy inequality, and can be applied to both conservative and dissipative systems. The integrated paths can then be decoded to the original full-dimensional manifold and compared to the ground-truth solution. The method is tested on two examples from fluid and solid mechanics. (C) 2021 Elsevier B.V. All rights reserved.
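The metriplectic structure mentioned in the abstract refers to the GENERIC formalism, in which the state evolves as dz/dt = L ∇E + M ∇S, with a skew-symmetric L, a symmetric positive semi-definite M, and the degeneracy conditions L ∇S = 0 and M ∇E = 0 that enforce energy conservation and the entropy inequality. The following minimal NumPy sketch illustrates this mechanism on a hand-written damped harmonic oscillator with state z = (q, p, s); the particular matrices, the friction coefficient `gamma`, and the explicit Euler step are illustrative assumptions for this toy problem, not the paper's trained networks.

```python
import numpy as np

# Toy GENERIC/metriplectic system: damped harmonic oscillator with an
# entropy variable, z = (q, p, s). Illustrative sketch only.
gamma = 0.1  # assumed friction coefficient for this example

def grad_E(z):
    q, p, s = z
    return np.array([q, p, 1.0])  # E = q^2/2 + p^2/2 + s

def grad_S(z):
    return np.array([0.0, 0.0, 1.0])  # S = s

def L_op(z):
    # Skew-symmetric Poisson matrix (reversible part); L @ grad_S == 0
    return np.array([[0.0, 1.0, 0.0],
                     [-1.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0]])

def M_op(z):
    # Symmetric positive semi-definite friction matrix (irreversible
    # part); by construction M @ grad_E == 0
    q, p, s = z
    return gamma * np.array([[0.0, 0.0, 0.0],
                             [0.0, 1.0, -p],
                             [0.0, -p, p * p]])

def step(z, h):
    # dz/dt = L grad(E) + M grad(S); the degeneracy conditions make the
    # continuous dynamics conserve E exactly and never decrease S.
    dz = L_op(z) @ grad_E(z) + M_op(z) @ grad_S(z)
    return z + h * dz  # explicit Euler: O(h^2) energy drift per step

z = np.array([1.0, 0.0, 0.0])
E0 = 0.5 * (z[0] ** 2 + z[1] ** 2) + z[2]
for _ in range(1000):
    z = step(z, 1e-3)
E1 = 0.5 * (z[0] ** 2 + z[1] ** 2) + z[2]
# Energy drift stays small (discretization error only); entropy s grows.
```

In the paper's setting, L, M, E, and S act on the sparse latent variables learned by the autoencoder and are themselves parameterized by the structure-preserving network, with the degeneracy conditions imposed during training rather than written down by hand as above.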

Authors

Q. Hernández, A. Badías, D. González, F. Chinesta, E. Cueto
