Article

GFINNs: GENERIC formalism informed neural networks for deterministic and stochastic dynamical systems

Publisher

The Royal Society
DOI: 10.1098/rsta.2021.0207

Keywords

data-driven discovery; physics-informed neural networks; GENERIC formalism; interpretable scientific machine learning

Abstract

We propose GENERIC formalism informed neural networks (GFINNs), which obey the symmetric degeneracy conditions of the GENERIC formalism. GFINNs comprise two modules, each of which contains two components. We model each component with a neural network whose architecture is designed to satisfy the required conditions. This component-wise architecture design provides flexible ways of embedding available physics information into the neural networks. We prove theoretically that GFINNs are sufficiently expressive to learn the underlying equations, thereby establishing a universal approximation theorem. We demonstrate the performance of GFINNs in three simulation problems: gas containers exchanging heat and volume, the thermoelastic double pendulum and Langevin dynamics. In all the examples, GFINNs outperform existing methods, demonstrating good predictive accuracy for both deterministic and stochastic systems. This article is part of the theme issue 'Data-driven prediction in dynamical systems'.
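For context, the GENERIC formalism writes the dynamics as dx/dt = L(x) ∇E(x) + M(x) ∇S(x), where E is an energy, S is an entropy, L is skew-symmetric, M is symmetric positive semi-definite, and the degeneracy conditions L ∇S = 0 and M ∇E = 0 must hold. The sketch below illustrates one simple way to parameterize such a right-hand side with neural networks so that the degeneracy conditions hold by construction, using a projection trick; it is not the authors' component-wise GFINN architecture, and all class and function names (MLP, GenericRHS, outer) are hypothetical.

```python
# Minimal sketch (not the GFINN architecture): a projection-based GENERIC
# right-hand side  dx/dt = L(x) grad E(x) + M(x) grad S(x)  whose degeneracy
# conditions  L grad S = 0  and  M grad E = 0  hold by construction.
import torch
import torch.nn as nn


def outer(v):
    """Batched outer product v v^T: (B, n, 1) @ (B, 1, n) -> (B, n, n)."""
    return v.unsqueeze(-1) @ v.unsqueeze(-2)


class MLP(nn.Module):
    """Small fully connected network used for E, S and the matrix factors."""
    def __init__(self, dim_in, dim_out, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, dim_out),
        )

    def forward(self, x):
        return self.net(x)


class GenericRHS(nn.Module):
    """dx/dt = L(x) grad E(x) + M(x) grad S(x) with the GENERIC structure."""
    def __init__(self, dim):
        super().__init__()
        self.dim = dim
        self.E = MLP(dim, 1)            # energy (scalar potential)
        self.S = MLP(dim, 1)            # entropy (scalar potential)
        self.A = MLP(dim, dim * dim)    # raw factor for the skew-symmetric L
        self.B = MLP(dim, dim * dim)    # raw factor for the PSD friction M

    def forward(self, x):
        if not x.requires_grad:
            x = x.requires_grad_(True)
        gE = torch.autograd.grad(self.E(x).sum(), x, create_graph=True)[0]
        gS = torch.autograd.grad(self.S(x).sum(), x, create_graph=True)[0]

        n = self.dim
        A = self.A(x).reshape(-1, n, n)
        B = self.B(x).reshape(-1, n, n)
        L0 = A - A.transpose(1, 2)      # skew-symmetric by construction
        M0 = B @ B.transpose(1, 2)      # symmetric positive semi-definite

        # Projectors that remove the grad S / grad E directions.  They keep
        # skew-symmetry of L and positive semi-definiteness of M while
        # enforcing the degeneracies L grad S = 0 and M grad E = 0 exactly.
        I = torch.eye(n, device=x.device).expand_as(L0)
        PS = I - outer(gS) / ((gS * gS).sum(-1, keepdim=True).unsqueeze(-1) + 1e-8)
        PE = I - outer(gE) / ((gE * gE).sum(-1, keepdim=True).unsqueeze(-1) + 1e-8)
        L = PS @ L0 @ PS
        M = PE @ M0 @ PE

        return (L @ gE.unsqueeze(-1) + M @ gS.unsqueeze(-1)).squeeze(-1)


if __name__ == "__main__":
    model = GenericRHS(dim=4)
    x = torch.randn(8, 4)
    print(model(x).shape)   # torch.Size([8, 4])
```

The projector P = I - g gᵀ / |g|² is symmetric, so sandwiching L0 and M0 preserves skew-symmetry and positive semi-definiteness while annihilating the required gradient direction. The paper's component-wise design instead builds the degeneracy conditions into each component network directly, which is what underpins the expressiveness (universal approximation) result stated in the abstract.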
