Article

Hidden physics models: Machine learning of nonlinear partial differential equations

Journal

JOURNAL OF COMPUTATIONAL PHYSICS
Volume 357, Pages 125-141

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jcp.2017.11.039

Keywords

Probabilistic machine learning; System identification; Bayesian modeling; Uncertainty quantification; Fractional equations; Small data

Funding

  1. DARPA EQUiPS [N66001-15-2-4055]
  2. MURI/ARO [W911NF-15-1-0562]
  3. AFOSR [FA9550-17-1-0013]

Abstract

While there is currently a lot of enthusiasm about big data, useful data is usually small and expensive to acquire. In this paper, we present a new paradigm of learning partial differential equations from small data. In particular, we introduce hidden physics models, which are essentially data-efficient learning machines capable of leveraging the underlying laws of physics, expressed by time dependent and nonlinear partial differential equations, to extract patterns from high-dimensional data generated from experiments. The proposed methodology may be applied to the problem of learning, system identification, or data-driven discovery of partial differential equations. Our framework relies on Gaussian processes, a powerful tool for probabilistic inference over functions, that enables us to strike a balance between model complexity and data fitting. The effectiveness of the proposed approach is demonstrated through a variety of canonical problems, spanning a number of scientific domains, including the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and time dependent linear fractional equations. The methodology provides a promising new direction for harnessing the long-standing developments of classical methods in applied mathematics and mathematical physics to design learning machines with the ability to operate in complex domains without requiring large quantities of data. © 2017 Elsevier Inc. All rights reserved.
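To make the Gaussian-process construction described in the abstract concrete, the following is a minimal sketch of the idea on the simplest possible case: learning an unknown diffusion coefficient lam of the 1D heat equation u_t = lam * u_xx from two noisy solution snapshots linked by a single backward-Euler step. An RBF prior is placed on the later snapshot, the earlier snapshot is obtained by pushing that prior through the discretized operator, and lam is estimated jointly with the kernel hyperparameters by maximizing the marginal likelihood. This sketch was written for this record and is not the authors' code; the heat-equation example, the RBF kernel, the Nelder-Mead optimizer, and every function name and numerical setting below are illustrative assumptions.

# Minimal sketch (illustrative, not the authors' code) of a Gaussian-process
# hidden physics model for the 1D heat equation
#   u_t = lam * u_xx,
# where the unknown coefficient `lam` is learned from two noisy snapshots
# linked by a single backward-Euler step: u^{n-1} = u^n - dt*lam*u^n_xx.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize


def joint_covariance(x, log_sigma, log_ell, lam, dt):
    """Covariance of the stacked vector [u^n(x), u^{n-1}(x)] obtained by
    pushing an RBF prior on u^n through the backward-Euler operator."""
    s2, l2 = np.exp(2.0 * log_sigma), np.exp(2.0 * log_ell)
    r = x[:, None] - x[None, :]
    k = s2 * np.exp(-0.5 * r**2 / l2)                        # k(x, x')
    d2 = k * (r**2 - l2) / l2**2                             # d2k/dx2 = d2k/dx'2
    d4 = k * (r**4 - 6.0 * r**2 * l2 + 3.0 * l2**2) / l2**4  # d4k/dx2dx'2
    k_nn = k
    k_nm = k - dt * lam * d2                                 # cov(u^n, u^{n-1})
    k_mm = k - 2.0 * dt * lam * d2 + (dt * lam) ** 2 * d4    # cov(u^{n-1}, u^{n-1})
    return np.block([[k_nn, k_nm], [k_nm.T, k_mm]])


def negative_log_marginal_likelihood(params, x, u_n, u_m, dt):
    log_sigma, log_ell, lam, log_noise = params
    K = joint_covariance(x, log_sigma, log_ell, lam, dt)
    K += (np.exp(2.0 * log_noise) + 1e-8) * np.eye(K.shape[0])
    y = np.concatenate([u_n, u_m])
    try:
        c, low = cho_factor(K)
    except np.linalg.LinAlgError:        # penalize numerically unstable settings
        return 1e8
    alpha = cho_solve((c, low), y)
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(c)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lam_true, dt, t_n = 1.0, 0.01, 0.10                      # illustrative values
    x = np.linspace(0.0, 1.0, 30)
    exact = lambda t: np.exp(-lam_true * np.pi**2 * t) * np.sin(np.pi * x)
    u_n = exact(t_n) + 1e-4 * rng.standard_normal(x.size)        # snapshot at t_n
    u_m = exact(t_n - dt) + 1e-4 * rng.standard_normal(x.size)   # snapshot at t_n - dt

    x0 = np.array([0.0, np.log(0.2), 0.5, np.log(1e-3)])     # initial guesses
    res = minimize(negative_log_marginal_likelihood, x0,
                   args=(x, u_n, u_m, dt), method="Nelder-Mead",
                   options={"maxiter": 5000, "xatol": 1e-8, "fatol": 1e-8})
    print(f"estimated lam = {res.x[2]:.3f}  (true value {lam_true})")

Running the script should recover an estimate of lam near the true value of 1.0, up to the first-order error of the single Euler step and the observation noise; the paper applies the same marginal-likelihood machinery to the much richer operators listed in the abstract, such as the Navier-Stokes, Schrödinger, Kuramoto-Sivashinsky, and fractional equations.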

