Journal
JOURNAL OF COMPUTATIONAL PHYSICS
Volume 335, Pages 736-746
Publisher
ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.jcp.2017.01.060
Keywords
Machine learning; Integro-differential equations; Multi-fidelity modeling; Uncertainty quantification
Funding
- DARPA grant [N66001-15-2-4055]
Abstract
For more than two centuries, solutions of differential equations have been obtained either analytically or numerically based on typically well-behaved forcing and boundary conditions for well-posed problems. We are changing this paradigm in a fundamental way by establishing an interface between probabilistic machine learning and differential equations. We develop data-driven algorithms for general linear equations using Gaussian process priors tailored to the corresponding integro-differential operators. The only observables are scarce noisy multi-fidelity data for the forcing and solution that are not required to reside on the domain boundary. The resulting predictive posterior distributions quantify uncertainty and naturally lead to adaptive solution refinement via active learning. This general framework circumvents the tyranny of numerical discretization as well as the consistency and stability issues of time-integration, and is scalable to high dimensions. © 2017 Elsevier Inc. All rights reserved.
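The core idea in the abstract — placing a Gaussian process prior on the solution and pushing it through a known linear operator so that forcing and solution data can be conditioned on jointly — can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes the specific operator L = d²/dx² on [0, π], a squared-exponential kernel, and a hypothetical sin-based test problem; the helper names (`k_uu`, `k_uf`, `k_ff`) are invented for this sketch.

```python
import numpy as np

def k_uu(x, y, ell=1.0):
    """RBF prior covariance Cov(u(x), u(y))."""
    r = x[:, None] - y[None, :]
    return np.exp(-r**2 / (2 * ell**2))

def k_uf(x, y, ell=1.0):
    """Cross-covariance Cov(u(x), f(y)) for f = u'':
    apply d^2/dy^2 to the RBF kernel. The result is even in
    (x - y), so the same formula serves as Cov(f(x), u(y))."""
    r = x[:, None] - y[None, :]
    return (r**2 / ell**4 - 1.0 / ell**2) * np.exp(-r**2 / (2 * ell**2))

def k_ff(x, y, ell=1.0):
    """Cov(f(x), f(y)): apply d^2/dx^2 d^2/dy^2 to the RBF kernel."""
    r = x[:, None] - y[None, :]
    return (3.0 / ell**4 - 6.0 * r**2 / ell**6 + r**4 / ell**8) \
        * np.exp(-r**2 / (2 * ell**2))

# Hypothetical test problem: u(x) = sin(x) on [0, pi], so f = u'' = -sin(x).
x_u = np.array([0.0, np.pi])          # scarce observations of the solution u
y_u = np.sin(x_u)
x_f = np.linspace(0.0, np.pi, 9)      # observations of the forcing f
y_f = -np.sin(x_f)

# Joint Gram matrix over the stacked data [u; f], plus a small noise jitter.
K = np.block([[k_uu(x_u, x_u), k_uf(x_u, x_f)],
              [k_uf(x_f, x_u), k_ff(x_f, x_f)]])
K += 1e-6 * np.eye(K.shape[0])
alpha = np.linalg.solve(K, np.concatenate([y_u, y_f]))

# Posterior mean of u at new points, conditioned on both data types.
x_s = np.linspace(0.0, np.pi, 50)
K_s = np.hstack([k_uu(x_s, x_u), k_uf(x_s, x_f)])
u_pred = K_s @ alpha
```

Note how the forcing data constrain the solution away from the two boundary observations purely through the operator-transformed kernels; the same joint Gram matrix also yields a posterior variance, which is what drives the active-learning refinement mentioned in the abstract.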