Article

Discovering invariants via machine learning

Journal

Physical Review Research
Volume 3, Issue 4, Article L042035

Publisher

American Physical Society
DOI: 10.1103/PhysRevResearch.3.L042035

Keywords

-

Funding

  1. Basic Science Research Program through the National Research Foundation of Korea [NRF-2017R1A2B3006930]

Abstract

Invariants and conservation laws convey critical information about the underlying dynamics of a system, yet it is generally infeasible to find them in large-scale data without prior knowledge or human insight. We propose ConservNet to achieve this goal: a neural network that spontaneously discovers a conserved quantity from grouped data in which the members of each group share invariants, analogous to a typical experimental setting where trajectories from different trials are observed. Trained with an intuitive loss function called the noise-variance loss, ConservNet learns the hidden invariants in each group of multidimensional observables in a data-driven, end-to-end manner. Our model successfully discovers the underlying invariants of simulated systems that possess invariants, as well as of a real-world double-pendulum trajectory. Because the model is robust to various noise and data conditions compared with the baseline, our approach is directly applicable to experimental data for discovering hidden conservation laws and, further, general relationships between variables.
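
The abstract does not spell out the loss function, so the sketch below is only a simplified illustration of the general idea (in PyTorch), not the paper's exact noise-variance loss: a small network is trained so that its scalar output stays nearly constant within each invariant-sharing group, while an injected-noise term keeps the output from collapsing to a global constant. All names and settings here (InvariantNet, noise_scale, spread_target, the toy data) are hypothetical.

# Illustrative sketch of a ConservNet-style objective; a simplification,
# not the exact noise-variance loss of the paper. Requires PyTorch.
import torch
import torch.nn as nn

class InvariantNet(nn.Module):
    """Maps an observation vector to a scalar candidate invariant F(x)."""
    def __init__(self, dim_in, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def grouped_invariance_loss(model, x, noise_scale=0.1, spread_target=1.0):
    """x has shape (groups, samples_per_group, dim); each group shares an invariant.

    Term 1 drives F(x) toward a constant inside each group.
    Term 2 (an assumed regularizer) asks that input noise of scale noise_scale
    move the output by roughly spread_target, discouraging the trivial
    solution where F is the same constant everywhere.
    """
    out = model(x)                                   # shape: (groups, samples_per_group)
    within_group_var = out.var(dim=1).mean()         # invariance within each group
    noisy = model(x + noise_scale * torch.randn_like(x))
    spread = (noisy - out).abs().mean()              # sensitivity to injected noise
    return within_group_var + (spread - spread_target).abs()

if __name__ == "__main__":
    # Toy stand-in data: 32 groups of 20 samples in 4 dimensions; real usage
    # would feed trajectory points grouped by experimental trial.
    x = torch.randn(32, 20, 4)
    model = InvariantNet(dim_in=4)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(200):
        opt.zero_grad()
        loss = grouped_invariance_loss(model, x)
        loss.backward()
        opt.step()

After training, the learned F can be checked on held-out trials: if its value stays roughly constant within each trial but differs between trials, it is a candidate invariant whose functional form could then be examined, for instance by curve fitting or symbolic regression.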

