Article

Large-Deviation Approach to Random Recurrent Neuronal Networks: Parameter Inference and Fluctuation-Induced Transitions

Journal

Physical Review Letters
Volume 127, Issue 15, Article 158302

Publisher

American Physical Society
DOI: 10.1103/PhysRevLett.127.158302


Funding

  1. Helmholtz Young Investigator's Group [VH-NG-1028]
  2. European Union Horizon 2020 Grant [785907]
  3. Human Frontier Science Program [RGP0057/2016]
  4. BMBF Grant "Renormalized Flows" [01IS19077A]
  5. Excellence Initiative of the German federal and state governments [G:(DE-82)EXS-PF-JARASDS005]

Abstract

We here unify the field-theoretical approach to neuronal networks with large deviations theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function and derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence which enables data-driven inference of model parameters and calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
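
The "prototypical random recurrent network model with continuous-valued units" is not spelled out on this page. The sketch below is a minimal illustration, assuming the standard rate dynamics common in this literature, dx_i/dt = -x_i + sum_j J_ij tanh(x_j) with i.i.d. Gaussian couplings of variance g^2/N; the dynamics, the tanh gain function, and all parameter values are assumptions for illustration, not details taken from the paper.

    import numpy as np

    # Assumed model (illustration only): dx_i/dt = -x_i + sum_j J_ij * tanh(x_j)
    # with i.i.d. Gaussian couplings J_ij ~ N(0, g^2/N).
    N, g = 1000, 2.0          # network size, coupling gain (illustrative values)
    dt, steps = 0.01, 5000    # forward-Euler integration settings
    rng = np.random.default_rng(0)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random coupling matrix
    x = rng.standard_normal(N)                        # random initial state

    for _ in range(steps):
        x += dt * (-x + J @ np.tanh(x))  # forward-Euler step of the rate dynamics

    print("population mean:", x.mean(), " population variance:", x.var())

In this model class the mean-field description becomes exact only as N goes to infinity; the rate function derived in the paper quantifies how unlikely finite-N deviations from that limit are.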
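The claim that the rate function "takes the form of a Kullback-Leibler divergence" parallels a textbook large-deviation result, Sanov's theorem, reproduced here purely as orientation (the paper's result concerns network trajectories, not i.i.d. samples): for the empirical measure \hat{\mu}_N of N i.i.d. samples from a distribution \rho,

    P(\hat{\mu}_N \approx \mu) \asymp e^{-N\, D_{\mathrm{KL}}(\mu \,\|\, \rho)},
    \qquad
    D_{\mathrm{KL}}(\mu \,\|\, \rho) = \int \mathrm{d}\mu \, \ln \frac{\mathrm{d}\mu}{\mathrm{d}\rho} .

Because minimizing a Kullback-Leibler divergence between data and model over the model parameters is equivalent to maximum-likelihood fitting, a rate function of this form directly enables the data-driven parameter inference described above.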

Authors

Alexander van Meegen, Tobias Kühn, and Moritz Helias
