Journal
JOURNAL OF STATISTICAL MECHANICS: THEORY AND EXPERIMENT
Volume 2022, Issue 7
Publisher
IOP Publishing Ltd
DOI: 10.1088/1742-5468/ac764a
Keywords
analysis of algorithms; machine learning; message-passing algorithms; statistical inference
Funding
- German Research Foundation
- Deutsche Forschungsgemeinschaft (DFG), under Grant 'RAMABIM' [OP 45/9-1]
- US National Science Foundation [CCF-1910410]
- Harvard FAS Dean's Competitive Fund for Promising Scholarship
This paper analyzes a random sequential message passing algorithm for large Gaussian latent variable models. By assuming random covariance matrices and considering model mismatch, the authors obtain dynamical mean-field equations characterizing the dynamics of the inference algorithm, and derive the parameter range for which the algorithm does not converge.
We analyze the dynamics of a random sequential message passing algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario. To model nontrivial dependencies between the latent variables, we assume random covariance matrices drawn from rotation invariant ensembles. Moreover, we consider a model-mismatched setting, where the teacher model and the one used by the student may be different. By means of the dynamical functional approach, we obtain exact dynamical mean-field equations characterizing the dynamics of the inference algorithm. We also derive a range of model parameters for which the sequential algorithm does not converge. The boundary of this parameter range coincides with the de Almeida-Thouless (AT) stability condition of the replica-symmetric ansatz for the static probabilistic model.
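As a rough illustration of the kind of algorithm the abstract describes (not the paper's exact message-passing scheme), the sketch below performs random sequential coordinate updates of the posterior mean of a Gaussian latent variable model, which for a known precision matrix reduces to a randomized Gauss-Seidel iteration. All names (`random_sequential_mean`, the test matrix `J`) are hypothetical choices for this example; convergence here relies on a well-conditioned positive definite precision matrix, whereas the paper characterizes exactly when such sequential updates fail to converge for rotation invariant random ensembles.

```python
import numpy as np

def random_sequential_mean(J, b, n_sweeps=500, rng=None):
    """Random sequential updates for the posterior mean m = J^{-1} b of a
    Gaussian model with precision matrix J and linear term b.

    Each sweep visits every coordinate once, in a fresh random order, and
    sets coordinate i to its conditional mean given the current values of
    the other coordinates (a Gauss-Seidel-style sketch).
    """
    rng = np.random.default_rng(rng)
    N = len(b)
    m = np.zeros(N)
    for _ in range(n_sweeps):
        for i in rng.permutation(N):  # random update order within each sweep
            # conditional mean: (b_i - sum_{j != i} J_ij m_j) / J_ii
            m[i] = (b[i] - J[i] @ m + J[i, i] * m[i]) / J[i, i]
    return m

# Hypothetical test instance: a positive definite random precision matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
J = X.T @ X / 50 + np.eye(20)   # well-conditioned, so the sweeps converge
b = rng.standard_normal(20)
m = random_sequential_mean(J, b)
print(np.allclose(m, np.linalg.solve(J, b)))  # True on this instance
```

For symmetric positive definite `J` this iteration always converges; the nontrivial regime analyzed in the paper arises for structured random covariances, where the dynamical mean-field equations pinpoint the non-convergent parameter range.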