Journal
APPLIED MATHEMATICS AND COMPUTATION
Volume 458, Issue -, Pages -
Publisher
ELSEVIER SCIENCE INC
DOI: 10.1016/j.amc.2023.128253
While Hopfield networks are known as paradigmatic models for memory storage and retrieval, modern artificial intelligence systems mainly stand on the machine learning paradigm. We show that a teacher-student self-supervised learning problem with Boltzmann machines can be formulated in terms of a suitable generalization of the Hopfield model with structured patterns, where the spin variables are the machine weights and the patterns correspond to the examples in the training set. We analyze the learning performance by studying the phase diagram in terms of the training set size, the dataset noise, and the inference temperature (i.e., the weight regularization). With a small but informative dataset the machine can learn by memorization. With a noisy dataset, an extensive number of examples above a critical threshold is needed. In this regime the memory storage limits become an opportunity for the occurrence of a learning regime in which the system can generalize.
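The memorization mechanism the abstract builds on can be illustrated with the classical Hopfield recipe: store binary patterns via the Hebbian rule, then retrieve a pattern by relaxing a corrupted cue under the resulting couplings. The sketch below is only a generic toy illustration of that mechanism, assuming standard ±1 spins and zero-temperature synchronous updates; the function names and patterns are our own and are not taken from the paper.

```python
# Minimal sketch of Hopfield-style memory storage and retrieval
# (illustrative toy example; not the structured-pattern model of the paper).
import numpy as np

def train_hebbian(patterns):
    """Build the coupling matrix J from +/-1 patterns via the Hebbian rule."""
    P = np.array(patterns, dtype=float)
    n = P.shape[1]
    J = (P.T @ P) / n
    np.fill_diagonal(J, 0.0)  # no self-couplings
    return J

def retrieve(J, state, steps=10):
    """Zero-temperature (deterministic) synchronous spin updates."""
    s = np.array(state, dtype=float)
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1.0  # break ties consistently
    return s

# Two stored patterns; a cue with one flipped spin relaxes to the nearest one.
patterns = [[1, -1, 1, -1, 1, -1],
            [1, 1, 1, -1, -1, -1]]
J = train_hebbian(patterns)
cue = [1, -1, 1, -1, 1, 1]  # first pattern with its last spin flipped
print(retrieve(J, cue))      # recovers the first stored pattern
```

Below the storage capacity, each stored pattern is a fixed point of the dynamics, which is the "learning by memorization" regime; the paper's point is that beyond this limit, with noisy data, a generalization regime can emerge instead.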