Article

A new mechanical approach to handle generalized Hopfield neural networks

Journal

NEURAL NETWORKS
Volume 106, Pages 205-222

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2018.07.010

Keywords

Hopfield networks; Statistical mechanics; Unlearning; Non-linear PDE; Relativistic mechanics

Funding

  1. GNFM-INdAM via AGLIARI2018
  2. MIUR
  3. Rete Match: Progetto Pythagoras [CUP:J48C17000250006]

Abstract

We propose a modification of the cost function of the Hopfield model whose salient features emerge in its Taylor expansion, which contains more-than-pairwise interactions with alternating signs, suggesting a unified framework for handling both deep learning and network pruning. In our analysis, we rely heavily on the Hamilton-Jacobi correspondence relating the statistical model to a mechanical system. In this picture, our model is nothing but the relativistic extension of the original Hopfield model (whose cost function is a quadratic form in the Mattis magnetization and mimics the non-relativistic counterpart, the so-called classical limit). We focus on the low-storage regime and solve the model analytically by taking advantage of the mechanical analogy, thus obtaining a complete characterization of the free energy and the associated self-consistency equations in the thermodynamic limit. Further, on the numerical side, we test the performance of our proposal with extensive Monte Carlo simulations, showing that the stability of spurious states (which limits the capabilities of the standard Hebbian construction) is markedly reduced due to the presence of unlearning contributions that prune them massively.
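
The abstract describes the construction only verbally. As a rough illustration, the sketch below sets up the kind of Monte Carlo probe it mentions, under the assumption (an illustrative guess, not stated here) that the modified cost function takes the form H = -N*sqrt(1 + sum_mu m_mu^2) in the Mattis magnetizations m_mu, so that its Taylor expansion, -N*(1 + (1/2)*sum_mu m_mu^2 - (1/8)*(sum_mu m_mu^2)^2 + ...), reproduces the Hebbian pairwise term followed by higher-order terms with alternating signs. All parameter values, the pattern count, and the mixture-state probe are arbitrary choices made for illustration; this is not the authors' code.

# Minimal sketch: finite-temperature Metropolis dynamics on a low-storage
# Hopfield network under two cost functions -- the standard quadratic one,
# H_cl = -(N/2) * sum_mu m_mu^2, and an assumed "relativistic" one,
# H_rel = -N * sqrt(1 + sum_mu m_mu^2), with m_mu the Mattis magnetizations.
import numpy as np

rng = np.random.default_rng(0)
N, P, beta, sweeps = 400, 3, 2.5, 100      # low-storage regime: P << N
xi = rng.choice([-1, 1], size=(P, N))      # random binary patterns

def mattis(sigma):
    # m_mu = (1/N) * sum_i xi_mu,i * sigma_i
    return xi @ sigma / N

def energy(sigma, relativistic):
    m2 = float(np.sum(mattis(sigma) ** 2))
    return -N * np.sqrt(1.0 + m2) if relativistic else -0.5 * N * m2

def metropolis(sigma, relativistic):
    # Single-spin-flip Metropolis sampling at inverse temperature beta.
    sigma = sigma.copy()
    E = energy(sigma, relativistic)
    for _ in range(sweeps):
        for i in rng.permutation(N):
            sigma[i] *= -1
            E_new = energy(sigma, relativistic)
            if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
                E = E_new                  # accept the flip
            else:
                sigma[i] *= -1             # reject: restore the spin
    return sigma

# Probe: start from a symmetric 3-pattern mixture (a classic spurious state)
# and compare where the two dynamics settle, via the Mattis overlaps.
start = np.sign(xi.sum(axis=0)).astype(int)
start[start == 0] = 1
for rel in (False, True):
    label = "relativistic" if rel else "classical   "
    print(label, np.round(mattis(metropolis(start, rel)), 2))

Comparing the Mattis overlaps reached by the two dynamics from the same spurious mixture start is one simple way to probe the pruning of spurious states that the abstract refers to.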
