Article

Training restricted Boltzmann machines: An introduction

Journal

PATTERN RECOGNITION
Volume 47, Issue 1, Pages 25-39

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.05.025

Keywords

Restricted Boltzmann machines; Markov random fields; Markov chains; Gibbs sampling; Neural networks; Contrastive divergence learning; Parallel tempering

Funding

  1. German Federal Ministry of Education and Research within the National Network Computational Neuroscience [01GQ0951]
  2. European Commission through project AKMI [PCIG10-GA-2011-303655]


Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpoint of Markov random fields, starting with the required concepts of undirected graphical models. Different learning algorithms for RBMs, including contrastive divergence learning and parallel tempering, are discussed. Because sampling from RBMs, and hence most of their learning algorithms, relies on Markov chain Monte Carlo (MCMC) methods, an introduction to Markov chains and MCMC techniques is also provided. Experiments demonstrate relevant aspects of RBM training. © 2013 Elsevier Ltd. All rights reserved.
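To give a concrete flavor of the contrastive divergence learning and Gibbs sampling mentioned in the abstract, below is a minimal CD-1 sketch for a binary-binary RBM. The class, variable names, and hyperparameters are illustrative assumptions, not taken from the paper; a full treatment (CD-k, parallel tempering, persistent chains) is what the tutorial itself covers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Binary-binary restricted Boltzmann machine (illustrative sketch)."""

    def __init__(self, n_visible, n_hidden):
        # Small random weights; zero biases (common initialization choice).
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)  # visible biases
        self.c = np.zeros(n_hidden)   # hidden biases

    def sample_h(self, v):
        # p(h_j = 1 | v) = sigmoid(v W + c); return probabilities and a sample.
        p = sigmoid(v @ self.W + self.c)
        return p, (rng.random(p.shape) < p).astype(float)

    def sample_v(self, h):
        # p(v_i = 1 | h) = sigmoid(h W^T + b)
        p = sigmoid(h @ self.W.T + self.b)
        return p, (rng.random(p.shape) < p).astype(float)

    def cd1_update(self, v0, lr=0.1):
        # Positive phase: hidden statistics driven by the data.
        ph0, h0 = self.sample_h(v0)
        # Negative phase: one Gibbs step starting from the data (CD-1).
        pv1, v1 = self.sample_v(h0)
        ph1, _ = self.sample_h(v1)
        n = v0.shape[0]
        # Approximate gradient of the log-likelihood: data term minus model term.
        self.W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
        self.b += lr * (v0 - v1).mean(axis=0)
        self.c += lr * (ph0 - ph1).mean(axis=0)

# Toy usage: a small batch of random binary vectors.
data = (rng.random((20, 6)) < 0.5).astype(float)
rbm = RBM(n_visible=6, n_hidden=4)
for _ in range(50):
    rbm.cd1_update(data)
```

The key design point CD-1 illustrates is that the intractable model expectation in the log-likelihood gradient is replaced by statistics from a single Gibbs step started at the data, which is exactly the approximation whose bias and refinements (CD-k, parallel tempering) the tutorial analyzes.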
