Journal
PATTERN RECOGNITION
Volume 47, Issue 1, Pages 25-39
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.05.025
Keywords
Restricted Boltzmann machines; Markov random fields; Markov chains; Gibbs sampling; Neural networks; Contrastive divergence learning; Parallel tempering
Funding
- German Federal Ministry of Education and Research within the National Network Computational Neuroscience [01GQ0951]
- European Commission through project AKMI [PCIG10-GA-2011-303655]
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpoint of Markov random fields, starting with the required concepts of undirected graphical models. Different learning algorithms for RBMs, including contrastive divergence learning and parallel tempering, are discussed. As sampling from RBMs, and therefore also most of their learning algorithms, is based on Markov chain Monte Carlo (MCMC) methods, an introduction to Markov chains and MCMC techniques is provided. Experiments demonstrate relevant aspects of RBM training. (C) 2013 Elsevier Ltd. All rights reserved.
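To make the abstract's mention of contrastive divergence and Gibbs sampling concrete, the following is a minimal sketch of CD-1 training for a binary RBM. It is not the paper's implementation; the tiny layer sizes, learning rate, and toy pattern are illustrative assumptions. One update runs a single step of block Gibbs sampling (data → hidden → reconstructed visible → hidden) and uses the difference of the positive- and negative-phase statistics as an approximate log-likelihood gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny RBM: 6 visible units, 4 hidden units (illustrative sizes).
n_v, n_h = 6, 4
W = rng.normal(0, 0.01, size=(n_v, n_h))  # weight matrix
b = np.zeros(n_v)                         # visible biases
c = np.zeros(n_h)                         # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, lr=0.1):
    """One contrastive-divergence (CD-1) step on a single binary sample v0."""
    global W, b, c
    # Positive phase: hidden activation probabilities given the data.
    ph0 = sigmoid(c + v0 @ W)
    h0 = (rng.random(n_h) < ph0).astype(float)  # Gibbs sample of hidden layer
    # Negative phase: one block-Gibbs step back to the visible layer and up again.
    pv1 = sigmoid(b + h0 @ W.T)
    v1 = (rng.random(n_v) < pv1).astype(float)
    ph1 = sigmoid(c + v1 @ W)
    # Approximate gradient: positive-phase minus negative-phase statistics.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)

# Toy usage: repeatedly fit a single binary pattern.
v = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
for _ in range(200):
    cd1_update(v)
```

Running the Markov chain for k steps instead of one gives CD-k; parallel tempering, also discussed in the tutorial, instead runs several chains at different temperatures and swaps states between them to improve mixing.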