Article

Enhancing performance of restricted Boltzmann machines via log-sum regularization

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 63, Pages 82-96

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2014.03.016

Keywords

Restricted Boltzmann machine; Sparsity; Log-sum regularization; Deep belief network; Feature learning

Funding

  1. National Basic Research Program of China (973 Program) [2013CB329404]
  2. National Natural Science Foundation of China [91230101, 11131006, 11201367]

Abstract

Restricted Boltzmann machines (RBMs) are often used as building blocks for deep belief networks. By optimizing several RBMs in sequence, deep networks can be trained quickly to achieve good performance on the tasks of interest. To further improve the quality of the learned representations, much research has focused on incorporating sparsity into RBMs. In this paper, we propose a novel sparse RBM model, referred to as LogSumRBM. Instead of constraining the expected activation of every hidden unit to the same low sparsity level, as done in [27], we explicitly encourage the hidden units to be sparse by adding a log-sum norm constraint on the hidden units' activation probabilities. With this approach, the firing rate of each hidden unit need not be fixed at a level set beforehand; instead, the sparsity level of each hidden unit is learnt automatically from the task at hand. Experiments on several image data sets of different scales show that LogSumRBM learns sparser and more discriminative representations than related state-of-the-art models, and that stacking two LogSumRBMs learns more salient features that mimic computations in the cortical hierarchy. LogSumRBM can also be used to pre-train deep networks, yielding better classification performance. (C) 2014 Elsevier B.V. All rights reserved.
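
To make the idea concrete, the following is a minimal sketch of how such a regularizer could enter RBM training: CD-1 updates for a binary RBM with a log-sum penalty of the form sum_j log(1 + p_j/eps) applied to the batch-mean hidden activation probabilities. The penalty form and the hyper-parameter names (lam, eps, lr) are illustrative assumptions, not the paper's exact formulation.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_logsum_step(W, b, c, v0, lam=0.1, eps=0.01, lr=0.05):
        """One CD-1 update on a binary mini-batch v0 (batch x visible),
        plus a log-sum sparsity penalty on the batch-mean hidden
        activation probabilities. Penalty form and hyper-parameters
        are illustrative assumptions, not the paper's exact ones."""
        # Positive phase: hidden probabilities and samples given the data.
        ph0 = sigmoid(v0 @ W + c)                         # batch x hidden
        h0 = (np.random.rand(*ph0.shape) < ph0).astype(float)

        # Negative phase: one Gibbs step down to the visibles and back up.
        pv1 = sigmoid(h0 @ W.T + b)
        v1 = (np.random.rand(*pv1.shape) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + c)

        # Log-sum penalty sum_j log(1 + p_j/eps) on p = ph0.mean(axis=0).
        # Its derivative w.r.t. p_j is 1/(eps + p_j), chained through the
        # sigmoid derivative ph0 * (1 - ph0).
        n = v0.shape[0]
        p = ph0.mean(axis=0)
        dpen = 1.0 / (eps + p)                            # hidden
        dact = ph0 * (1.0 - ph0)                          # batch x hidden
        grad_c_pen = lam * (dact * dpen).mean(axis=0)
        grad_W_pen = lam * (v0.T @ (dact * dpen)) / n

        # Parameter updates: CD-1 statistics minus the penalty gradient.
        W += lr * ((v0.T @ ph0 - v1.T @ ph1) / n - grad_W_pen)
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * ((ph0 - ph1).mean(axis=0) - grad_c_pen)
        return W, b, c

Unlike a fixed-target sparsity penalty, the gradient 1/(eps + p_j) shrinks as a unit's mean activation grows, so units the task relies on are penalized less, which is consistent with the abstract's claim that per-unit sparsity levels are learnt rather than preset.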
