Journal
EXPERT SYSTEMS WITH APPLICATIONS
Volume 96, Pages 77-85
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.eswa.2017.11.054
Keywords
Neural networks; Extreme learning machine; Restricted Boltzmann machine; Weights initialization
Funding
- Brazilian agency CAPES
- Brazilian agency CNPq [309161/2015-0]
- Espirito Santo state agency FAPES [039/2016]
The Extreme Learning Machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks (SLFNs) that can learn effectively and quickly. The ELM training phase assigns the input weights and biases randomly and never updates them. Although the network works well, the random weights in the input layer may affect the algorithm's performance. Therefore, we propose a new approach, called RBM-ELM, that determines the input weights and biases for the ELM using a restricted Boltzmann machine (RBM). We compare our new approach to the well-known ELM-AE and to the ELM-RO, a state-of-the-art algorithm for selecting the input weights of the ELM. The experimental results show that the RBM-ELM achieves better performance than the ELM and outperforms the ELM-AE and ELM-RO. (C) 2017 Elsevier Ltd. All rights reserved.
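The ELM training scheme described in the abstract (random, fixed input weights; output weights solved in closed form) can be sketched as follows. This is a minimal illustration, not the paper's RBM-ELM: the sigmoid activation, the uniform [-1, 1] weight range, and the function names are assumptions, and the output weights are computed with the standard Moore-Penrose pseudoinverse.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    """Minimal ELM sketch: random fixed input layer, closed-form output layer.

    Assumptions (not from the paper): sigmoid activation and uniform
    weight initialization in [-1, 1]. RBM-ELM would replace the random
    draw of W and b with weights learned by a restricted Boltzmann machine.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Input weights and biases are drawn randomly and never updated.
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T             # output weights, least-squares
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: fit the XOR pattern as a regression problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
W, b, beta = elm_train(X, T, n_hidden=20)
pred = elm_predict(X, W, b, beta)
```

Because the input layer is never trained, the only cost is one linear solve, which is what makes the ELM fast; the paper's contribution is replacing the random draw of `W` and `b` with RBM-learned weights.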