Article

Error Minimized Extreme Learning Machine With Growth of Hidden Nodes and Incremental Learning

Journal

IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 20, Issue 8, Pages 1352-1357

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2009.2024147

Keywords

Echo state network (ESN); extreme learning machine (ELM); feedforward neural networks (FNNs); growing algorithm; incremental learning; minimizing error; sequential learning

Abstract

One of the open problems in neural network research is how to automatically determine network architectures for given applications. In this brief, we propose a simple and efficient approach to automatically determine the number of hidden nodes in generalized single-hidden-layer feedforward networks (SLFNs), which need not be neuron-like. This approach, referred to as the error-minimized extreme learning machine (EM-ELM), can add random hidden nodes to SLFNs one by one or group by group (with varying group size). As the network grows, the output weights are updated incrementally rather than recomputed from scratch. The convergence of this approach is also proved in this brief. Simulation results verify that the new approach is much faster than other sequential/incremental/growing algorithms while achieving good generalization performance.
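The growth procedure summarized in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's code: the toy data, sigmoid activation, error target, and variable names below are all illustrative assumptions. Nodes are added one at a time, and the hidden-layer pseudoinverse is updated with a Greville-style block update so that the output weights are refreshed incrementally instead of being recomputed from the full hidden-output matrix at every step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative only; not the paper's benchmarks).
N, d = 200, 1
X = rng.uniform(-1, 1, (N, d))
T = np.sin(3 * X)

def hidden_output(X, W, b):
    """Sigmoid activations of random hidden nodes (one common ELM choice)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Start with a single random hidden node.
W = rng.standard_normal((d, 1))
b = rng.standard_normal((1,))
H = hidden_output(X, W, b)               # N x k hidden-output matrix
H_pinv = np.linalg.pinv(H)               # k x N pseudoinverse
beta = H_pinv @ T                        # output weights (least squares)
initial_rmse = np.linalg.norm(H @ beta - T) / np.sqrt(N)

target_error, max_nodes = 0.05, 50       # assumed stopping criteria
rmse = initial_rmse
while rmse > target_error and H.shape[1] < max_nodes:
    # Add one new random node (the paper also allows group-by-group growth).
    dW = rng.standard_normal((d, 1))
    db = rng.standard_normal((1,))
    dH = hidden_output(X, dW, db)

    # Block pseudoinverse update: only terms involving the new column
    # are computed; the old pseudoinverse is reused.
    D = np.linalg.pinv((np.eye(N) - H @ H_pinv) @ dH)   # 1 x N
    U = H_pinv - H_pinv @ dH @ D                        # k x N
    H = np.hstack([H, dH])
    H_pinv = np.vstack([U, D])
    beta = H_pinv @ T            # output weights updated incrementally
    W = np.hstack([W, dW])
    b = np.hstack([b, db])
    rmse = np.linalg.norm(H @ beta - T) / np.sqrt(N)

print(f"hidden nodes: {H.shape[1]}, training RMSE: {rmse:.4f}")
```

Because each added column enlarges the column space available to the least-squares fit, the training residual is non-increasing as nodes are added, which is the intuition behind the convergence result mentioned in the abstract.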
