4.6 Article

Extreme learning machine with local connections

Journal

NEUROCOMPUTING
Volume 368, Issue -, Pages 146-152

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.08.069

Keywords

Extreme learning machine; Local connections; Sparsification of input-hidden weights; High dimensional input data

Funding

  1. National Science Foundation of China [61473059, 61403056]
  2. Fundamental Research Funds for the Central Universities of China [3132019176]

Abstract

This paper is concerned with the sparsification of the input-hidden weights of ELM (extreme learning machine). For ordinary feedforward neural networks, sparsification is usually achieved by introducing a regularization technique into the learning process of the network. However, this strategy cannot be applied to ELM, since the input-hidden weights of ELM are randomly chosen rather than iteratively learned. To this end, we propose a modified ELM, called ELM-LC (ELM with local connections), which sparsifies the input-hidden weights as follows: the hidden nodes and the input nodes are divided into corresponding groups, and each input node group is fully connected with its corresponding hidden node group but is not connected with any other hidden node group. As in the usual ELM, the input-hidden weights are randomly given, and the hidden-output weights are obtained through least-squares learning. In numerical simulations on several benchmark problems, the new ELM-LC performs better than the traditional ELM and an ELM with ordinary sparse input-hidden weights. (C) 2019 Published by Elsevier B.V.
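The grouped connection scheme described in the abstract amounts to giving ELM a block-diagonal random input-hidden weight matrix: input group g feeds only hidden group g, and the hidden-output weights are then solved by least squares as in standard ELM. The following is a minimal NumPy sketch of that idea, not the authors' implementation; the function names, the sigmoid activation, the uniform weight range, and the group sizes are illustrative assumptions.

```python
import numpy as np

def train_elm_lc(X, T, n_groups=4, hidden_per_group=25, rng=None):
    """Sketch of ELM-LC training: input and hidden nodes are split into
    corresponding groups, and each input group is connected only to its
    own hidden group (block-diagonal random input-hidden weights)."""
    rng = np.random.default_rng(rng)
    n_samples, n_inputs = X.shape

    # Split the input indices into n_groups roughly equal groups.
    input_groups = np.array_split(np.arange(n_inputs), n_groups)

    # Random, untrained input-hidden weights with local (block) connectivity.
    n_hidden = n_groups * hidden_per_group
    W = np.zeros((n_inputs, n_hidden))
    for g, idx in enumerate(input_groups):
        cols = slice(g * hidden_per_group, (g + 1) * hidden_per_group)
        W[idx, cols] = rng.uniform(-1.0, 1.0, size=(len(idx), hidden_per_group))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)

    # Hidden-layer output (sigmoid), then least-squares hidden-output weights.
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)
    return W, b, beta

def predict_elm_lc(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

Relative to a standard ELM of the same size, the only change in this sketch is that most entries of W are forced to zero by the group structure; the random initialization of the remaining entries and the least-squares solve for beta are unchanged.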
