Article

Negative Correlation Hidden Layer for the Extreme Learning Machine

Journal

APPLIED SOFT COMPUTING
Volume 109, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.asoc.2021.107482

Keywords

Negative Correlation Learning; Extreme Learning Machine; Feature mapping; Hidden layer; Diversity

The paper introduces a novel ELM architecture, called NCHL-ELM, which improves the performance of ELM models by attaching a parameter to each hidden node and optimizing these parameters to reduce the training error. The method simultaneously promotes diversity among the parameters to enhance generalization.
Extreme Learning Machine (ELM) algorithms have achieved unprecedented performance in supervised machine learning tasks. However, the random preconfiguration of the hidden-layer nodes in ELM models does not always lead to a suitable transformation of the original features. Consequently, the performance of these models relies on a broad exploration of these feature mappings, generally using a large number of nodes in the hidden layer. In this paper, a novel ELM architecture is presented, called Negative Correlation Hidden Layer ELM (NCHL-ELM), based on the Negative Correlation Learning (NCL) framework. This model incorporates a parameter into each node of the original ELM hidden layer, and these parameters are optimized by reducing the error on the training set while promoting diversity among them in order to improve the generalization results. Mathematically, the ELM minimization problem is perturbed by a penalty term that represents a measure of diversity among the parameters. A variety of regression and classification benchmark datasets have been selected in order to compare NCHL-ELM with other state-of-the-art ELM models. Statistical tests indicate the superiority of our method in both regression and classification problems. (c) 2021 Elsevier B.V. All rights reserved.
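To make the idea concrete, the following is a minimal sketch of how such a per-node parameterization could look. It is an illustration built only from the abstract, not the paper's actual algorithm: the exact form of the objective, the penalty, the parameter placement (here, a multiplicative scaling of each hidden node), and all hyperparameter values (`gamma`, `lr`, the gradient-descent update) are assumptions. The diversity term is modeled NCL-style as the negative squared deviation of each parameter from the parameters' mean, so minimizing the penalized objective pushes the parameters apart.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (stand-in for a benchmark dataset)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

L = 20                            # number of hidden nodes
W = rng.normal(size=(5, L))       # random input weights, fixed as in standard ELM
b = rng.normal(size=L)            # random biases, fixed as in standard ELM
H = np.tanh(X @ W + b)            # hidden-layer output matrix

lam = np.ones(L)                  # per-node parameters (hypothetical placement: node scaling)
gamma, lr = 0.01, 1e-3            # penalty weight and step size (assumed values)

for _ in range(100):
    Hs = H * lam                                     # each hidden node scaled by its parameter
    beta, *_ = np.linalg.lstsq(Hs, y, rcond=None)    # closed-form ELM output weights
    resid = Hs @ beta - y
    # Gradient of the squared training error w.r.t. lam (beta held fixed)
    grad_err = 2.0 * beta * (H.T @ resid)
    # NCL-style penalty gradient: pushing parameters away from their mean promotes diversity
    grad_pen = -2.0 * (lam - lam.mean())
    lam -= lr * (grad_err + gamma * grad_pen)

mse = np.mean((H * lam @ beta - y) ** 2)
```

With `lam` fixed at ones the loop reduces to a plain ELM; the updates then trade a small amount of training error for spread among the node parameters, which is the trade-off the penalized objective in the paper formalizes.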

