Journal
INFORMATION SCIENCES
Volume 540, Pages 1-16
Publisher
ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2020.05.112
Keywords
Stochastic configuration networks; Randomized neural networks; Alternating direction method of multipliers; Distributed learning
Funding
- National Natural Science Foundation of China [61703117]
- Natural Science Foundation of Guangxi, China [2017GXNSFBA198113]
- Foundation of Guilin University of Technology [GLUTQD2007029]
- China Scholarship Council [201708455036]
- National Key R&D Program of China [2018AAA0100300]
As a new category of randomized neural networks (RNNs), stochastic configuration networks (SCNs) have demonstrated great potential for data analytics. Unlike conventional randomized learning techniques, e.g., random vector functional-link (RVFL) networks, SCNs provide a stochastic configuration mechanism for the assignment of input parameters that guarantees the universal approximation capability of the resulting learner model. In this paper, a distributed version of SCN is developed for decentralized datasets in a cooperative learning paradigm. The proposed approach handles datasets stored across a network of multiple learning agents without any fusion center. Specifically, we reformulate the centralized learning problem as an equivalent decomposition into subproblems coupled across the network, subject to a consensus constraint. Then, a cooperative configuration scheme is proposed for randomly assigning the input weights and biases. Finally, based on the well-known parallel alternating direction method of multipliers (ADMM), the output weights are evaluated iteratively. Simulation studies with comparisons on three benchmark datasets are carried out. The experimental results indicate that the proposed learning scheme performs well and outperforms distributed RVFL networks. (c) 2020 Elsevier Inc. All rights reserved.
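The ADMM-based evaluation of the output weights described in the abstract can be sketched as a standard consensus ADMM iteration for a distributed least-squares problem. The following is a minimal illustrative sketch, not the paper's actual algorithm: the function name `distributed_admm_lsq`, the penalty parameter `rho`, and the iteration count are assumptions for illustration; each "agent" holds a local block of the hidden-layer output matrix and its targets, and the agents agree on a common weight vector via averaging and dual updates.

```python
import numpy as np

def distributed_admm_lsq(H_parts, y_parts, rho=1.0, iters=200):
    """Consensus ADMM sketch for distributed least-squares output weights.

    Each agent i holds a local hidden-layer output block H_i and local
    targets y_i; all agents converge to a shared weight vector z.
    """
    d = H_parts[0].shape[1]
    n = len(H_parts)
    z = np.zeros(d)                       # consensus variable
    u = [np.zeros(d) for _ in range(n)]   # scaled dual variables
    # Pre-compute each agent's local normal equations (H_i^T H_i + rho I).
    As = [H.T @ H + rho * np.eye(d) for H in H_parts]
    bs = [H.T @ y for H, y in zip(H_parts, y_parts)]
    for _ in range(iters):
        # Local primal updates (parallel across agents).
        w = [np.linalg.solve(A, b + rho * (z - ui))
             for A, b, ui in zip(As, bs, u)]
        # Consensus step: average the local estimates.
        z = np.mean([wi + ui for wi, ui in zip(w, u)], axis=0)
        # Dual updates penalize disagreement with the consensus.
        u = [ui + wi - z for ui, wi in zip(u, w)]
    return z

# Toy usage: split a synthetic regression problem across three agents.
rng = np.random.default_rng(0)
H = rng.standard_normal((90, 5))
beta_true = rng.standard_normal(5)
y = H @ beta_true
H_parts = np.array_split(H, 3)
y_parts = np.array_split(y, 3)
z = distributed_admm_lsq(H_parts, y_parts)
```

On this noiseless toy problem, the consensus estimate `z` converges to the centralized least-squares solution, which is the property the distributed scheme relies on.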