Article

Generalized Single-Hidden Layer Feedforward Networks for Regression Problems

Journal

IEEE Transactions on Neural Networks and Learning Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TNNLS.2014.2334366

Keywords

Approximation capability; extreme learning machine (ELM); generalized single-hidden layer feedforward networks (GSLFN); polynomial output weights; ridge regression

Funding

  1. National Natural Science Foundation of China [51009017, 51379002, 61074096]
  2. Applied Basic Research Funds through the Ministry of Transport of China [2012-329-225-060]
  3. China Postdoctoral Science Foundation [2012M520629]
  4. Program for Liaoning Excellent Talents in University [LJQ2013055]
  5. Fundamental Research Funds for the Central Universities of China [2009QN025, 2011JC002, 3132013025]

Abstract

In this paper, the traditional single-hidden-layer feedforward network (SLFN) is extended to a novel generalized SLFN (GSLFN) by employing polynomial functions of the inputs as the output weights connecting randomly generated hidden units to the corresponding output nodes. The significant contributions of this paper are as follows: 1) a primal GSLFN (P-GSLFN) is implemented using randomly generated hidden nodes and polynomial output weights, whereby the regression matrix is augmented with all or a subset of the input variables and only the polynomial coefficients need to be estimated; 2) a simplified GSLFN (S-GSLFN) is realized by decomposing the polynomial output weights of the P-GSLFN into randomly generated polynomial nodes and tunable output weights; 3) both the P- and S-GSLFN achieve universal approximation when the output weights are tuned by ridge regression estimators; and 4) the developed batch and online sequential ridge ELM learning algorithms (BR-ELM and OSR-ELM) guarantee high generalization performance and fast learning for the proposed GSLFNs. Comprehensive simulation studies and comparisons with standard SLFNs are carried out on real-world regression benchmark data sets. The results demonstrate that GSLFNs trained with BR-ELM and OSR-ELM are superior to standard SLFNs in terms of accuracy, training speed, and structural compactness.
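
As a rough illustration of the P-GSLFN idea summarized above, the following Python sketch builds a random tanh hidden layer, augments the regression matrix with products of the hidden outputs and the input variables (i.e., a first-order polynomial output weight per hidden node), and estimates the polynomial coefficients by batch ridge regression in the spirit of BR-ELM. The function names, the tanh activation, the first-order polynomial, and all parameter defaults are illustrative assumptions, not taken from the paper.

    import numpy as np

    def pgslfn_fit(X, T, n_hidden=50, ridge=1e-3, seed=0):
        # Illustrative sketch (not the authors' code): random hidden layer,
        # first-order polynomial output weights, batch ridge-regression fit.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W = rng.uniform(-1.0, 1.0, size=(d, n_hidden))   # random input weights
        b = rng.uniform(-1.0, 1.0, size=n_hidden)        # random hidden biases
        H = np.tanh(X @ W + b)                           # hidden-layer outputs
        # Augmented regression matrix: columns h_i and h_i * x_j, so the
        # effective output weight of node i is beta_i0 + sum_j beta_ij * x_j.
        A = np.hstack([H] + [H * X[:, [j]] for j in range(d)])
        # Ridge-regression estimate of the polynomial coefficients.
        beta = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ T)
        return W, b, beta

    def pgslfn_predict(X, W, b, beta):
        H = np.tanh(X @ W + b)
        A = np.hstack([H] + [H * X[:, [j]] for j in range(X.shape[1])])
        return A @ beta

An online sequential variant in the spirit of OSR-ELM would update beta recursively from chunks of data (e.g., a recursive least-squares update on the augmented matrix A) rather than solving the batch system; that extension is omitted here for brevity.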
