Article

A Fast Reduced Kernel Extreme Learning Machine

Journal

NEURAL NETWORKS
Volume 76, Pages 29-38

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2015.10.006

Keywords

Extreme learning machine; Kernel method; Support vector machine; RBF network

Funding

  1. National Research Foundation (NRF) Singapore under the Corp Lab@University Scheme
  2. Innovation fund research group [61221063]
  3. National Science Foundation of China [61572399, 61532015]
  4. Shaanxi New Star of Science & Technology [2013 KJXX-29]
  5. New Star Team of Xi'an University of Posts & Telecommunications
  6. Provincial Key Disciplines Construction Fund of General Institutions of Higher Education in Shaanxi


In this paper, we present a fast and accurate kernel-based supervised algorithm referred to as the Reduced Kernel Extreme Learning Machine (RKELM). In contrast to the Support Vector Machine (SVM) or Least Squares SVM (LS-SVM), which identify the support vectors or weight vectors iteratively, the proposed RKELM randomly selects a subset of the available data samples as support vectors (or mapping samples). By avoiding the iterative steps of SVM, significant savings in training cost can be attained, especially on big datasets. RKELM is established on a rigorous proof of the universal approximation capability of the reduced kernel-based single-hidden-layer feedforward network (SLFN). In particular, we prove that RKELM can approximate any nonlinear function accurately provided sufficiently many support vectors are selected. Experimental results on a wide variety of real-world small- and large-instance applications, covering binary classification, multi-class classification, and regression, show that RKELM achieves generalization performance competitive with SVM/LS-SVM at only a fraction of the computational effort. (C) 2015 Elsevier Ltd. All rights reserved.
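The abstract's core idea, i.e. mapping all training samples through a kernel against a small random subset and then solving a single regularized least-squares problem instead of an iterative SVM optimization, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names, the RBF kernel choice, and the hyperparameters (`n_support`, `C`, `gamma`) are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Pairwise RBF kernel values between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rkelm_train(X, Y, n_support=20, C=100.0, gamma=1.0, seed=0):
    # Randomly pick a subset of training samples as kernel "support" points
    # (no iterative support-vector selection), then solve a ridge-regularized
    # least-squares problem for the output weights in closed form.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_support, replace=False)
    Xs = X[idx]
    K = rbf_kernel(X, Xs, gamma)                 # N x n_support kernel map
    beta = np.linalg.solve(K.T @ K + np.eye(n_support) / C, K.T @ Y)
    return Xs, beta

def rkelm_predict(Xs, beta, Xtest, gamma=1.0):
    # Map test points against the stored support subset and apply the weights.
    return rbf_kernel(Xtest, Xs, gamma) @ beta
```

The training cost is dominated by one N x n_support kernel evaluation and an n_support x n_support linear solve, which is where the claimed speedup over iterative SVM training comes from.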

