Article

A semi-supervised online sequential extreme learning machine method

Journal

NEUROCOMPUTING
Volume 174, Issue -, Pages 168-178

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2015.04.102

Keywords

Online Sequential ELM (OS-ELM); Semi-supervised ELM (SS-ELM); Semi-supervised online sequential ELM (SOS-ELM)

Funding

  1. Natural Science Foundation of China [61375059, 61175115]
  2. Beijing Natural Science Foundation [4122004, 4152005]
  3. Specialized Research Fund for the Doctoral Program of Higher Education [20121103110031]
  4. Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions [CITTCD201304035]
  5. Special Training Program for Teacher Development in Beijing Higher Education - Abroad Training Program for Senior Visiting Scholars, Beijing Higher Education Teacher Training Center [067145301400]
  6. Jing-Hua Talents Project of Beijing University of Technology [2014-JH-L06]
  7. International Communication Ability Development Plan for Young Teachers of Beijing University of Technology [2014-16]

Abstract

This paper proposes a learning algorithm called Semi-supervised Online Sequential ELM (SOS-ELM). It aims to provide a solution for streaming-data applications by learning only from the newly arrived observations, called a chunk. In addition, SOS-ELM can utilize both labeled and unlabeled training data by combining the advantages of two existing algorithms: Online Sequential ELM (OS-ELM) and Semi-Supervised ELM (SS-ELM). The rationale behind our algorithm is to exploit the optimality condition that balances the empirical risk and structural risk used by SS-ELM, in combination with block-wise matrix calculation similar to OS-ELM. Efficient implementation of the SOS-ELM algorithm is made viable by the additional assumption that there is negligible structural relationship between chunks from different times. Experiments have been performed on standard benchmark problems for regression, balanced binary classification, unbalanced binary classification and multi-class classification, comparing the performance of the proposed SOS-ELM with OS-ELM and SS-ELM. The experimental results show that SOS-ELM outperforms OS-ELM in generalization performance with similar training speed, and also outperforms SS-ELM with much lower supervision overheads. (C) 2015 Elsevier B.V. All rights reserved.
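The chunk-wise block update the abstract builds on can be sketched in code. The following is a minimal illustration of the standard OS-ELM recursive least-squares update (initialization on a first batch, then per-chunk updates via the Woodbury identity) on a hypothetical toy regression stream; it deliberately omits the graph-Laplacian structural-risk term that SS-ELM and SOS-ELM add, and all data, sizes, and names (`W`, `b`, `hidden`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(X, W, b):
    # Sigmoid hidden-layer output matrix H (random-feature map of ELM)
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

L = 20                        # number of hidden neurons (illustrative)
W = rng.normal(size=(1, L))   # random input weights, fixed after generation
b = rng.normal(size=L)        # random biases

# Initialization phase on a first labeled batch:
# P = (H0^T H0)^-1,  beta = P H0^T T0   (small ridge term for stability)
X0 = rng.uniform(-3, 3, size=(50, 1))
T0 = np.sin(X0)               # toy target: y = sin(x)
H0 = hidden(X0, W, b)
P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(L))
beta = P @ H0.T @ T0

# Sequential phase: each newly arrived chunk updates (P, beta) without
# revisiting past data, using the Woodbury matrix identity.
for _ in range(10):
    Xk = rng.uniform(-3, 3, size=(20, 1))
    Tk = np.sin(Xk)
    Hk = hidden(Xk, W, b)
    K = P @ Hk.T @ np.linalg.inv(np.eye(len(Xk)) + Hk @ P @ Hk.T)
    P = P - K @ Hk @ P
    beta = beta + P @ Hk.T @ (Tk - Hk @ beta)

# Generalization check on a held-out grid
Xtest = np.linspace(-3, 3, 100).reshape(-1, 1)
rmse = float(np.sqrt(np.mean((hidden(Xtest, W, b) @ beta - np.sin(Xtest)) ** 2)))
print(rmse)
```

SOS-ELM would extend each chunk update with an unlabeled-data penalty from the SS-ELM objective; the paper's assumption of negligible structural relationship between chunks is what keeps that extension in the same recursive form shown here.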
