Article

Deep stacked stochastic configuration networks for lifelong learning of non-stationary data streams

Journal

INFORMATION SCIENCES
Volume 495, Pages 150-174

Publisher

ELSEVIER SCIENCE INC
DOI: 10.1016/j.ins.2019.04.055

Keywords

Stochastic configuration networks; Deep learning; Non-stationary data streams

Funding

  1. Ministry of Education, Republic of Singapore, Tier 1 Research Grant
  2. NTU Start-up Grant

Abstract

The concept of stochastic configuration networks (SCNs) offers a fast framework with a universal approximation guarantee for lifelong learning of non-stationary data streams. Its adaptive scope selection property enables proper random generation of hidden unit parameters, advancing conventional randomized approaches that are constrained to a fixed scope of random parameters. This paper proposes the deep stacked stochastic configuration network (DSSCN) for continual learning of non-stationary data streams, with two major contributions: 1) DSSCN features a self-constructing methodology for a deep stacked network structure in which hidden units and hidden layers are extracted automatically from continuously generated data streams; 2) the concept of SCN is extended to randomly assign the inverse covariance matrix of the multivariate Gaussian function in the hidden node addition step, bypassing its computationally prohibitive tuning phase. Numerical evaluation and comparison with prominent data stream algorithms under two procedures, the periodic hold-out and the prequential test-then-train protocols, demonstrate the advantage of the proposed methodology. (C) 2019 Elsevier Inc. All rights reserved.
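For illustration, the following Python sketch (not the authors' code; names such as add_scn_node, lambda_scales, and the acceptance threshold r are assumptions) shows how SCN-style hidden node addition with adaptive scope selection might look: candidate Gaussian node parameters, including a randomly assigned diagonal inverse covariance, are drawn from progressively wider scopes until an SCN-type inequality on the current residual is satisfied.

    import numpy as np

    def add_scn_node(X, residual, lambda_scales=(1, 5, 10, 50), n_candidates=20, r=0.9):
        """Return (center, inv_cov_diag) for a new Gaussian node, or None if no candidate qualifies."""
        n_samples, n_features = X.shape
        best, best_score = None, -np.inf
        for lam in lambda_scales:                                      # adaptive scope selection
            for _ in range(n_candidates):
                center = np.random.uniform(-lam, lam, n_features)      # random center
                inv_cov = np.random.uniform(0.0, lam, n_features)      # random diagonal inverse covariance
                diff = X - center
                h = np.exp(-0.5 * np.sum(diff**2 * inv_cov, axis=1))   # multivariate Gaussian activation
                # SCN-type supervisory condition: the residual's projection onto h must be large enough
                score = (residual @ h) ** 2 / (h @ h + 1e-12) - (1.0 - r) * (residual @ residual)
                if score > best_score:
                    best, best_score = (center, inv_cov), score
            if best_score >= 0:                                        # constraint met within this scope
                return best
        return None                                                    # reject; no node added

The prequential test-then-train protocol mentioned above can likewise be sketched in a few lines: each incoming batch is first used for testing and only afterwards for model updates (model.predict and model.partial_fit are placeholder interfaces, not part of the paper).

    def prequential_evaluate(model, stream):
        """Test-then-train protocol: score each batch before learning from it."""
        accuracies = []
        for X_batch, y_batch in stream:            # non-stationary data stream
            y_pred = model.predict(X_batch)        # test first
            accuracies.append(np.mean(y_pred == y_batch))
            model.partial_fit(X_batch, y_batch)    # then train
        return accuracies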

