Article

Robust boosting neural networks with random weights for multivariate calibration of complex samples

Journal

ANALYTICA CHIMICA ACTA
Volume 1009, Pages 20-26

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.aca.2018.01.013

Keywords

Ensemble modeling; Boosting; Neural networks with random weights; Extreme learning machine; Outlier; Complex samples

Funding

  1. National Basic Research Program of China [2014CB660813]
  2. National Natural Science Foundation of China [21405110, 21603160, 21676199]

Abstract

Neural networks with random weights (NNRW) have been used for regression because of their excellent performance. However, NNRW is sensitive to outliers and somewhat unstable when dealing with real-world complex samples. To overcome these drawbacks, a new method called robust boosting NNRW (RBNNRW) is proposed by integrating a robust version of boosting with NNRW. The method sequentially builds a large number of NNRW sub-models on robustly reweighted resamples of the original training set and then aggregates their predictions by a weighted median. The performance of RBNNRW is tested on three spectral datasets of wheat, light gas oil and diesel fuel samples. For comparison, the conventional PLS, NNRW and boosting NNRW (BNNRW) methods are also investigated. The results demonstrate that introducing robust boosting greatly enhances the stability and accuracy of NNRW. Moreover, RBNNRW is superior to BNNRW, particularly when outliers are present. (c) 2018 Elsevier B.V. All rights reserved.
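The abstract gives only the outline of the algorithm: NNRW sub-models trained sequentially on robustly reweighted resamples, with predictions combined by a weighted median. It does not specify the reweighting rule or the sub-model weights, so the NumPy sketch below is only an illustrative reconstruction under stated assumptions, not the authors' implementation: a MAD-based residual cutoff stands in for the paper's robust reweighting, each sub-model is weighted by the inverse of its median absolute training residual, and all names (fit_nnrw, fit_rbnnrw, cutoff, ...) are hypothetical.

import numpy as np

def fit_nnrw(X, y, n_hidden=50, rng=None):
    # One NNRW/ELM sub-model: fixed random hidden layer, analytic output weights.
    # Inputs are assumed to be scaled (e.g., autoscaled spectra).
    rng = np.random.default_rng() if rng is None else rng
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights (not trained)
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random hidden biases (not trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ y                             # output weights by least squares
    return (W, b, beta)

def predict_nnrw(model, X):
    W, b, beta = model
    return (1.0 / (1.0 + np.exp(-(X @ W + b)))) @ beta

def weighted_median(values, weights):
    # Weighted median of a 1-D array with nonnegative weights.
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cdf = np.cumsum(w)
    return v[np.searchsorted(cdf, 0.5 * cdf[-1])]

def fit_rbnnrw(X, y, n_models=100, n_hidden=50, cutoff=3.0, seed=0):
    # Robust-boosting ensemble of NNRW sub-models (illustrative reweighting scheme).
    rng = np.random.default_rng(seed)
    n = len(y)
    prob = np.full(n, 1.0 / n)                  # start from uniform sampling probabilities
    models, model_w = [], []
    for _ in range(n_models):
        idx = rng.choice(n, size=n, replace=True, p=prob)
        m = fit_nnrw(X[idx], y[idx], n_hidden, rng)
        resid = np.abs(y - predict_nnrw(m, X))
        # Robust residual scale via the median absolute deviation (MAD).
        scale = 1.4826 * np.median(np.abs(resid - np.median(resid))) + 1e-12
        # Emphasise poorly fitted samples (boosting), but give near-zero sampling
        # probability to samples whose residuals look like outliers.
        w = np.where(resid <= cutoff * scale, resid + 1e-12, 1e-12)
        prob = w / w.sum()
        models.append(m)
        model_w.append(1.0 / (np.median(resid) + 1e-12))  # better sub-models count more
    return models, np.asarray(model_w)

def predict_rbnnrw(models, model_w, X):
    preds = np.column_stack([predict_nnrw(m, X) for m in models])  # (n_samples, n_models)
    return np.array([weighted_median(row, model_w) for row in preds])

With spectra in X (samples x wavelengths, suitably scaled) and reference values in y, fit_rbnnrw(X, y) followed by predict_rbnnrw(models, model_w, X_test) would give ensemble predictions that could then be compared against plain NNRW, non-robust boosting NNRW and PLS, as done in the paper.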
