Article

Enhanced RBF neural network metamodelling approach assisted by sliced splitting-based K-fold cross-validation and its application for the stiffened cylindrical shells

Journal

AEROSPACE SCIENCE AND TECHNOLOGY
Volume 124

Publisher

ELSEVIER FRANCE-EDITIONS SCIENTIFIQUES MEDICALES ELSEVIER
DOI: 10.1016/j.ast.2022.107534

Keywords

RBFNN; Width parameters; Sliced splitting strategy; K-fold cross-validation; SSKCV; Stiffened cylindrical shells

Funding

  1. National Key R&D Program of China [2017YFB0306200]
  2. National Natural Science Foundation of China [11902348]
  3. Research Project of the National University of Defense Technology [ZK19-11, ZK20-27]


This paper introduces a novel sliced splitting-based K-fold cross-validation (SSKCV) method to construct an improved radial basis function neural network (RBFNN) metamodel with enhanced generalization capabilities. The SSKCV method overcomes the high variance and loss of information in observed sample points, and the introduction of average expected prediction error (AEPE) as the loss function further evaluates the generalization error of the RBFNN metamodel. The SSKCV method has been demonstrated to achieve excellent generalization performance in high dimensional numerical benchmarks and stiffened cylindrical shells, outperforming other SSKCV variants and blind Kriging.
To build an enhanced radial basis function neural network (RBFNN) metamodel with improved generalization capabilities, this paper presents a novel sliced splitting-based K-fold cross-validation (SSKCV) method. Cross-validation is a promising method for constructing the RBFNN metamodel but suffers from high variance and loss of information in the observed sample points. To overcome this intrinsic deficiency, a sliced splitting strategy is proposed to allocate the observed sample points into K mutually exclusive and collectively exhaustive folds as evenly as possible. Further, the novel average expected prediction error (AEPE), analogous to a bias-variance tradeoff, is introduced as the loss function in SSKCV, making it better able to evaluate the generalization error of the RBFNN metamodel. Finally, the optimal parameters of the RBFNN metamodels are determined by the SSKCV method, which enhances metamodelling efficiency and precision. Compared with other variants of SSKCV and the state-of-the-art blind Kriging, the benefits of the SSKCV method in achieving excellent generalization performance are validated both on high-dimensional numerical benchmarks and on the stiffened cylindrical shells. (c) 2022 Elsevier Masson SAS. All rights reserved.
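The workflow the abstract describes — tuning the RBFNN width parameter by K-fold cross-validation over the observed sample points — can be sketched as follows. This is a generic illustration, not the authors' SSKCV implementation: the paper's sliced splitting strategy and AEPE loss are approximated here by a standard random K-fold split with a mean-squared-error loss, and a single shared Gaussian width is assumed.

```python
import numpy as np

def rbf_fit_predict(X_train, y_train, X_test, width):
    """Fit a Gaussian-RBF interpolant with a shared width and predict at X_test."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * width ** 2))
    # Small ridge term keeps the Gram matrix well conditioned
    G = kernel(X_train, X_train) + 1e-8 * np.eye(len(X_train))
    w = np.linalg.solve(G, y_train)
    return kernel(X_test, X_train) @ w

def kfold_cv_error(X, y, width, K=5, seed=0):
    """Average held-out MSE over K folds (plain K-fold; the paper's sliced
    splitting and AEPE loss would replace the split and the metric here)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, K)
    errs = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        pred = rbf_fit_predict(X[train], y[train], X[test], width)
        errs.append(np.mean((pred - y[test]) ** 2))
    return float(np.mean(errs))

# Toy 2-D response; pick the width that minimizes the cross-validation error
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(60, 2))
y = np.sin(3.0 * X[:, 0]) + X[:, 1] ** 2
widths = [0.1, 0.3, 0.5, 1.0, 2.0]
best_width = min(widths, key=lambda w: kfold_cv_error(X, y, w))
```

The same loop structure applies regardless of how the folds are formed, which is where the sliced splitting strategy differs: it spreads the sample points across the K folds as evenly as possible to reduce the variance of the error estimate.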

