4.7 Article

Machine learning models to predict the tunnel wall convergence

Journal

TRANSPORTATION GEOTECHNICS
Volume 41, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.trgeo.2023.101022

Keywords

Tunnel wall convergence; Machine Learning; Jellyfish search optimizer; Prediction; Highway


Accurate prediction of tunnel wall convergence was achieved using six popular and reliable machine learning models and 142 sets of highway tunnel convergence data. The JSO-RF model demonstrated superior predictive performance and identified φ_rm, E_rm, and H as the most important input variables.
Deformation and damage of the rock mass induced by excavation operations can result in tunnel wall convergence. Accurate prediction of this convergence is crucial for the safe, economical, and efficient construction of tunnel projects. In this study, six popular and reliable machine learning (ML) models, namely the back-propagation neural network (BPNN), general regression neural network (GRNN), extreme learning machine (ELM), kernel extreme learning machine (KELM), least squares support vector machine (LSSVR), and random forest (RF), were selected to predict tunnel wall convergence. The jellyfish search optimizer (JSO) was used to find the optimal hyperparameters of these models. A total of 142 sets of highway tunnel convergence data were collected for model training and testing. Six parameters, namely overburden thickness (H), rock mass rating index (RMR), rock mass quality index (Q), rock mass cohesion (C_rm), deformation modulus of the rock mass (E_rm), and internal friction angle of the rock mass (φ_rm), were used as input variables, and the mean convergence of the tunnel wall (Y) was taken as the output variable. Mean absolute error (MAE), root mean squared error (RMSE), coefficient of determination (R²), and variance accounted for (VAF) were used as evaluation indicators to compare the predictive performance of the proposed models. The results from the various evaluation methods demonstrate that the overall performance of the JSO-RF model is superior, with MAE, RMSE, R², and VAF values of 4.288, 6.219, 0.917, and 89.480 in the training phase, and 3.141, 3.958, 0.939, and 93.164 in the testing phase. Compared with RF, multivariate regression (MVR), and empirical equations, the JSO-RF model also exhibits better predictive performance. The importance analysis of the input variables showed that φ_rm, E_rm, and H were more important than the other variables.
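The Python sketch below illustrates the kind of pipeline the abstract describes: training a random forest on the six input variables and scoring it with MAE, RMSE, R², and VAF. It is a minimal illustration, not the authors' code: the CSV file name, column names, search ranges, and train/test split ratio are assumptions, and scikit-learn's RandomizedSearchCV is used here as a generic hyperparameter-search stand-in for the jellyfish search optimizer (JSO) employed in the paper.

# Minimal sketch (assumed data layout, not the authors' implementation).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import RandomizedSearchCV, train_test_split

def vaf(y_true, y_pred):
    """Variance accounted for, in percent."""
    return (1.0 - np.var(np.asarray(y_true) - np.asarray(y_pred)) / np.var(np.asarray(y_true))) * 100.0

# Hypothetical dataset: 142 rows with the six inputs and the mean convergence Y.
data = pd.read_csv("tunnel_convergence.csv")           # assumed file name
X = data[["H", "RMR", "Q", "C_rm", "E_rm", "phi_rm"]]  # assumed column names
y = data["Y"]

# Assumed 70/30 split; the paper's actual partition may differ.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Generic random search over RF hyperparameters (stand-in for JSO tuning).
search = RandomizedSearchCV(
    RandomForestRegressor(random_state=42),
    param_distributions={
        "n_estimators": list(range(50, 501, 50)),
        "max_depth": list(range(2, 21)),
        "min_samples_leaf": list(range(1, 11)),
    },
    n_iter=50,
    cv=5,
    scoring="neg_root_mean_squared_error",
    random_state=42,
)
search.fit(X_train, y_train)
model = search.best_estimator_

# Report the four evaluation indicators used in the paper for both phases.
for name, X_, y_ in [("train", X_train, y_train), ("test", X_test, y_test)]:
    pred = model.predict(X_)
    print(f"{name}: MAE={mean_absolute_error(y_, pred):.3f}, "
          f"RMSE={np.sqrt(mean_squared_error(y_, pred)):.3f}, "
          f"R2={r2_score(y_, pred):.3f}, VAF={vaf(y_, pred):.3f}")

# Variable importance (the paper highlights phi_rm, E_rm, and H).
print(dict(zip(X.columns, model.feature_importances_)))

Note that VAF is computed manually because it is not provided by scikit-learn; the formula VAF = (1 - Var(y - ŷ)/Var(y)) × 100 is the conventional definition.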
