Article

Interpretation of ensemble learning to predict water quality using explainable artificial intelligence

Journal

SCIENCE OF THE TOTAL ENVIRONMENT
Volume 832, Issue -, Pages -

Publisher

ELSEVIER
DOI: 10.1016/j.scitotenv.2022.155070

Keywords

Algal management; Ensemble model; Machine learning; Water quality; XGBoost

Funding

  1. National Research Foundation of Korea (NRF) - Korea government (MSIT) [2020R1G1A1008377]
  2. Korea Environment Industry & Technology Institute (KEITI) - Korea Ministry of Environment (MOE) [RE202101055]
  3. National Research Foundation of Korea [2020R1G1A1008377] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

This study developed an XGBoost ensemble machine learning model to predict chlorophyll-a (Chl-a) concentration and explored the effect of input variable selection on model performance. Using explainable artificial intelligence (XAI) algorithms, the study provided interpretable analyses of model predictions. The results showed that selecting input variables based on the Shapley value (SHAP) algorithm improved model stability and helped reduce the cost of water quality analysis.
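The SHAP-based input variable selection summarized above can be sketched in a simplified form. The snippet below computes exact Shapley values for a plain linear model with independent features, for which the Shapley value of feature i at a sample x is w_i * (x_i - mean_i), and ranks inputs by mean absolute SHAP value. The data and weights are illustrative, not from the study; a tree ensemble such as XGBoost would instead require a tree-specific SHAP algorithm (e.g., TreeSHAP).

```python
# Sketch: rank input variables by mean |SHAP| value.
# Assumes a linear model f(x) = w . x + b with independent features,
# where the exact Shapley value of feature i is w[i] * (x[i] - mean[i]).
# Toy data only -- not the eighteen variables used in the paper.

def shap_ranking(X, w):
    """Return feature indices sorted by descending mean |SHAP|,
    along with the mean |SHAP| value of each feature."""
    n, d = len(X), len(w)
    means = [sum(row[i] for row in X) / n for i in range(d)]
    importance = []
    for i in range(d):
        # Shapley value of feature i for every sample in X
        phi = [w[i] * (row[i] - means[i]) for row in X]
        importance.append(sum(abs(p) for p in phi) / n)
    order = sorted(range(d), key=lambda i: -importance[i])
    return order, importance

# Feature 1 combines a moderate weight with a large spread, so it
# contributes the most to the model output on average.
X = [[1.0, 10.0, 5.0], [2.0, 30.0, 5.1], [3.0, 20.0, 4.9]]
w = [0.5, 0.2, 3.0]
order, imp = shap_ranking(X, w)  # order -> [1, 0, 2]
```

In the selection procedure described in the abstract, the top-ranked variables from such an ordering would be retained as model inputs, which is what allows on-site monitoring to be restricted to a smaller, cheaper set of measurements.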
Algal blooms are a significant issue in freshwater quality management; in particular, predicting algal concentration is essential to maintaining the safety of the drinking water supply system. The chlorophyll-a (Chl-a) concentration is a commonly used indicator for estimating algal concentration. In this study, an XGBoost (XGB) ensemble machine learning (ML) model was developed from eighteen input variables to predict Chl-a concentration. The composition and pretreatment of the model's input variables are important factors for improving model performance. Explainable artificial intelligence (XAI) is an emerging area of ML modeling that provides a reasonable interpretation of model performance. The effect of input variable selection on model performance was estimated, where the priority of input variable selection was determined using three indices: Shapley value (SHAP), feature importance (FI), and variance inflation factor (VIF). SHAP analysis is an XAI algorithm designed to compute the relative importance of input variables with consistency, providing an interpretable analysis of model predictions. The XGB models simulated with independent variables selected using the three indices were evaluated with the root mean square error (RMSE), the RMSE-observation standard deviation ratio (RSR), and the Nash-Sutcliffe efficiency (NSE). This study shows that the model exhibited the most stable performance when the priority of input variables was determined by SHAP. This implies that on-site monitoring can be designed to collect only the input variables selected by the SHAP analysis, reducing the overall cost of water quality analysis. The independent variables were further analyzed using SHAP summary, force, target, and partial dependence plots to provide an understandable interpretation of the XGB model's performance.
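The three evaluation metrics named in the abstract have standard definitions, sketched below with illustrative data (not results from the study). Note that NSE = 1 - RSR^2 when the population standard deviation is used, so the two criteria rank models identically.

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated values."""
    n = len(obs)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)

def rsr(obs, sim):
    """RMSE-observation standard deviation ratio: RMSE / std(obs)."""
    mean_o = sum(obs) / len(obs)
    std_o = math.sqrt(sum((o - mean_o) ** 2 for o in obs) / len(obs))
    return rmse(obs, sim) / std_o

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of obs.
    1.0 is a perfect fit; <= 0 means no better than the mean of obs."""
    mean_o = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    sst = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - sse / sst

# Toy observed/simulated series (e.g., Chl-a in mg/m^3), purely illustrative.
obs = [2.0, 4.0, 6.0, 8.0]
sim = [2.5, 3.5, 6.5, 7.5]
# rmse(obs, sim) -> 0.5, nse(obs, sim) -> 0.95
```

Lower RMSE and RSR, and NSE closer to 1, indicate a better fit; the study applied these criteria to compare XGB models built from the SHAP-, FI-, and VIF-ranked variable subsets.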
While XAI is still in the early stages of development, this study successfully demonstrated a good example of XAI application to improve the interpretation of machine learning model performance in predicting water quality.
