Article

Interpretation of ensemble learning to predict water quality using explainable artificial intelligence

Journal

SCIENCE OF THE TOTAL ENVIRONMENT
Volume 832

Publisher

ELSEVIER
DOI: 10.1016/j.scitotenv.2022.155070

Keywords

Algal management; Ensemble model; Machine learning; Water quality; XGBoost

Funding

  1. National Research Foundation of Korea (NRF) - Korea government (MSIT) [2020R1G1A1008377]
  2. Korea Environment Industry & Technology Institute (KEITI) - Korea Ministry of Environment (MOE) [RE202101055]
  3. National Research Foundation of Korea [2020R1G1A1008377] Funding Source: Korea Institute of Science & Technology Information (KISTI), National Science & Technology Information Service (NTIS)

Abstract

This study developed an XGBoost ensemble machine learning model to predict chlorophyll-a (Chl-a) concentration and explored the effect of input variable selection on model performance. Using explainable artificial intelligence (XAI) algorithms, the study provided interpretable analyses of model predictions. The results showed that selecting input variables based on the Shapley value (SHAP) algorithm improved model stability and helped reduce the cost of water quality analysis.
Algal blooms are a significant issue in managing freshwater quality; in particular, predicting algal concentration is essential to maintaining the safety of the drinking water supply system. The chlorophyll-a (Chl-a) concentration is a commonly used indicator for estimating algal concentration. In this study, an XGBoost ensemble machine learning (ML) model was developed from eighteen input variables to predict Chl-a concentration. The composition and pretreatment of the model's input variables are important factors for improving model performance. Explainable artificial intelligence (XAI) is an emerging area of ML modeling that provides a reasonable interpretation of model performance. The effect of input variable selection on model performance was estimated, where the priority of input variable selection was determined using three indices: Shapley value (SHAP), feature importance (FI), and variance inflation factor (VIF). SHAP analysis is an XAI algorithm designed to compute the relative importance of input variables with consistency, providing an interpretable analysis of model predictions. The XGB models simulated with independent variables selected using the three indices were evaluated with the root mean square error (RMSE), the RMSE-observation standard deviation ratio (RSR), and the Nash-Sutcliffe efficiency (NSE). This study shows that the model exhibited the most stable performance when the priority of input variables was determined by SHAP. This implies that on-site monitoring can be designed to collect only the input variables selected by the SHAP analysis, reducing the overall cost of water quality analysis. The independent variables were further analyzed using SHAP summary plots, force plots, target plots, and partial dependency plots to provide an understandable interpretation of the XGB model's performance.
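The three evaluation criteria named in the abstract (RMSE, RSR, NSE) are standard goodness-of-fit measures for water quality models. As an illustration only (the function names and sample data below are not from the paper), they can be computed in plain Python:

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated values."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def rsr(obs, sim):
    """RMSE-observation standard deviation ratio (lower is better)."""
    mean = sum(obs) / len(obs)
    sd = math.sqrt(sum((o - mean) ** 2 for o in obs) / len(obs))
    return rmse(obs, sim) / sd

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (1.0 indicates a perfect fit)."""
    mean = sum(obs) / len(obs)
    ss_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1.0 - ss_err / ss_tot

# Hypothetical Chl-a observations vs. a model that over-predicts by 1 unit:
obs = [1.0, 2.0, 3.0, 4.0]
sim = [2.0, 3.0, 4.0, 5.0]
print(rmse(obs, sim))  # -> 1.0
print(nse(obs, sim))   # -> 0.2
```

A lower RSR and an NSE closer to 1 both indicate better agreement with observations, which is how the study's SHAP-, FI-, and VIF-selected models could be compared.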
While XAI is still in the early stages of development, this study demonstrated a successful application of XAI to improving the interpretation of machine learning model performance in predicting water quality.
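For readers unfamiliar with the Shapley value underlying SHAP, the sketch below computes exact Shapley values for a toy coalitional value function. It is purely illustrative: SHAP itself uses efficient approximations for tree ensembles such as XGBoost, and none of the names here come from the paper. The idea carried over to feature selection is that inputs are ranked by their (absolute) Shapley contribution.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values: value(frozenset of players) -> coalition payoff.

    Each player's value is its average marginal contribution over all
    orderings, weighted by |S|! * (n - |S| - 1)! / n!.
    """
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for coal in combinations(others, k):
                s = frozenset(coal)
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (value(s | {p}) - value(s))
        phi[p] = total
    return phi

# Toy additive "model": each feature contributes its weight independently,
# so the Shapley value of each feature equals its weight.
weights = {"temperature": 2.0, "phosphorus": 3.0}
phi = shapley_values(list(weights), lambda s: sum(weights[p] for p in s))
print(phi)  # temperature -> 2.0, phosphorus -> 3.0
```

In SHAP-style feature selection, features would then be ranked by the magnitude of these contributions; the study found that monitoring only the top-ranked variables kept model performance stable while reducing analysis cost.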

