Article

Explainable AI to Predict Male Fertility Using Extreme Gradient Boosting Algorithm with SMOTE

Journal

ELECTRONICS
Volume 12, Issue 1

Publisher

MDPI
DOI: 10.3390/electronics12010015

Keywords

explainability techniques; extreme gradient boosting (XGB); SMOTE; male fertility


Infertility is a common problem worldwide, and male factors account for roughly 40% to 50% of cases. Existing artificial intelligence (AI) systems are often not human interpretable; clinicians cannot see how these data-analytical tools reach their decisions, which limits the tools' adoption in healthcare. Explainable AI makes such systems transparent and traceable, strengthening users' trust and confidence in the resulting decisions. The main contribution of this study is an explainable model for male fertility prediction. Nine features describing lifestyle and environmental factors are used to develop the prediction model. Five AI tools, namely support vector machine, adaptive boosting, conventional extreme gradient boosting (XGB), random forest, and extra trees algorithms, are deployed on both the imbalanced dataset and a SMOTE-balanced version of it. To make the model trustworthy, explainable AI techniques are applied: (1) local interpretable model-agnostic explanations (LIME) and (2) Shapley additive explanations (SHAP). Additionally, ELI5 is used to inspect feature importance. XGB outperformed the other models, obtaining an AUC of 0.98, which is optimal compared with existing AI systems.
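The abstract outlines a SMOTE-plus-XGBoost pipeline whose predictions are explained with SHAP, LIME, and ELI5. The following is a minimal, hypothetical Python sketch of such a pipeline; the synthetic nine-feature data, the hyperparameters, and the class names ("normal"/"altered") are illustrative placeholders and are not taken from the paper.

```python
# Sketch: SMOTE oversampling + XGBoost + AUC evaluation, then SHAP/LIME/ELI5
# explanations. Placeholder data stands in for the authors' fertility dataset.
import numpy as np
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier
import shap
from lime.lime_tabular import LimeTabularExplainer
import eli5

# Synthetic, imbalanced stand-in data with nine features.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(300, 9)),
                 columns=[f"feature_{i}" for i in range(1, 10)])
y = (rng.random(300) < 0.15).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Balance only the training split with SMOTE.
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_train, y_train)

# Train XGBoost and report AUC on the untouched test split.
model = XGBClassifier(eval_metric="logloss", random_state=42)
model.fit(X_bal, y_bal)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Test AUC: {auc:.2f}")

# SHAP: per-feature attributions for the tree model.
shap_values = shap.TreeExplainer(model).shap_values(X_test)
shap.summary_plot(shap_values, X_test)

# LIME: local explanation of a single prediction.
lime_explainer = LimeTabularExplainer(
    np.asarray(X_bal), feature_names=X.columns.tolist(),
    class_names=["normal", "altered"], mode="classification")
lime_exp = lime_explainer.explain_instance(
    X_test.iloc[0].values, model.predict_proba, num_features=9)
print(lime_exp.as_list())

# ELI5: global feature importances of the fitted model.
print(eli5.format_as_text(eli5.explain_weights(model)))
```

Note that this sketch applies SMOTE only to the training split so the reported AUC reflects the original class distribution; the abstract does not state whether the authors followed the same protocol.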

