Journal
ELECTRONICS
Volume 12, Issue 10
Publisher
MDPI
DOI: 10.3390/electronics12102215
Keywords
explainable AI (XAI); machine learning; fuzzy decision tree; LORE
Multi-class classification is a fundamental task in Machine Learning. However, complex models can be viewed as black boxes, making it difficult to gain insight into how the model makes its predictions and build trust in its decision-making process. This paper presents a novel method called Multi-Class Fuzzy-LORE (mcFuzzy-LORE) for explaining the decisions made by multi-class fuzzy-based classifiers such as Fuzzy Random Forests (FRF). mcFuzzy-LORE is an adaptation of the Fuzzy-LORE method that uses fuzzy decision trees as an alternative to classical decision trees, providing interpretable, human-readable rules that describe the reasoning behind the model's decision for a specific input. The proposed method was evaluated on a private dataset that was used to train an FRF-based multi-class classifier that assesses the risk of developing diabetic retinopathy in diabetic patients. The results show that mcFuzzy-LORE outperforms prior classical LORE-based methods in the generation of counterfactual instances.
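To make the LORE-style explanation pipeline the abstract describes more concrete, the sketch below illustrates the classical (non-fuzzy) LORE idea that mcFuzzy-LORE builds on: sample a local neighbourhood around the instance to be explained, fit a simple interpretable surrogate to the black box's predictions there, then read off a factual rule and a counterfactual neighbour. The `black_box` function, feature values, and the one-split surrogate are illustrative assumptions, not the paper's fuzzy decision trees or its retinopathy model.

```python
# Sketch of a classical LORE-style local explanation.
# The black-box model below is a hypothetical stand-in for an opaque
# classifier (e.g., a random forest); it is NOT the paper's model.
import random

random.seed(0)

def black_box(x):
    # Hypothetical opaque classifier: class 1 if a weighted score
    # of the two features exceeds a threshold.
    return 1 if 0.7 * x[0] + 0.3 * x[1] > 5.0 else 0

def neighbourhood(x, n=150, scale=2.0):
    # Step 1: Gaussian perturbations around the instance to explain.
    return [[xi + random.gauss(0, scale) for xi in x] for _ in range(n)]

def fit_stump(samples, labels):
    # Step 2: fit the simplest interpretable surrogate, a single
    # (feature, threshold) split, by coarse exhaustive search for the
    # split that best mimics the black box on the neighbourhood.
    best = (0.0, 0, 0.0)
    for f in range(len(samples[0])):
        lo = min(s[f] for s in samples)
        hi = max(s[f] for s in samples)
        for i in range(1, 20):
            t = lo + (hi - lo) * i / 20
            acc = sum((s[f] > t) == (y == 1)
                      for s, y in zip(samples, labels)) / len(samples)
            if acc > best[0]:
                best = (acc, f, t)
    return best  # (fidelity to the black box, feature index, threshold)

x = [6.0, 4.0]                      # instance whose prediction we explain
nbrs = neighbourhood(x)
ys = [black_box(s) for s in nbrs]
fidelity, f, t = fit_stump(nbrs, ys)

# Step 3a: factual rule describing the local decision.
rule = f"feature[{f}] > {t:.2f} -> class {black_box(x)}"

# Step 3b: counterfactual = closest neighbour the black box labels
# differently from the instance being explained.
cf = min((s for s, y in zip(nbrs, ys) if y != black_box(x)),
         key=lambda s: sum((a - b) ** 2 for a, b in zip(s, x)))
```

mcFuzzy-LORE replaces the crisp surrogate in step 2 with a fuzzy decision tree, so the extracted rules use fuzzy linguistic terms rather than hard thresholds; the overall neighbourhood-sampling and counterfactual-extraction structure is the same.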