Article

A survey on XAI and natural language explanations

Journal

INFORMATION PROCESSING & MANAGEMENT
Volume 60, Issue 1

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.ipm.2022.103111

Keywords

Explainable AI; Natural language explanations; Presentation methods


The field of explainable artificial intelligence (XAI) has gained increasing importance in recent years. As a consequence, several surveys have been published to explore the current state of the art on this topic. One aspect that seems to be overlooked by these works is the applied presentation methods and, specifically, the role of natural language in generating the final explanations. This survey reviews 70 XAI papers published between 2006 and 2021 and evaluates their readiness with respect to natural language explanations. To this end, we define a multi-criteria decision-making model based on a set of hierarchical criteria. Finally, we conclude that only a handful of recent XAI works either considered natural language explanations to approach final users (see, e.g., Bennetot et al., 2021) or implemented a method capable of generating such explanations.

