Article

A survey on XAI and natural language explanations

Journal

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.ipm.2022.103111

Keywords

Explainable AI; Natural language explanations; Presentation methods


The field of explainable artificial intelligence (XAI) has gained increasing importance in recent years. As a consequence, several surveys have been published to explore the current state of the art on this topic. One aspect that these works seem to overlook is the applied presentation methods and, specifically, the role of natural language in generating the final explanations. This survey reviews 70 XAI papers published between 2006 and 2021 and evaluates their readiness with respect to natural language explanations. To this end, we define a multi-criteria decision-making model based on a set of hierarchical criteria. Finally, we conclude that only a handful of recent XAI works either considered natural language explanations to approach final users (see, e.g., (Bennetot et al., 2021)) or implemented a method capable of generating such explanations.
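The survey's multi-criteria decision-making model is not detailed in this abstract; as a rough illustration of the general idea, the sketch below scores a paper against a set of criteria with a weighted sum. The criterion names and weights are purely hypothetical assumptions, not the authors' actual model.

```python
# Hedged sketch of weighted-sum multi-criteria scoring.
# NOTE: the criteria and weights below are illustrative assumptions;
# the survey's real hierarchical criteria are defined in the paper itself.

def score_paper(ratings, weights):
    """Aggregate per-criterion ratings (0..1) into one score via a weighted sum."""
    total_weight = sum(weights.values())
    return sum(weights[c] * ratings.get(c, 0.0) for c in weights) / total_weight

# Hypothetical criteria for a paper's "natural language readiness":
weights = {
    "targets_end_users": 0.4,      # explanations aimed at non-expert users
    "generates_nl_text": 0.4,      # method actually produces natural language
    "evaluates_with_humans": 0.2,  # user-facing evaluation reported
}

paper = {"targets_end_users": 1.0, "generates_nl_text": 0.5}
print(round(score_paper(paper, weights), 2))
```

A hierarchical variant would first aggregate sub-criteria into their parent criterion's score and then repeat the same weighted sum one level up.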
