Proceedings Paper

EXPLAINABLE AI FOR COVID-19 CT CLASSIFIERS: AN INITIAL COMPARISON STUDY

Publisher

IEEE
DOI: 10.1109/CBMS52027.2021.00103

Keywords

COVID-19; Explainable AI; Deep Learning; Classification; CT


Deep learning and artificial intelligence have advanced rapidly across many industries, yet the decision-making processes of deep learning algorithms and AI tools remain opaque. Explainable AI (XAI) is a crucial key to unlocking this black box, enabling users to understand the goals and decision logic of AI models. This study proposes XAI strategies to support COVID-19 classification models so that clinicians can better understand their results.
Artificial Intelligence (AI) has advanced by leaps and bounds across industrial sectors, especially since the introduction of deep learning. Deep learning learns the behaviour of an entity by recognising and interpreting patterns. Despite its vast potential, how deep learning algorithms reach a decision in the first place remains a mystery. Explainable AI (XAI) is the key to unlocking the black box of deep learning. An XAI model is designed to explain its goals, logic, and decision making in terms its end users can understand. End users may be domain experts, regulatory agencies, managers and executive board members, data scientists, people who use AI with or without awareness, or anyone affected by an AI model's decisions. Chest CT has emerged as a valuable tool for the clinical diagnosis and treatment management of the lung diseases associated with COVID-19. AI can support rapid evaluation of CT scans to differentiate COVID-19 findings from other lung diseases. However, it is not clear how these AI tools or deep learning algorithms reach such decisions, or which features derived from these typically deep neural networks are most influential. The aim of this study is to propose and develop XAI strategies for COVID-19 classification models, together with a comparative investigation. The results demonstrate promising quantitative measures and qualitative visualisations that can further enhance clinicians' understanding and decision making with more granular information from the results given by the learned XAI models.
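The abstract does not name the specific XAI techniques compared in the paper. Purely as an illustrative sketch of the kind of visualisation it describes, the following shows occlusion sensitivity, a common model-agnostic way to highlight influential image regions; the classifier here is a hypothetical toy stand-in, not the paper's model:

```python
import numpy as np

def toy_classifier(image):
    # Hypothetical stand-in for a trained COVID-19 CT classifier: its
    # "class score" is the mean intensity of a fixed central region,
    # so occluding that region lowers the score.
    return float(image[8:16, 8:16].mean())

def occlusion_heatmap(image, score_fn, patch=4):
    """Slide a zero-valued patch over the image and record how much the
    class score drops at each position (bigger drop = more influential)."""
    base = score_fn(image)
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            occluded = image.copy()
            occluded[i:i + patch, j:j + patch] = 0.0
            heat[i // patch, j // patch] = base - score_fn(occluded)
    return heat

img = np.ones((24, 24))          # dummy "CT slice"
heat = occlusion_heatmap(img, toy_classifier)
# The highest-drop patches coincide with the region the toy model relies on.
print(np.unravel_index(heat.argmax(), heat.shape))
```

Overlaying such a heatmap on the original scan gives the qualitative, region-level view of model behaviour that the abstract refers to; gradient-based methods produce comparable maps more efficiently for deep networks.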

