Journal
ARTIFICIAL INTELLIGENCE IN MEDICINE
Volume 124, Issue -, Pages -
Publisher
ELSEVIER
DOI: 10.1016/j.artmed.2021.102158
Keywords
XAI; Black-box; Ethics; Challenges; Transparency; Autonomy
This article uses the analogy of the three Christmas ghosts to guide readers through the past, present, and future of medical AI. It highlights modern machine learning's reliance on opaque models and discusses the implications for transparency in healthcare. The article argues that opaque models lack quality assurance, fail to elicit trust, and hinder physician-patient dialogue, and it suggests that upholding transparency in model design and validation is necessary to ensure the success of medical AI.
Our title alludes to the three Christmas ghosts encountered by Ebenezer Scrooge in A Christmas Carol, who guide Ebenezer through the past, present, and future of Christmas holiday events. Similarly, our article takes readers through a journey of the past, present, and future of medical AI. In doing so, we focus on the crux of modern machine learning: the reliance on powerful but intrinsically opaque models. When applied to the healthcare domain, these models fail to meet the needs for transparency that their clinician and patient end-users require. We review the implications of this failure, and argue that opaque models (1) lack quality assurance, (2) fail to elicit trust, and (3) restrict physician-patient dialogue. We then discuss how upholding transparency in all aspects of model design and model validation can help ensure the reliability and success of medical AI.