Journal
BIOMEDICAL SIGNAL PROCESSING AND CONTROL
Volume 79
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.bspc.2022.104144
Keywords
Drug recommendation; Explainability; Traceability; Omic data
This research analyzes the importance of explainable AI drug recommendation and proposes the traceability rate as an evaluation metric; experiments with this metric reveal a trade-off between model performance and explainability.
The application of Artificial Intelligence (AI) to cancer drug recommendation can advance the development of personalized cancer therapy. However, most current AI drug recommendation models cannot provide explainable inferences: their prediction procedures are black boxes, which makes it difficult to earn the trust of doctors and patients. With explainable inference, the key steps of the recommendation procedure can be located easily, facilitating model adjustment after wrong predictions and model generalization to new drugs/samples. In this paper, we analyze the necessity of developing explainable AI drug recommendation and propose an evaluation metric called the traceability rate. The traceability rate is calculated as the proportion, among all ground truths, of correct predictions that are traceable along the knowledge graph. We further conduct an experiment on a benchmark drug response dataset using the traceability rate as the evaluation metric; the results show a trade-off between model performance and explainability. Therefore, explainable AI drug recommendation still demands further improvement to meet the requirements of clinical personalized therapy.
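The metric as defined above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the `traceable` predicate (whether a sample/drug prediction can be traced along the knowledge graph) and the dictionary-based data layout are assumptions for the sake of the example.

```python
def traceability_rate(predictions, ground_truths, traceable):
    """Traceability rate: the proportion, among all ground truths, of
    correct predictions that are traceable along the knowledge graph.

    predictions:   dict mapping sample -> recommended drug
    ground_truths: dict mapping sample -> true effective drug
    traceable:     callable (sample, drug) -> bool, a hypothetical
                   predicate for knowledge-graph traceability
    """
    correct_and_traceable = sum(
        1
        for sample, true_drug in ground_truths.items()
        if predictions.get(sample) == true_drug and traceable(sample, true_drug)
    )
    return correct_and_traceable / len(ground_truths)


# Toy usage: three ground truths, two correct predictions,
# but only one of the correct ones is traceable -> rate = 1/3.
preds = {"s1": "drugA", "s2": "drugB", "s3": "drugC"}
truth = {"s1": "drugA", "s2": "drugB", "s3": "drugX"}
is_traceable = lambda sample, drug: sample != "s2"
print(traceability_rate(preds, truth, is_traceable))  # 0.333...
```

Note that, under this definition, a model can be accurate yet score poorly on traceability, which is exactly the performance/explainability trade-off the experiment reports.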