4.7 Article

Learning Hierarchical Attention for Weakly-Supervised Chest X-Ray Abnormality Localization and Diagnosis

Journal

IEEE TRANSACTIONS ON MEDICAL IMAGING
Volume 40, Issue 10, Pages 2698-2710

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TMI.2020.3042773

Keywords

Location awareness; Annotations; Task analysis; X-ray imaging; Visualization; Diseases; Image analysis; Weakly supervised; abnormality localization; explainability; hierarchical attention

Funding

  1. National Key Research and Development Program of China [2018YFC0116400]
  2. Shanghai Strategic Emerging Industries from Shanghai Municipal Development and Reform Commission [20191211]
  3. Science and Technology Commission of Shanghai Municipality (STCSM) [19QC1400600]

Abstract

We consider the problem of abnormality localization for clinical applications. While deep learning has driven much recent progress in medical imaging, many clinical challenges are not fully addressed, limiting its broader usage. While recent methods report high diagnostic accuracies, physicians have concerns trusting these algorithm results for diagnostic decision-making purposes because of a general lack of algorithm decision reasoning and interpretability. One potential way to address this problem is to further train these models to localize abnormalities in addition to just classifying them. However, doing this accurately will require a large amount of disease localization annotations by clinical experts, a task that is prohibitively expensive to accomplish for most applications. In this work, we take a step towards addressing these issues by means of a new attention-driven weakly supervised algorithm comprising a hierarchical attention mining framework that unifies activation- and gradient-based visual attention in a holistic manner. Our key algorithmic innovations include the design of explicit ordinal attention constraints, enabling principled model training in a weakly-supervised fashion, while also facilitating the generation of visual-attention-driven model explanations by means of localization cues. On two large-scale chest X-ray datasets (NIH ChestX-ray14 and CheXpert), we demonstrate significant localization performance improvements over the current state of the art while also achieving competitive classification performance.
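The abstract describes unifying activation- and gradient-based visual attention for weakly supervised localization. As a rough, generic illustration only (not the authors' hierarchical attention mining formulation), a gradient-weighted class activation map in the Grad-CAM style can be computed from a network's last convolutional feature maps and the gradients of the class score with respect to them:

```python
import numpy as np

def grad_cam(activations, gradients):
    """Generic Grad-CAM-style attention map (illustrative sketch only).

    activations: (K, H, W) feature maps from the last conv layer
    gradients:   (K, H, W) gradients of the class score w.r.t. those maps
    """
    # Channel weights: global average pooling of the gradients
    weights = gradients.mean(axis=(1, 2))                              # (K,)
    # Weighted sum of feature maps; ReLU keeps only positive evidence
    cam = np.maximum((weights[:, None, None] * activations).sum(axis=0), 0)
    # Normalize to [0, 1] so the map can serve as a localization cue
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with random feature maps and gradients (shapes are arbitrary)
rng = np.random.default_rng(0)
acts = rng.random((8, 7, 7))
grads = rng.standard_normal((8, 7, 7))
heatmap = grad_cam(acts, grads)
print(heatmap.shape)  # (7, 7)
```

The resulting heatmap is the kind of localization cue the paper's weakly supervised setting relies on: it is derived from image-level labels alone, with no bounding-box annotations.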

