Editorial Material

Deep Learning Algorithms for Interpretation of Upper Extremity Radiographs: Laterality and Technologist Initial Labels as Confounding Factors

Journal

AMERICAN JOURNAL OF ROENTGENOLOGY
Volume 218, Issue 4, Pages 714-715

Publisher

AMER ROENTGEN RAY SOC
DOI: 10.2214/AJR.21.26882



Convolutional neural networks (CNNs) trained to identify abnormalities on upper extremity radiographs achieved an AUC of 0.844, with a frequent emphasis on radiograph laterality and/or technologist labels for decision-making. Covering the labels increased the AUC to 0.857 (p = .02) and redirected CNN attention from the labels to the bones. Using images of radiograph labels alone, the AUC was 0.638, indicating that radiograph labels are associated with abnormal examinations. Potential radiographic confounding features should be considered when curating data for radiology CNN development.
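The label-covering intervention described above can be sketched as a simple preprocessing step: zero out the image region containing the burned-in laterality/technologist marker before the radiograph is passed to the CNN. This is a minimal illustration, not the authors' actual pipeline; the region coordinates and array shapes below are hypothetical.

```python
import numpy as np

def cover_label_region(image, box):
    """Return a copy of `image` with the (row0, row1, col0, col1) region
    set to zero, hiding a burned-in laterality/technologist label.
    The coordinates are hypothetical; real label positions vary per study."""
    r0, r1, c0, c1 = box
    out = image.copy()
    out[r0:r1, c0:c1] = 0
    return out

# Toy 2D "radiograph" with a bright simulated label patch in one corner.
img = np.full((64, 64), 100, dtype=np.uint8)
img[2:10, 50:62] = 255  # simulated technologist label
masked = cover_label_region(img, (2, 10, 50, 62))
```

In practice such masking would be applied consistently to both training and evaluation images, so that the model cannot exploit the label as a shortcut feature.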
