Article

Measuring mammographic density: comparing a fully automated volumetric assessment versus European radiologists' qualitative classification

Journal

EUROPEAN RADIOLOGY
Volume 26, Issue 12, Pages 4354-4360

Publisher

SPRINGER
DOI: 10.1007/s00330-016-4309-3

Keywords

Mammography; Screening; Diagnostic imaging; Breast cancer; Women's health

Funding

  1. Volpara
  2. Siemens AG (Erlangen, Germany)
  3. Siemens
  4. National Health Services, Research Foundation
  5. Swedish Cancer Society

Abstract

Breast Imaging-Reporting and Data System (BI-RADS) mammographic density categories are associated with considerable interobserver variability. Automated methods of measuring volumetric breast density may reduce this variability and be valuable for risk and mammographic screening stratification. Our objective was to assess the agreement of mammographic density measured by a volumetric method with the radiologists' classification. Eight thousand seven hundred and eighty-two examinations from the Malmö Breast Tomosynthesis Screening Trial were classified according to BI-RADS, 4th Edition. Volumetric breast density was assessed using automated software for 8433 examinations. Agreement between volumetric breast density and BI-RADS was analyzed descriptively. Agreement between radiologists, and between categorical volumetric density and BI-RADS, was calculated, yielding kappa values. The observed agreement between BI-RADS scores of different radiologists was 80.9 % [kappa 0.77 (0.76-0.79)]. A spread of volumetric breast density was seen within each BI-RADS category. The observed agreement between categorical volumetric density and BI-RADS scores was 57.1 % [kappa 0.55 (0.53-0.56)]. There was moderate agreement between volumetric density and BI-RADS scores from European radiologists, indicating that radiologists evaluate mammographic density differently from the software. The automated method may be a robust and valuable tool; however, differences in interpretation between radiologists and software require further investigation.

Key Points

• Agreement between qualitative and software density measurements has not been frequently studied.
• There was substantial agreement between different radiologists' qualitative density assessments.
• There was moderate agreement between software and radiologists' density assessments.
• Differences in interpretation between software and radiologists require further investigation.
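
The agreement figures above are reported as raw observed agreement together with Cohen's kappa, which corrects observed agreement for the agreement expected by chance. The sketch below illustrates how such statistics can be computed for two readers (e.g. radiologist vs. software) assigning the four BI-RADS density categories. The function names and toy ratings are hypothetical; this is not the study's analysis code, and the confidence intervals reported in the paper are not derived here.

import numpy as np

def observed_agreement(r1, r2):
    # Proportion of examinations on which the two readers give the same category.
    r1, r2 = np.asarray(r1), np.asarray(r2)
    return float(np.mean(r1 == r2))

def cohens_kappa(r1, r2, categories=(1, 2, 3, 4)):
    # Unweighted Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_e is the
    # agreement expected if the two readers rated independently.
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_o = float(np.mean(r1 == r2))
    p_e = sum(float(np.mean(r1 == c)) * float(np.mean(r2 == c)) for c in categories)
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical toy ratings (BI-RADS density categories 1-4), not study data.
radiologist = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]
software    = [1, 2, 3, 3, 4, 2, 2, 1, 4, 4]

print("observed agreement:", round(observed_agreement(radiologist, software), 3))
print("Cohen's kappa:     ", round(cohens_kappa(radiologist, software), 3))

The same unweighted kappa can also be obtained with sklearn.metrics.cohen_kappa_score if scikit-learn is available.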
