4.5 Article

Deep Learning for Real-time, Automatic, and Scanner-adapted Prostate (Zone) Segmentation of Transrectal Ultrasound, for Example, Magnetic Resonance Imaging–Transrectal Ultrasound Fusion Prostate Biopsy

Journal

EUROPEAN UROLOGY FOCUS
Volume 7, Issue 1, Pages 78-85

Publisher

ELSEVIER
DOI: 10.1016/j.euf.2019.04.009

Keywords

Deep learning; Prostate cancer; Segmentation; Ultrasound; Magnetic resonance imaging–transrectal ultrasound fusion biopsy

Funding

  1. Dutch Cancer Society [UVA2013-5941]
  2. Massimo Mischi: European Research Council Starting Grant [280209]
  3. Astellas Pharma Netherlands B.V.
  4. European Research Council (ERC) [280209]


This study utilized deep learning to achieve automatic, real-time prostate segmentation on TRUS images, demonstrating high accuracy and robust performance across different ultrasound scanners. The algorithm's self-assessment of its segmentation performance showed a strong correlation with its actual performance, enabling swift identification of potential errors.
Background: Although recent advances in multiparametric magnetic resonance imaging (MRI) have led to an increase in MRI–transrectal ultrasound (TRUS) fusion prostate biopsies, these are time consuming, laborious, and costly. Introduction of a deep-learning approach could improve prostate segmentation.

Objective: To exploit deep learning to perform automatic, real-time prostate (zone) segmentation on TRUS images from different scanners.

Design, setting, and participants: Three datasets with TRUS images were collected at different institutions, using an iU22 (Philips Healthcare, Bothell, WA, USA), a Pro Focus 2202a (BK Medical), and an Aixplorer (SuperSonic Imagine, Aix-en-Provence, France) ultrasound scanner. The datasets contained 436 images from 181 men.

Outcome measurements and statistical analysis: Manual delineations from an expert panel were used as ground truth. The (zonal) segmentation performance was evaluated in terms of pixel-wise accuracy, Jaccard index, and Hausdorff distance.

Results and limitations: The developed deep-learning approach was demonstrated to significantly improve prostate segmentation compared with a conventional automated technique, reaching a median accuracy of 98% (95% confidence interval 95-99%), a Jaccard index of 0.93 (0.80-0.96), and a Hausdorff distance of 3.0 (1.3-8.7) mm. Zonal segmentation yielded pixel-wise accuracies of 97% (95-99%) and 98% (96-99%) for the peripheral and transition zones, respectively. Supervised domain adaptation retained this high performance when the method was applied to images from different ultrasound scanners (p > 0.05). Moreover, the algorithm's assessment of its own segmentation performance showed a strong correlation with the actual segmentation performance (Pearson's correlation 0.72, p < 0.001), indicating that possible incorrect segmentations can be identified swiftly.

Conclusions: Fusion-guided prostate biopsies, targeting suspicious lesions on MRI using TRUS, are increasingly performed. The requirement for (semi)manual prostate delineation places a substantial burden on clinicians. Deep learning provides a means for fast and accurate (zonal) prostate segmentation of TRUS images that translates to different scanners.
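The outcome measures named above lend themselves to a compact illustration. The following is a minimal sketch, assuming NumPy and SciPy, of how pixel-wise accuracy, the Jaccard index, and the Hausdorff distance could be computed for a single predicted versus ground-truth binary prostate mask; the function names, toy masks, and pixel spacing are hypothetical, and this is not the authors' evaluation code.

```python
# Illustrative sketch only (not the authors' code): computing the three
# reported segmentation metrics for one predicted vs. ground-truth mask.
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial.distance import directed_hausdorff


def pixel_accuracy(pred: np.ndarray, gt: np.ndarray) -> float:
    """Fraction of pixels on which prediction and ground truth agree."""
    return float((pred == gt).mean())


def jaccard_index(pred: np.ndarray, gt: np.ndarray) -> float:
    """Intersection over union of the two binary masks."""
    union = np.logical_or(pred, gt).sum()
    if union == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return float(np.logical_and(pred, gt).sum() / union)


def hausdorff_mm(pred: np.ndarray, gt: np.ndarray, spacing_mm: float) -> float:
    """Symmetric Hausdorff distance between mask contours, in millimetres."""
    pred_edge = np.argwhere(pred & ~binary_erosion(pred))
    gt_edge = np.argwhere(gt & ~binary_erosion(gt))
    d = max(directed_hausdorff(pred_edge, gt_edge)[0],
            directed_hausdorff(gt_edge, pred_edge)[0])
    return d * spacing_mm


if __name__ == "__main__":
    # Toy 64 x 64 masks standing in for a TRUS slice and its delineation.
    gt = np.zeros((64, 64), dtype=bool)
    gt[16:48, 16:48] = True
    pred = np.zeros_like(gt)
    pred[18:48, 16:46] = True

    print(f"pixel-wise accuracy: {pixel_accuracy(pred, gt):.3f}")
    print(f"Jaccard index      : {jaccard_index(pred, gt):.3f}")
    print(f"Hausdorff distance : {hausdorff_mm(pred, gt, spacing_mm=0.2):.2f} mm")
```

The Hausdorff distance is computed on the mask contours (obtained by subtracting a binary erosion from each mask), which is the usual convention for boundary-error metrics; the assumed pixel spacing of 0.2 mm is purely illustrative.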

