Article

Retinal Lesion Detection With Deep Learning Using Image Patches

Journal

Investigative Ophthalmology & Visual Science

Publisher

ASSOC RESEARCH VISION OPHTHALMOLOGY INC
DOI: 10.1167/iovs.17-22721

Keywords

deep learning; detection; retina; machine learning; computer vision

Funding

  1. National Cancer Institute, National Institutes of Health [U01CA142555, U01CA190214, U01CA187947]
  2. National Library of Medicine [T15LM007033]

Abstract

PURPOSE. To develop an automated method of localizing and discerning multiple types of findings in retinal images using a limited set of training data, without hard-coded feature extraction, as a step toward generalizing these methods to rare-disease detection, in which only a limited number of training examples are available.

METHODS. Two ophthalmologists verified 243 retinal images from the Kaggle dataset, labeling important subsections of each image to generate 1324 image patches containing hemorrhages, microaneurysms, exudates, retinal neovascularization, or normal-appearing structures. These image patches were used to train a single standard convolutional neural network to predict the presence of these five classes. A sliding-window method was then used to generate probability maps across the entire image.

RESULTS. The method was validated on the e-Ophtha dataset of 148 whole retinal images for microaneurysms and 47 for exudates. A pixel-wise area under the receiver operating characteristic curve of 0.94 and 0.95, as well as a lesion-wise area under the precision-recall curve of 0.86 and 0.64, was achieved for microaneurysms and exudates, respectively.

CONCLUSIONS. Regionally trained convolutional neural networks can generate lesion-specific probability maps able to detect and distinguish between subtle pathologic lesions with only a few hundred training examples per lesion.
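The patch-classification and sliding-window steps described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch example, assuming a 32x32 patch, an 8-pixel stride, and a small two-layer CNN; the names `PatchCNN` and `probability_maps` are illustrative, and the authors' actual architecture, patch size, and stride are not specified in this record.

```python
# Minimal sketch of patch-based lesion classification with a sliding-window
# probability map. Assumes 32x32 patches and a small CNN; this is not the
# paper's exact architecture or training setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

CLASSES = ["hemorrhage", "microaneurysm", "exudate", "neovascularization", "normal"]

class PatchCNN(nn.Module):
    """Small CNN that classifies a retinal image patch into one of five classes."""
    def __init__(self, n_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

@torch.no_grad()
def probability_maps(model: PatchCNN, image: torch.Tensor, patch: int = 32, stride: int = 8):
    """Slide a window over a (3, H, W) image and return per-class probability maps.

    Each map entry holds the class probabilities predicted for the patch whose
    top-left corner sits at that (row, col) grid position.
    """
    model.eval()
    _, H, W = image.shape
    rows = (H - patch) // stride + 1
    cols = (W - patch) // stride + 1
    maps = torch.zeros(len(CLASSES), rows, cols)
    for i in range(rows):
        for j in range(cols):
            y, x = i * stride, j * stride
            window = image[:, y:y + patch, x:x + patch].unsqueeze(0)
            probs = F.softmax(model(window), dim=1).squeeze(0)
            maps[:, i, j] = probs
    return maps

if __name__ == "__main__":
    model = PatchCNN()
    fundus = torch.rand(3, 256, 256)   # stand-in for a preprocessed retinal image
    maps = probability_maps(model, fundus)
    print(maps.shape)                  # torch.Size([5, 29, 29])
```

In this setup, each of the five output maps (one per lesion class plus normal) can be upsampled or thresholded to localize candidate lesions, which is the role the probability maps play in the abstract's pixel-wise and lesion-wise evaluations.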

