3.8 Proceedings Paper

The application of deep learning framework in quantifying retinal structures on ophthalmic image in research eye-PACS

Publisher

SPIE-INT SOC OPTICAL ENGINEERING
DOI: 10.1117/12.2512458

Keywords

deep learning; ophthalmic photography; object detection; Mask R-CNN; Unet; instance segmentation

Funding

  1. National Key R&D Program of China [2018YFC1314902]
  2. National Natural Science Foundation of China [81501559, 81371663]
  3. Excellent Key Teachers in the Qing Lan Project of Jiangsu Colleges and Universities
  4. Graduate Research and Innovation Plan Project of Nantong University [YKC15056]


The rise of deep learning (DL) frameworks and their application to object recognition can benefit image-based medical diagnosis. Since the eye is believed to be a window into human health, applying DL to differentiate abnormal ophthalmic photographs (OPs) will greatly empower ophthalmologists and relieve their disease-screening workload. In our previous work, we employed ResNet-50 to construct a classification model for diabetic retinopathy (DR) within the PACS. In this study, we implemented the latest DL object detection and semantic segmentation frameworks to empower the eye-PACS. The Mask R-CNN framework was selected for object detection and instance segmentation of the optic disc (OD) and the macula. Furthermore, the U-Net framework was utilized for semantic segmentation of retinal vessel pixels from OPs. Both frameworks achieved state-of-the-art segmentation performance, and the segmented results were transmitted to the PACS as grayscale softcopy presentation state (GSPS) files. We also developed a prototype for quantitative OP analysis. We believe that applying DL frameworks to object recognition and analysis on OPs is meaningful and worth further investigation.
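The abstract reports segmentation performance for the Mask R-CNN and U-Net outputs but does not name its evaluation metric. A common choice for scoring a predicted binary segmentation mask (e.g. vessel pixels) against ground truth is the Dice coefficient; the sketch below is an illustration of that standard metric, not code from the paper:

```python
def dice_coefficient(pred, truth):
    """Dice = 2*|A∩B| / (|A| + |B|) for flat binary masks (lists of 0/1).

    Returns 1.0 when both masks are empty, the usual convention.
    """
    intersection = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy 4-pixel masks: two pixels agree as vessel, one false positive,
# one false negative -> Dice = 2*2 / (3 + 3) = 0.666...
pred  = [1, 1, 1, 0]
truth = [1, 1, 0, 1]
print(round(dice_coefficient(pred, truth), 3))
```

In practice the masks would be flattened model outputs thresholded to 0/1; the scalar score is then averaged over the test set.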

