Article

DeepLesion: automated mining of large-scale lesion annotations and universal lesion detection with deep learning

Journal

Journal of Medical Imaging
Volume 5, Issue 3, Article 036501

Publisher

SPIE - Society of Photo-Optical Instrumentation Engineers
DOI: 10.1117/1.JMI.5.3.036501

Keywords

medical image dataset; lesion detection; convolutional neural network; deep learning; picture archiving and communication system; bookmark

Funding

  1. Intramural Research Program of the National Institutes of Health, Clinical Center

Abstract

Extracting, harvesting, and building large-scale annotated radiological image datasets is an important yet challenging problem. Meanwhile, vast amounts of clinical annotations have been collected and stored in hospitals' picture archiving and communication systems (PACS). These annotations, also known as bookmarks in PACS, are usually marked by radiologists during their daily workflow to highlight significant image findings that may serve as references for later studies. We propose to mine and harvest these abundant retrospective medical data to build a large-scale lesion image dataset. Our process is scalable and requires minimal manual annotation effort. We mine bookmarks at our institute to develop DeepLesion, a dataset with 32,735 lesions in 32,120 CT slices from 10,594 studies of 4,427 unique patients. The dataset contains a variety of lesion types, such as lung nodules, liver tumors, and enlarged lymph nodes, and has the potential to be used in various medical image applications. Using DeepLesion, we train a universal lesion detector that can find all types of lesions with one unified framework. In this challenging task, the proposed lesion detector achieves a sensitivity of 81.1% with five false positives per image. (C) 2018 Society of Photo-Optical Instrumentation Engineers (SPIE)
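The sensitivity figure quoted above is an operating point on a free-response (FROC-style) curve: the fraction of ground-truth lesions detected at a score threshold that yields a fixed number of false positives per image. The sketch below illustrates how such a point can be computed from per-image detections; it is not the authors' evaluation code, and the box format and the IoU > 0.5 matching criterion are assumptions not stated in the abstract.

```python
# Minimal sketch: sensitivity at a fixed number of false positives per image.
# Assumptions (not from the abstract): detections are axis-aligned boxes
# (x1, y1, x2, y2) with confidence scores, and a detection is a true positive
# when its IoU with a previously unmatched ground-truth lesion exceeds 0.5.

from typing import Dict, List, Tuple

Box = Tuple[float, float, float, float]


def iou(a: Box, b: Box) -> float:
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0


def sensitivity_at_fp_rate(
    detections: Dict[str, List[Tuple[Box, float]]],  # image id -> [(box, score), ...]
    ground_truth: Dict[str, List[Box]],               # image id -> [box, ...]
    fps_per_image: float = 5.0,
    iou_thresh: float = 0.5,
) -> float:
    """Best sensitivity over score thresholds with <= fps_per_image false positives/image."""
    n_lesions = sum(len(v) for v in ground_truth.values())
    n_images = len(ground_truth)
    # Sweep candidate score thresholds (each distinct detection score).
    scores = sorted({s for dets in detections.values() for _, s in dets}, reverse=True)
    best_sens = 0.0
    for thresh in scores:
        tp, fp = 0, 0
        for img, gts in ground_truth.items():
            matched = [False] * len(gts)
            dets = [(b, s) for b, s in detections.get(img, []) if s >= thresh]
            dets.sort(key=lambda d: d[1], reverse=True)  # greedy matching by confidence
            for box, _ in dets:
                hit = next(
                    (i for i, gt in enumerate(gts)
                     if not matched[i] and iou(box, gt) > iou_thresh),
                    -1,
                )
                if hit >= 0:
                    matched[hit] = True
                    tp += 1
                else:
                    fp += 1
        if fp / n_images <= fps_per_image:
            best_sens = max(best_sens, tp / n_lesions)
    return best_sens
```

A reported value such as "81.1% sensitivity at five false positives per image" corresponds to calling `sensitivity_at_fp_rate(detections, ground_truth, fps_per_image=5.0)` on the full test set; the exact matching rule used in the paper may differ from this simplified greedy IoU scheme.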
