3.8 Proceedings Paper

Classification of Bacterial and Viral Childhood Pneumonia Using Deep Learning in Chest Radiography

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/3195588.3195597

Keywords

CAD system; deep learning; chest X-rays; fully convolutional networks; convolutional neural networks

Funding

  1. National Natural Science Foundation of China [61273043]
  2. Natural Science Foundation of Guangdong
  3. Fundamental Research Funds for the Central Universities of China


Over the past decades, computer-aided diagnosis (CAD) systems have been investigated for the detection of lung diseases in chest X-ray images. Motivated by the great success of deep learning, in this work we propose a novel CAD system to identify bacterial and viral pneumonia in chest radiography. The method consists of two parts: lung region identification and pneumonia category classification. First, the left and right lung regions are segmented and extracted with a fully convolutional network (FCN) model, trained and tested on the open Japanese Society of Radiological Technology (JSRT, 241 images) and Montgomery County (MC, 138 images) datasets. After segmentation, a deep convolutional neural network (DCNN) model is used to classify the target lung regions. Features of the target lung regions are then extracted automatically by the DCNN model, and their performance is compared with that of manual features. Finally, the DCNN features and manual features are fused and fed into a support vector machine (SVM) classifier for binary classification. The proposed method is evaluated on a dataset from the Guangzhou Women and Children's Medical Center, China, comprising 4,513 pediatric patients in total, aged 1 to 9 years, collected between 2003 and 2017. Performance is measured by several criteria: accuracy, precision, sensitivity, specificity, and area under the curve (AUC), the last of which serves as a comprehensive criterion. The experimental results showed better accuracy (0.8048 +/- 0.0202) and sensitivity (0.7755 +/- 0.0296) when features were extracted by the DCNN with transfer learning. AUC values ranged from 0.6937 to 0.8234, and an ensemble of the different kinds of features slightly improved the AUC from 0.8160 +/- 0.0162 to 0.8234 +/- 0.0014.
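
The final stage described in the abstract fuses the DCNN features with hand-crafted features and classifies the result with an SVM. The snippet below is a minimal sketch of that fusion-and-classification step only, assuming scikit-learn and precomputed feature matrices; the array names, feature dimensions, and train/test split are illustrative placeholders, not values taken from the paper.

    # Sketch of the feature-fusion + SVM stage (assumptions: scikit-learn,
    # DCNN and hand-crafted features already extracted per segmented lung region).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score, roc_auc_score

    # Placeholder feature matrices: one row per chest X-ray (dimensions are illustrative).
    n_samples = 1000
    dcnn_features = np.random.rand(n_samples, 512)    # e.g. penultimate-layer DCNN activations
    manual_features = np.random.rand(n_samples, 32)   # e.g. texture/shape descriptors
    labels = np.random.randint(0, 2, n_samples)       # binary task: bacterial vs. viral

    # Fuse the two feature types by concatenation.
    fused = np.hstack([dcnn_features, manual_features])

    X_train, X_test, y_train, y_test = train_test_split(
        fused, labels, test_size=0.2, stratify=labels, random_state=0)

    # Standardize the fused features, then train an SVM with probability outputs for AUC.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X_train, y_train)

    pred = clf.predict(X_test)
    prob = clf.predict_proba(X_test)[:, 1]
    print("accuracy:", accuracy_score(y_test, pred))
    print("AUC:", roc_auc_score(y_test, prob))

With real data, the placeholder arrays would be replaced by the DCNN activations (transfer learning) and the manual descriptors computed from the FCN-segmented lung regions, and the reported criteria (accuracy, precision, sensitivity, specificity, AUC) would be computed on the held-out set.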
