Article

Distributed deep learning networks among institutions for medical imaging

Publisher

OXFORD UNIV PRESS
DOI: 10.1093/jamia/ocy017

Keywords

deep learning; neural networks; distributed learning; medical imaging

Funding

  1. National Institutes of Health Blueprint for Neuroscience Research [T90DA022759/R90DA023427]
  2. National Institute of Biomedical Imaging and Bioengineering, National Institutes of Health [P41EB015896]
  3. National Institutes of Health [U01CA154601, U24CA180927, U24CA180918, U01CA190214, U01CA187947]

Abstract

Objective: Deep learning has become a promising approach for automated support of clinical diagnosis. When medical data samples are limited, collaboration among multiple institutions is necessary to achieve high algorithm performance. However, sharing patient data is often constrained by technical, legal, or ethical concerns. In this study, we propose methods of distributing deep learning models as an attractive alternative to sharing patient data.

Methods: We simulate the distribution of deep learning models across 4 institutions using various training heuristics and compare the results with a deep learning model trained on centrally hosted patient data. The training heuristics investigated include ensembling single-institution models, single weight transfer, and cyclical weight transfer. We evaluated these approaches for image classification in 3 independent image collections (retinal fundus photos, mammography, and ImageNet).

Results: We found that cyclical weight transfer resulted in performance comparable to that of a model trained on centrally hosted patient data. We also found that the performance of the cyclical weight transfer heuristic improved with a higher frequency of weight transfer.

Conclusions: We show that distributing deep learning models is an effective alternative to sharing patient data. This finding has implications for any collaborative deep learning study.
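The cyclical weight transfer heuristic described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation: the four simulated institutions, the synthetic data loaders, the toy model, and the epoch counts are all assumptions for illustration. A single model is handed from institution to institution, trained locally for a small number of epochs on data that never leaves that institution, and the cycle repeats; the number of local epochs per visit controls the transfer frequency.

```python
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-ins for each institution's privately held data.
# In the study, these would be image collections (e.g. retinal fundus photos,
# mammography) held separately at 4 institutions.
def make_institution_loader(seed, n=256, dim=32, batch_size=32):
    g = torch.Generator().manual_seed(seed)
    x = torch.randn(n, dim, generator=g)
    y = (x.sum(dim=1) > 0).long()
    return DataLoader(TensorDataset(x, y), batch_size=batch_size, shuffle=True)

institution_loaders = [make_institution_loader(seed) for seed in range(4)]

# Toy classifier standing in for the deep learning model being distributed.
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
criterion = nn.CrossEntropyLoss()

def train_locally(model, loader, epochs=1, lr=1e-3):
    """Train the shared model on one institution's local data, then hand it onward."""
    opt = optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            opt.step()
    return model

# Cyclical weight transfer: the model visits each institution in turn for a
# small number of local epochs, and the cycle repeats. Fewer epochs per visit
# means more frequent weight transfers between institutions.
num_cycles = 10
epochs_per_visit = 1  # transfer-frequency knob
for cycle in range(num_cycles):
    for loader in institution_loaders:
        model = train_locally(model, loader, epochs=epochs_per_visit)
```

Only the model weights move between sites in this scheme; patient data stays local, which is the property that makes the approach an alternative to centrally hosting the data.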
