Journal
MAGNETIC RESONANCE IN MEDICINE
Volume 84, Issue 2, Pages 663-685
Publisher
WILEY
DOI: 10.1002/mrm.28148
Keywords
accelerated MRI; compressive sensing; deep learning; image reconstruction; transfer learning
Funding
- Marie Curie Actions Career Integration grant [PCIG13-GA-2013-618101]
- European Molecular Biology Organization Installation grant [IG 3028]
- TUBA GEBIP fellowship
- TUBITAK 1001 grant [118E256]
- BAGEP fellowship
- NVIDIA Corporation
Purpose: Neural networks have received recent interest for reconstruction of undersampled MR acquisitions. Ideally, network performance should be optimized by drawing the training and testing data from the same domain. In practice, however, large datasets comprising hundreds of subjects scanned under a common protocol are rare. The goal of this study is to introduce a transfer-learning approach to address the problem of data scarcity in training deep networks for accelerated MRI.
Methods: Neural networks were trained on thousands of samples (up to 4,000) from public datasets of either natural images or brain MR images. The networks were then fine-tuned using only tens of brain MR images in a distinct testing domain. Domain-transferred networks were compared to networks trained directly in the testing domain. Network performance was evaluated for varying acceleration factors (4-10), numbers of training samples (0.5k-4k), and numbers of fine-tuning samples (0-100).
Results: The proposed approach achieves successful domain transfer between MR images acquired with different contrasts (T1- and T2-weighted images) and between natural and MR images (ImageNet and T1- or T2-weighted images). Networks obtained via transfer learning using only tens of images in the testing domain achieve nearly identical performance to networks trained directly in the testing domain using thousands of images (up to 4,000).
Conclusion: The proposed approach might facilitate the use of neural networks for MRI reconstruction without the need for collection of extensive imaging datasets.
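The pretrain-then-fine-tune strategy described in Methods can be sketched with a toy linear reconstruction problem. Everything below is illustrative and not from the paper: the linear map stands in for a deep reconstruction network, and the per-coordinate scale change stands in for a contrast difference between domains (e.g., T1- vs. T2-weighted).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for accelerated MRI: recover an image x from
# undersampled measurements y = A x with a learned linear map W.
d, m = 16, 8
A = rng.normal(size=(m, d)) / np.sqrt(d)   # fixed "undersampling" operator

def make_domain(n, scales):
    # Each domain draws images with a different per-coordinate scale,
    # mimicking a contrast change between training and testing domains.
    x = rng.normal(size=(n, d)) * scales
    return x @ A.T, x                       # measurements, ground truth

def train(W, y, x, steps, lr):
    # Plain gradient descent on the reconstruction MSE ||y W - x||^2.
    for _ in range(steps):
        W = W - lr * (y.T @ (y @ W - x)) / len(y)
    return W

src_scales = np.ones(d)                     # source domain
tgt_scales = np.linspace(0.2, 2.0, d)       # distinct testing domain

# Pretrain on thousands of source-domain samples...
ys, xs = make_domain(4000, src_scales)
W_src = train(np.zeros((m, d)), ys, xs, steps=300, lr=0.2)

# ...then fine-tune the same weights on only tens of target-domain samples.
yt, xt = make_domain(40, tgt_scales)
W_ft = train(W_src, yt, xt, steps=50, lr=0.05)

mse = lambda W, y, x: float(np.mean((y @ W - x) ** 2))
print("target-set MSE, pretrained only:", round(mse(W_src, yt, xt), 4))
print("target-set MSE, after fine-tune:", round(mse(W_ft, yt, xt), 4))
```

A real implementation would fine-tune all layers of a deep network with a small learning rate rather than a linear map; the sketch only illustrates the core idea that weights pretrained in one domain can be adapted with far fewer samples than training from scratch would require.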