Article

A comparison of publicly available linear MRI stereotaxic registration techniques

Journal

NEUROIMAGE
Volume 174, Pages 191-200

Publisher

ACADEMIC PRESS INC ELSEVIER SCIENCE
DOI: 10.1016/j.neuroimage.2018.03.025

Keywords

MRI; Linear registration; Quality control

Funding

  1. National Institutes of Health [U01 AG024904]
  2. National Institute on Aging
  3. National Institute of Biomedical Imaging and Bioengineering
  4. Michael J Fox Foundation
  5. AbbVie
  6. Avid Radiopharmaceuticals
  7. Biogen
  8. Bristol-Myers Squibb
  9. Covance
  10. GE Healthcare
  11. Genentech
  12. GlaxoSmithKline (GSK)
  13. Eli Lilly and Company
  14. Lundbeck
  15. Merck
  16. Meso Scale Discovery (MSD)
  17. Pfizer
  18. Piramal Imaging
  19. Roche
  20. Servier
  21. UCB
  22. NIH Blueprint for Neuroscience Research [1U54MH091657]
  23. McDonnell Center for Systems Neuroscience at Washington University
  24. McGill University
  25. Fonds de Recherche du Quebec-Sante
  26. Douglas Hospital Research Centre and Foundation
  27. Government of Canada
  28. Canadian Foundation for Innovation
  29. Levesque Foundation
  30. Famille Louise Andre Charron

Introduction: Linear registration to a standard space is one of the major steps in processing and analyzing magnetic resonance images (MRIs) of the brain. Here we present an overview of linear stereotaxic MRI registration and compare the performance of publicly available and extensively used linear registration techniques in medical image analysis.

Methods: A set of 9693 T1-weighted MR images was obtained for testing from 4 datasets: ADNI, PREVENT-AD, PPMI, and HCP, two of which have multi-center and multi-scanner data and three of which have longitudinal data. Each individual native image was linearly registered to the MNI ICBM152 average template using five versions of MRITOTAL from the MINC tools, FLIRT from FSL, two versions of Elastix, spm_affreg from SPM, and the ANTs linear registration technique. Quality control (QC) images were generated from the registered volumes and viewed by an expert rater to assess the quality of the registrations. Each QC image contained 60 sub-images (20 each of axial, sagittal, and coronal views at different levels throughout the brain) overlaid with contours of the ICBM152 template, enabling the expert rater to label the registration as acceptable or unacceptable. The performance of the registration techniques was then compared across the datasets. In addition, the effect of image noise, intensity non-uniformity, age, head size, and atrophy on the performance of the techniques was investigated by comparing age, scaling factor, ventricle volume, brain volume, and white matter hyperintensity (WMH) volume between passed and failed cases for each method.

Results: The average registration failure rates across all datasets were 27.41%, 27.14%, 12.74%, 13.03%, and 0.44% for the five versions of the MRITOTAL techniques, 8.87% for ANTs, 11.11% for FSL, 12.35% for Elastix Affine, 24.40% for Elastix Similarity, and 30.66% for SPM. There were significant effects of signal-to-noise ratio and image intensity non-uniformity estimates, as well as of age, head size, and atrophy-related changes, between passed and failed registrations.

Conclusion: Our experiments show that Revised BestLinReg had the best performance among the evaluated registration techniques, while all techniques performed worse for images with higher levels of noise and non-uniformity as well as atrophy-related changes.
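The QC procedure described above samples 20 axial, 20 sagittal, and 20 coronal slices at different levels through the registered volume and tiles them into one mosaic for visual rating against the ICBM152 contours. The slice-sampling step can be sketched as follows; this is a minimal illustration, not the authors' code, and the function name, the choice of the middle 80% of each axis as the sampling range, and the randomly generated volume with ICBM152-like dimensions are all assumptions for the example (the template-contour overlay is omitted):

```python
import numpy as np

def qc_slice_grid(volume, n_per_axis=20):
    """Extract evenly spaced slices along each of the three anatomical
    axes for a QC mosaic (3 x n_per_axis = 60 sub-images total).

    Sampling range (middle 80% of each axis) is an illustrative choice,
    not taken from the paper.
    """
    slices = []
    for axis in range(3):  # 0: sagittal, 1: coronal, 2: axial
        n = volume.shape[axis]
        # pick n_per_axis levels spread through the central portion of the volume
        levels = np.linspace(0.1 * n, 0.9 * n, n_per_axis).astype(int)
        for k in levels:
            slices.append(np.take(volume, k, axis=axis))
    return slices

# Stand-in for a volume registered to ICBM152 space (1 mm grid dimensions)
vol = np.random.rand(193, 229, 193)
grid = qc_slice_grid(vol)
print(len(grid))  # 60 sub-images, one per mosaic panel
```

In practice each 2D slice would be rendered with the ICBM152 template contours drawn on top, so the rater can judge alignment of the brain boundary and ventricles at a glance.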
