Article

Spatial registration of serial microscopic brain images to three-dimensional reference atlases with the QuickNII tool

Journal

PLOS ONE
Volume 14, Issue 5, Article e0216796

Publisher

Public Library of Science (PLOS)
DOI: 10.1371/journal.pone.0216796

Funding

  1. European Union's Horizon 2020 Research and Innovation Programme [720270, 785907]
  2. Research Council of Norway [269774]

Abstract

Modern high-throughput, brain-wide profiling techniques for cells and their morphology, connectivity, and other properties make the use of reference atlases with 3D coordinate frameworks essential. However, the anatomical location of observations made in microscopic sectional images from rodent brains is typically determined by comparison with 2D anatomical reference atlases. A major challenge in this regard is that microscopic sections are often cut with orientations that deviate from the standard planes used in the reference atlases, resulting in inaccuracies and a need for tedious correction steps. Efficient tools for registration of large series of section images to reference atlases are currently not widely available. Here we present QuickNII, a stand-alone software tool for semi-automated affine spatial registration of sectional image data to a 3D reference atlas coordinate framework. A key feature of the tool is the ability to generate user-defined cut planes through the reference atlas that match the orientation of the cut plane of the sectional image data. The reference atlas is transformed to match anatomical landmarks in the corresponding experimental images; in this way, the spatial relationship between experimental image and atlas is defined without introducing distortions in the original experimental images. After anchoring of a limited number of sections containing key landmarks, transformations are propagated across the entire series of sectional images to reduce the number of manual steps required. With coordinates assigned to the experimental images, further analysis of the distribution of features extracted from the images is greatly facilitated. An illustrative sketch of this kind of mapping is given below.
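
The registration described in the abstract amounts to an affine mapping between 2D section pixels and 3D atlas coordinates. The following Python code is a minimal sketch of that general technique, not QuickNII's implementation: it assumes the cut plane is parameterized by an origin vector and two in-plane axis vectors given in atlas voxel coordinates, and the function names (section_to_atlas, cut_plane) and the toy atlas volume are hypothetical.

    # Minimal sketch (not the QuickNII implementation): sample an arbitrarily
    # oriented cut plane from a 3D reference-atlas volume and map 2D section
    # pixels into 3D atlas coordinates with a simple affine parameterization.
    # The plane is described by an origin `o` and two in-plane vectors `u`, `v`
    # (hypothetical names), all expressed in atlas voxel coordinates.
    import numpy as np
    from scipy import ndimage


    def section_to_atlas(o, u, v, width, height):
        """Return a (height, width, 3) array of atlas coordinates, one per pixel.

        Pixel (col, row) maps to o + (col / width) * u + (row / height) * v,
        i.e. an affine mapping of the 2D section onto a plane in the atlas.
        """
        cols = np.arange(width) / width
        rows = np.arange(height) / height
        cc, rr = np.meshgrid(cols, rows)          # both of shape (height, width)
        coords = (o[None, None, :]
                  + cc[..., None] * u[None, None, :]
                  + rr[..., None] * v[None, None, :])
        return coords


    def cut_plane(atlas, o, u, v, width, height, order=0):
        """Sample the atlas volume along the user-defined cut plane.

        order=0 (nearest neighbour) suits label volumes; use order=1 for
        grey-scale template volumes.
        """
        coords = section_to_atlas(np.asarray(o, float),
                                  np.asarray(u, float),
                                  np.asarray(v, float),
                                  width, height)
        # map_coordinates expects coordinates with shape (3, ...)
        return ndimage.map_coordinates(atlas, coords.transpose(2, 0, 1),
                                       order=order, mode="nearest")


    if __name__ == "__main__":
        # Toy 3D "atlas": a labelled cube inside a 100**3 voxel volume.
        atlas = np.zeros((100, 100, 100), dtype=np.int32)
        atlas[40:60, 40:60, 40:60] = 1

        # An oblique, roughly coronal plane, tilted around the vertical axis.
        o = np.array([50.0, 0.0, 0.0])    # atlas coordinate of the top-left pixel
        u = np.array([10.0, 0.0, 99.0])   # direction along image columns
        v = np.array([0.0, 99.0, 0.0])    # direction along image rows

        slice_img = cut_plane(atlas, o, u, v, width=200, height=200)
        print(slice_img.shape, slice_img.max())

In this parameterization, changing the two in-plane vectors tilts the virtual cut plane to follow an obliquely cut section, while the same affine mapping assigns a 3D atlas coordinate to every pixel of the experimental image without resampling or distorting the image itself.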
