TY - JOUR
T1 - Automatic Method to Assess Local CT–MR Imaging Registration Accuracy on Images of the Head
JF - American Journal of Neuroradiology
JO - Am. J. Neuroradiol.
SP - 137
LP - 144
VL - 26
IS - 1
AU - Ion P. I. Pappas
AU - Martin Styner
AU - Puja Malik
AU - Luca Remonda
AU - Marco Caversaccio
Y1 - 2005/01/01
UR - http://www.ajnr.org/content/26/1/137.abstract
N2 - BACKGROUND AND PURPOSE: Precise registration of CT and MR images is crucial in many clinical cases for proper diagnosis, decision making, or navigation in surgical interventions. Various algorithms can be used to register CT and MR datasets, but the result must be validated before clinical use. Evaluating the registration result by visual inspection is tiring and time-consuming. We propose a new automatic registration assessment method, which provides the user with a color-coded fused representation of the CT and MR images and indicates the location and extent of poor registration accuracy. METHODS: The method for local assessment of CT–MR registration is based on segmentation of bone structures in the CT and MR images, followed by a voxel correspondence analysis. The result is represented as a color-coded overlay. The algorithm was tested on simulated and real datasets with different levels of noise and intensity non-uniformity. RESULTS: Based on tests on simulated MR imaging data, the algorithm was found to be robust for noise levels up to 7% and intensity non-uniformities up to 20% of the full intensity scale. Because bone and cerebrospinal fluid cannot be clearly distinguished in the (T1-weighted) MR image, the algorithm was found to be optimistic, in the sense that a number of voxels are classified as well registered although they should not be. However, nearly all voxels classified as misregistered are correctly classified. CONCLUSION: The proposed algorithm offers a new way to automatically assess CT–MR image registration accuracy locally in all areas of the volume that contain bone and to represent the result as a user-friendly, intuitive color-coded overlay on the fused dataset.
ER -