Multi-modality alignment of CT and ultrasound adds value to diagnostic examinations, as well as to treatment planning and the execution of various clinical procedures. In particular, automatic image-based alignment of such data is challenging, mostly because the two modalities have very different imaging physics and characteristics. We present a method for dense-field deformable registration of CT and 3D ultrasound. Compared to global (rigid) alignment, this is more difficult to solve, because modality-specific differences in local anatomic appearance can result in incorrect displacements. We use a simulation of ultrasonic effects based on CT information, taking the current estimate of the deformation field into account to properly address orientation-dependent imaging artifacts. This is combined with a robust multi-channel local similarity metric, driving a variational registration framework. Because of the high computational demand, an efficient GPU-based implementation is used. Preliminary results are shown on data from a number of hepatic cancer patients. To our knowledge, this is the first time that a non-linear mapping of CT and 3D B-mode ultrasound has been established in a fast, robust and fully automatic manner.