A non-rigid registration method is presented for aligning pre-procedural magnetic resonance (MR) images, with delineated suspicious regions, to intra-procedural 3D transrectal ultrasound (TRUS) images in TRUS-guided prostate biopsy. In the first step, the 3D MR and TRUS images are aligned rigidly using six pairs of manually identified approximate matching points on the boundary of the prostate. The two image volumes are then registered non-rigidly using a finite element method (FEM)-based linear elastic deformation model. A vector of observation prediction errors at selected points of interest within the prostate volume is computed using an intensity-based similarity metric, the modality independent neighborhood descriptor (MIND). This error vector is employed in a classical state estimation framework to estimate the prostate deformation between the MR and TRUS images. The points of interest are identified in the MR images using speeded-up robust features (SURF), which are scale- and rotation-invariant descriptors. Applied to 10 sets of prostate MR and TRUS images, the proposed method yielded target registration errors of 1.99±0.83 mm and 1.97±0.87 mm in the peripheral zone (PZ) and whole gland (WG), respectively, using 68 manually identified fiducial points. The Dice similarity coefficient (DSC) was 87.9±2.9, 82.3±4.8, 93.0±1.7, and 84.2±6.2 percent for the WG, apex, mid-gland, and base of the prostate, respectively. Moreover, the mean absolute distance (MAD) between the WG surfaces in the TRUS and registered MR images was 1.6±0.3 mm. These results indicate the effectiveness of the proposed method in improving targeting accuracy in TRUS-guided prostate biopsy.
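The initial rigid alignment from six manually identified boundary point pairs can be obtained with a standard least-squares rigid fit. The abstract does not specify the solver, so the Kabsch/Procrustes sketch below is only an illustration under that assumption; the landmark coordinates are synthetic.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (Kabsch/Procrustes) mapping points P -> Q.
    P, Q: (N, 3) arrays of corresponding landmark coordinates."""
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)              # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # guard against reflection
    t = qc - R @ pc
    return R, t

# six synthetic "boundary point" pairs related by a known rotation + translation
rng = np.random.default_rng(0)
P = rng.uniform(0.0, 50.0, size=(6, 3))
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([2.0, -1.0, 3.0])
Q = P @ R_true.T + t_true

R, t = rigid_fit(P, Q)
print(np.allclose(P @ R.T + t, Q))  # exact recovery for noise-free points
```

With noisy, approximate correspondences (as in the manual selection described above), the same fit minimizes the sum of squared residuals rather than recovering the transform exactly.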
Magnetic resonance imaging (MRI) is increasingly used for image-guided biopsy and focal therapy of prostate cancer. A combined rigid and deformable registration technique is proposed to register pre-treatment diagnostic 3T magnetic resonance (MR) images, with the identified target tumor(s), to intra-treatment 1.5T MR images. The pre-treatment 3T images are acquired with patients in a strictly supine position using an endorectal coil, while the 1.5T images are obtained intra-operatively, just before insertion of the ablation needle, with patients in the lithotomy position. An intensity-based registration routine rigidly aligns the two images, with the transformation parameters initialized using three pairs of manually selected approximate corresponding points. The rigid registration is followed by a deformable registration algorithm employing a generic dynamic linear elastic deformation model discretized by the finite element method (FEM). The model is used in a classical state estimation framework to estimate the deformation of the prostate based on a similarity metric between the pre- and intra-treatment images. Registration results on 10 sets of prostate MR images showed that the proposed method can significantly improve registration accuracy, in terms of target registration error (TRE), for all prostate substructures. After deformable registration, the root mean square (RMS) TRE of 46 manually identified fiducial points was 2.40±1.20 mm, 2.51±1.20 mm, and 2.28±1.22 mm for the whole gland (WG), central gland (CG), and peripheral zone (PZ), respectively. These values are improved from 3.15±1.60 mm, 3.09±1.50 mm, and 3.20±1.73 mm for the WG, CG, and PZ, respectively, obtained after rigid registration. Registration results are also evaluated using the Dice similarity coefficient (DSC), mean absolute surface distance (MAD), and maximum absolute surface distance (MAXD) of the WG and CG in the prostate images.
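The evaluation metrics quoted above follow standard definitions. The exact evaluation code is not given in the source; the sketch below shows one common formulation of the RMS TRE over corresponding fiducials and the DSC over binary segmentation masks, on toy data.

```python
import numpy as np

def rms_tre(p_fixed, p_registered):
    """Root-mean-square target registration error between corresponding
    fiducial points of shape (N, 3), in the units of the coordinates (mm)."""
    d = np.linalg.norm(p_fixed - p_registered, axis=1)
    return float(np.sqrt(np.mean(d ** 2)))

def dice(mask_a, mask_b):
    """Dice similarity coefficient 2|A∩B| / (|A| + |B|) for boolean masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

# toy example: three fiducials displaced by 3 mm, 4 mm, and 0 mm
fixed = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.]])
moved = fixed + np.array([[3., 0., 0.], [0., 4., 0.], [0., 0., 0.]])
print(rms_tre(fixed, moved))   # sqrt(25/3) ≈ 2.89 mm

# toy example: two half-overlapping 2x4 slabs of a 4x4 mask
a = np.zeros((4, 4), bool); a[:2] = True
b = np.zeros((4, 4), bool); b[1:3] = True
print(dice(a, b))              # 0.5
```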
Real-time registration of pre-operative magnetic resonance (MR) or computed tomography (CT) images with intra-operative ultrasound (US) images can be a valuable tool in image-guided therapies and interventions. This paper presents an automatic method for dynamically tracking the deformation of soft tissue by registering pre-operative three-dimensional (3D) MR images to intra-operative two-dimensional (2D) US images. The registration algorithm is based on concepts from state estimation, in which a dynamic finite element (FE)-based linear elastic deformation model correlates the imaging data in the spatial and temporal domains. A Kalman-like filtering process estimates the unknown deformation states of the soft tissue using the deformation model and a measure of error between the predicted and observed intra-operative imaging data. The error is computed using an intensity-based distance metric, the modality independent neighborhood descriptor (MIND), so no segmentation or feature extraction from the images is required. The performance of the proposed method is evaluated by dynamically deforming 3D pre-operative MR images of a breast tissue phantom based on real-time
2D images acquired with a US probe. Experimental results on different registration scenarios showed that the deformation tracking converges within a few iterations. The average target registration error on the plane of the 2D US images, for manually selected fiducial points, was between 0.3 and 1.5 mm depending on the magnitude of the deformation.
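The Kalman-like filtering loop shared by the methods above can be sketched generically. Everything in this snippet is an illustrative assumption: `A` stands in for the FE model's state-transition matrix, `K` for a fixed filter gain, and `observe_error` for the MIND-derived prediction-error vector, which is replaced here by a synthetic linear observation so the loop is runnable.

```python
import numpy as np

def track_deformation(A, K, observe_error, x0, iters=50, tol=1e-6):
    """Kalman-like deformation-tracking sketch.
    A: state-transition matrix of the linear elastic FE model (n x n)
    K: fixed filter gain (n x m); a full filter would update this each step
    observe_error: callable returning the m-vector of observation
                   prediction errors (image-derived in the real method)
    x0: initial estimate of the nodal-displacement state (n,)"""
    x = x0.copy()
    for _ in range(iters):
        x_pred = A @ x                 # model-based prediction
        e = observe_error(x_pred)      # innovation from the image metric
        x_new = x_pred + K @ e         # correction step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# synthetic test: static model (A = I), direct observation of the residual
x_true = np.array([1.0, -2.0])
x_hat = track_deformation(np.eye(2), 0.5 * np.eye(2),
                          lambda xp: x_true - xp, np.zeros(2))
print(x_hat)  # converges geometrically toward x_true
```

With the contractive gain chosen here, each iteration halves the estimation error, mirroring the fast convergence reported in the experiments above; the real innovation is, of course, nonlinear in the displacement state.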