Ultrasound (US)-guided prostate brachytherapy is a minimally invasive form of cancer treatment during which a needle is used to insert radioactive seeds into the prostate at pre-planned positions. Interaction with the needle can deform the prostate, which can lead to inaccuracy in seed placement. Virtual reality (VR) simulation could provide a way for surgical residents to practice compensating for these deformations. To facilitate such a tool, we have developed a hybrid deformable model that combines ChainMail distance constraints with mass-spring physics to provide realistic, yet customizable deformations. Displacements generated by the model were used to warp a baseline US image to simulate an acquired US sequence. The algorithm was evaluated using a gelatin phantom with a Young's modulus approximately equal to that of the prostate (60 kPa). A 2D US movie was acquired while the phantom underwent needle insertion, and inter-frame displacements were calculated using normalized cross correlation. The hybrid model was used to simulate the same needle insertion, and the two sets of displacements were compared on a frame-by-frame basis. The average per-pixel displacement error was 0.210 mm. A simulation rate of 100 frames per second was achieved using a 1000-element triangular mesh while warping a 300x400 pixel US image on a 1.1 GHz AMD Athlon computer with 1 GB of RAM and an ATI Radeon 9800 Pro graphics card. These results show that this new deformable model can provide an accurate solution to the problem of simulating real-time prostate brachytherapy.
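The hybrid idea above, combining soft mass-spring forces with hard ChainMail-style distance bounds, can be illustrated with a simplified 2D sketch. This is not the authors' implementation; the spring constant, damping, time step, and link-length ratio bounds (`min_ratio`, `max_ratio`) are illustrative assumptions.

```python
import numpy as np

def hybrid_step(pos, vel, springs, rest_len, k=50.0, damping=0.9,
                dt=0.01, min_ratio=0.8, max_ratio=1.2):
    """One time step of a toy hybrid deformable model (2D).

    pos, vel  : (n, 2) arrays of node positions and velocities
    springs   : list of (i, j) node-index pairs
    rest_len  : rest length of each spring

    Phase 1 applies linear (Hookean) mass-spring forces; phase 2 imposes
    ChainMail-like hard constraints by clamping each link's stretch ratio.
    """
    # Phase 1: accumulate spring forces and integrate (explicit Euler).
    forces = np.zeros_like(pos)
    for (i, j), L0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d)
        f = k * (dist - L0) * d / (dist + 1e-12)  # force along the link
        forces[i] += f
        forces[j] -= f
    vel = damping * (vel + dt * forces)
    pos = pos + dt * vel

    # Phase 2: ChainMail-style distance constraint -- if a link is
    # stretched or compressed beyond its allowed ratio, project both
    # endpoints back so the link length lands exactly on the bound.
    for (i, j), L0 in zip(springs, rest_len):
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d)
        ratio = dist / L0
        if ratio < min_ratio or ratio > max_ratio:
            target = np.clip(ratio, min_ratio, max_ratio) * L0
            corr = 0.5 * (dist - target) * d / (dist + 1e-12)
            pos[i] += corr
            pos[j] -= corr
    return pos, vel
```

The constraint phase is what keeps the deformation plausible even when a fast-moving needle node would otherwise overstretch the springs; the mass-spring phase supplies the tunable, material-like response described in the abstract.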
We have implemented two hardware-accelerated Thin Plate Spline (TPS) warping algorithms. The first is a hardware-software approach (HW-TPS) that uses OpenGL Vertex Shaders to perform a grid warp. The second is a GPU-based approach (GPU-TPS) that uses the OpenGL Shading Language to perform all warping calculations on the GPU. Comparison with a software TPS algorithm was used to gauge the speed and quality of both hardware algorithms. Quality was analyzed visually and using the Sum of Absolute Difference (SAD) similarity metric. Warping was performed using 92 user-defined displacement vectors for 512x512x173 serial lung CT studies, matching normal-breathing and deep-inspiration scans. On a 2.2 GHz Xeon machine with an ATI Radeon 9800XT GPU, the GPU-TPS required 26.1 seconds to perform a per-voxel warp compared to 148.2 seconds for the software algorithm. The HW-TPS needed 1.63 seconds to warp the same study, while the GPU-TPS required 1.94 seconds and the software grid transform required 22.8 seconds. The SAD values calculated between the outputs of each algorithm and the target CT volume were 15.2%, 15.4% and 15.5% for the HW-TPS, GPU-TPS and both software algorithms, respectively. The computing power of ubiquitous 3D graphics cards can be exploited in medical image processing to provide order-of-magnitude acceleration of nonlinear warping algorithms without sacrificing output quality.
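The TPS transform at the core of all three algorithms can be sketched in a few lines: fit radial-basis weights from the control-point displacements, then evaluate the interpolant at every output coordinate (the per-voxel step that the GPU versions accelerate). This is a minimal 2D CPU sketch, not the paper's shader code; function names and the regularization-free linear solve are illustrative assumptions.

```python
import numpy as np

def tps_fit(src, dst):
    """Solve for 2D TPS parameters mapping src control points to dst.

    Uses the kernel U(r) = r^2 log r, written as d2 * log(d2) / 2 with
    d2 = r^2, plus an affine part. Returns an (n + 3, 2) parameter array.
    """
    n = len(src)
    d2 = np.sum((src[:, None] - src[None, :]) ** 2, axis=-1)
    K = np.where(d2 > 0, d2 * np.log(np.maximum(d2, 1e-300)) / 2, 0.0)
    P = np.hstack([np.ones((n, 1)), src])        # affine basis [1, x, y]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n] = K
    A[:n, n:] = P
    A[n:, :n] = P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return np.linalg.solve(A, b)

def tps_apply(params, src, pts):
    """Evaluate the fitted TPS at arbitrary 2D points (one row per point)."""
    d2 = np.sum((pts[:, None] - src[None, :]) ** 2, axis=-1)
    U = np.where(d2 > 0, d2 * np.log(np.maximum(d2, 1e-300)) / 2, 0.0)
    P = np.hstack([np.ones((len(pts), 1)), pts])
    return U @ params[:len(src)] + P @ params[len(src):]
```

Evaluating `tps_apply` independently at every voxel is what makes the per-voxel warp embarrassingly parallel, and hence a natural fit for a fragment-shader implementation; the grid-warp variant instead evaluates the spline only at mesh vertices and lets the rasterizer interpolate between them.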