As part of an ongoing theme in our laboratory on reducing morbidity during minimally invasive intracardiac
procedures, we developed a computer-assisted intervention system that provides safe access inside the beating
heart and sufficient visualization to deliver therapy to intracardiac targets while maintaining the efficacy of the
procedure. Integrating pre-operative information, 2D trans-esophageal ultrasound for real-time intra-operative
imaging, and surgical tool tracking using the NDI Aurora magnetic tracking system in an augmented virtual
environment, our system allows surgeons to navigate instruments inside the heart despite the lack of
direct target visualization. This work focuses on further enhancing intracardiac visualization and navigation by
supplying the surgeons with detailed 3D dynamic cardiac models constructed from high-resolution pre-operative
MR data and overlaid onto the intra-operative imaging environment. Here we report our experience during an in
vivo porcine study. A feature-based registration technique previously explored and validated in our laboratory
was employed for the pre-operative to intra-operative mapping. This registration method is suitable for in
vivo interventional applications as it involves the selection of easily identifiable landmarks, while ensuring a good
alignment of the pre-operative and intra-operative surgical targets. The resulting augmented reality environment
fuses the pre-operative cardiac model with the intra-operative real-time US images with approximately 5 mm
accuracy for structures located in the vicinity of the valvular region. We therefore believe that our
augmented virtual environment significantly enhances intracardiac navigation of surgical instruments, while
detailed on-target manipulations are performed under real-time US guidance.
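The abstract does not detail the feature-based registration itself, but a landmark-driven pre-operative to intra-operative mapping of this kind is commonly computed as a paired-point rigid registration. The following is a minimal sketch, assuming corresponding anatomical landmarks have already been picked in both spaces; the function names and the SVD-based solver (Arun et al.'s least-squares method) are illustrative, not the authors' implementation.

```python
import numpy as np

def rigid_landmark_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst.

    src, dst : (N, 3) arrays of corresponding landmark coordinates,
    e.g. pre-operative MR points and intra-operative tracked points.
    Uses the SVD-based closed-form solution of Arun et al.
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)          # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def fiducial_registration_error(src, dst, R, t):
    """RMS residual distance (same units as input, e.g. mm) after alignment."""
    residuals = (R @ np.asarray(src, float).T).T + t - np.asarray(dst, float)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

The fiducial registration error computed here is the usual proxy reported alongside target accuracy figures such as the ~5 mm quoted for the valvular region.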
Minimally invasive surgery of the beating heart can be associated with two major limitations: selecting port locations for optimal target coverage from x-rays and angiograms, and navigating instruments in a dynamic and confined 3D environment using only an endoscope. To supplement current surgery planning and guidance strategies, we continue developing VCSP, a virtual-reality, patient-specific model of the thoracic cavity derived from 3D pre-procedural images. In this work, we apply elastic image registration to 4D cardiac images to model the dynamic heart. Our method is validated on two imaging modalities and on different parts of the cardiac anatomy. In a helical CT dataset of an excised heart phantom, we found that the artificial motion of the epicardial surface can be extracted to within 0.93 ± 0.33 mm. For an MR dataset of a human volunteer, the error for different heart structures, such as the myocardium, right and left atria, right ventricle, aorta, vena cava, and pulmonary artery, ranged from 1.08 ± 0.18 mm to 1.14 ± 0.22 mm. These results indicate that our method of modeling the motion of the heart is not only easily adaptable but also sufficiently accurate to meet the requirements for reliable cardiac surgery training, planning, and guidance.
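The per-structure millimetre errors quoted above are typically obtained by measuring, for each anatomical surface, the distance from every registered point to the nearest point of the ground-truth surface and summarizing as mean ± standard deviation. A small sketch of that validation metric, assuming both surfaces are available as point clouds (the function name is illustrative, not from the paper):

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_distance_stats(registered_pts, reference_pts):
    """Mean and std of nearest-neighbour distances (input units, e.g. mm)
    from each registered surface point to the reference surface.

    registered_pts : (N, 3) points sampled from the elastically warped surface
                     of one structure (myocardium, atria, aorta, ...).
    reference_pts  : (M, 3) points sampled from the ground-truth surface
                     at the same cardiac phase.
    """
    tree = cKDTree(np.asarray(reference_pts, float))   # fast NN lookup
    d, _ = tree.query(np.asarray(registered_pts, float))
    return float(d.mean()), float(d.std())
```

Computed per structure and per phase, this yields figures directly comparable to the 0.93 ± 0.33 mm (CT phantom) and 1.08–1.14 mm (volunteer MR) results reported above.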
While most currently available minimally invasive robotically assisted cardiac surgical systems do not employ 3D image guidance, such support can be generated using pre-operative images such as CT. Previously, we demonstrated a virtual model of the thorax with simulated surgical instruments, and a pulsating virtual model of the coronary arteries. In this paper we report the overlay of optical endoscopic images of a beating heart phantom with CT-based dynamic volumetric images of the phantom. Spatial matching is obtained through optical tracking of the endoscope and of the phantom, while temporal synchronization of the model display is achieved through ECG gating. The spatial accuracy between the optical and virtual images varies from about 0.8 mm to -2.6 mm, while the time discrepancy depends on the frame rate at which the virtual model is refreshed, and is typically 50-100 ms. Although the CT-based dynamic images are sufficient for animation of the model, artefacts associated with the image registration prevent seamless animation. Instead, to reconstruct the various phases of heart pulsation, we used a high-quality semi-static image of the diastolic phase of the phantom, and warped it to match the CT-based images corresponding to other phases of the heart pulsation.
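The ECG-gated display described above amounts to mapping the current time within the cardiac cycle to one of the pre-reconstructed model phases. A minimal sketch of that lookup, assuming R-peak timestamps are streamed from the ECG; the function and parameter names are hypothetical, and the real system's refresh logic (the source of the quoted 50-100 ms discrepancy) is not reproduced here:

```python
import bisect

def gated_phase_index(now_s, r_peaks_s, n_phases):
    """Select which reconstructed cardiac phase to render at time now_s.

    now_s     : current time in seconds.
    r_peaks_s : sorted timestamps (s) of detected ECG R-peaks.
    n_phases  : number of phases in the dynamic CT-based model.

    The elapsed fraction of the current R-R interval (estimated from the
    last two observed R-peaks) indexes into the phase sequence.
    """
    i = bisect.bisect_right(r_peaks_s, now_s) - 1
    if i < 1:
        return 0  # fewer than two beats observed: show the first phase
    rr = r_peaks_s[i] - r_peaks_s[i - 1]    # duration of last full R-R interval
    frac = (now_s - r_peaks_s[i]) / rr      # fraction of current cycle elapsed
    return min(int(frac * n_phases), n_phases - 1)
```

Because the displayed phase only advances when the renderer refreshes, the temporal error of such a scheme is bounded by roughly one frame interval, consistent with the 50-100 ms figure reported.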