In this paper, we present an on-the-fly surgical support system that provides guidance using augmented reality and can be used in quasi-unprepared operating rooms. The proposed system builds upon a multi-modality marker and a simultaneous localization and mapping technique to co-calibrate an optical see-through head-mounted display to a C-arm fluoroscopy system. Annotations on the 2-D X-ray images can then be rendered as virtual objects in 3-D, providing surgical guidance. In a feasibility study on a semi-anthropomorphic phantom, we found the accuracy of our system to be comparable to the traditional image-guided technique while substantially reducing both the number of acquired X-ray images and procedure time. Our promising results encourage further research on the interaction between virtual and real objects, which we believe will directly benefit the proposed method. Further, we would like to explore the capabilities of our on-the-fly augmented reality support system in a larger study directed towards common orthopaedic interventions.
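The co-calibration described above can be illustrated as a chain of rigid transforms: because both devices observe the same multi-modal fiducial, the C-arm-to-HMD transform follows by composing each device's estimate of the marker pose. The sketch below is a minimal illustration under assumed, hypothetical pose values; variable names and numbers are not from the paper.

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous rigid transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical marker poses: the shared fiducial as observed by each device.
# In practice these would come from X-ray detection and SLAM-based tracking.
T_marker_in_carm = se3(np.eye(3), np.array([0.1, 0.0, 0.5]))
T_marker_in_hmd = se3(np.eye(3), np.array([-0.2, 0.05, 1.0]))

# Co-calibration: map C-arm coordinates into HMD coordinates via the marker.
T_carm_to_hmd = T_marker_in_hmd @ np.linalg.inv(T_marker_in_carm)

# An annotation point expressed in C-arm coordinates (homogeneous).
p_carm = np.array([0.1, 0.0, 0.6, 1.0])
p_hmd = T_carm_to_hmd @ p_carm  # where the HMD renders the virtual object
```

In the actual system the annotation originates on a 2-D X-ray image and is lifted to 3-D before this mapping; the sketch only shows the frame change that makes in-situ rendering possible.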
Sebastian Andress, Alex Johnson, Mathias Unberath, Alexander Winkler, Kevin Yu, Javad Fotouhi, Simon Weidert, Greg Osgood, Nassir Navab, "Technical note: on-the-fly augmented reality for orthopaedic surgery using a multi-modal fiducial," Proc. SPIE 10576, Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling, 105760H (13 March 2018).