Image-guided surgery (IGS) allows clinicians to view the current intra-operative scene superimposed on preoperative
images (typically MRI or CT scans). IGS systems rely on localization systems to track surgical tools and visualize them
overlaid on preoperative patient images during surgery. The most commonly used localization systems in the
operating room (OR) are optical tracking systems (OTS), owing to their ease of use and cost effectiveness. However,
OTSs suffer from the major drawback of requiring an unobstructed line of sight. State-space approaches based on
different implementations of the Kalman filter have recently been investigated to compensate for brief line-of-sight
occlusions. However, the parameterizations of rigid-body orientation proposed so far suffer from singularities at certain
rotation angles. The purpose of this work is to develop a quaternion-based Unscented Kalman Filter (UKF) for
robust optical tracking of both the position and orientation of surgical tools, in order to compensate for marker occlusion.
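As a minimal sketch of the singularity-free quaternion parameterization (the constant-angular-velocity process model and all function names below are illustrative assumptions, not taken from the paper), the orientation state can be propagated through a rotation increment and renormalized, which avoids the gimbal-lock singularities of Euler-angle parameterizations:

```python
import numpy as np

def quat_mult(q, r):
    # Hamilton product of two quaternions in [w, x, y, z] convention.
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def propagate(q, omega, dt):
    # Illustrative constant-angular-velocity model: rotate q by omega*dt.
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q / np.linalg.norm(q)
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    q_new = quat_mult(q, dq)
    # Renormalize so the state stays on the unit-quaternion manifold.
    return q_new / np.linalg.norm(q_new)
```

Any rotation angle is representable this way without a singular configuration; the renormalization step is what keeps the estimate a valid orientation after each prediction.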
This paper presents preliminary results towards a Kalman-based Sensor Management Engine (SME) that will
filter and fuse multimodal tracking data streams. This work was motivated by our experience with robot-based
applications for keyhole neurosurgery (the ROBOCAST project). The algorithm was evaluated using real data from an
NDI Polaris tracker. The results show that our estimation technique is able to compensate for marker occlusion with a
maximum error of 2.5° in orientation and 2.36 mm in position. The proposed approach will be useful in crowded
state-of-the-art ORs, where maintaining continuous visibility of all tracked objects is difficult.
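The occlusion-compensation idea can be illustrated with a standard Kalman filter skeleton (a linear filter is shown here for brevity; the paper's method is an unscented variant, and all names and matrices below are illustrative assumptions): while markers are visible the filter alternates predict and update steps, and during a dropout it runs predict-only, coasting on the motion model.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    # Time-update step; during marker occlusion only this step runs,
    # so the pose estimate coasts on the process model F.
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    # Measurement update; applied only when the optical tracker
    # reports a valid (unoccluded) marker measurement z.
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

For example, with a 1-D constant-velocity state [position, velocity], two predict-only steps during an occlusion advance the position by the estimated velocity, and the next valid measurement pulls the estimate back toward the observed value.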