Robot-assisted laparoscopic surgery is becoming an established technique for prostatectomy and is increasingly being explored for other types of cancer. Linking intraoperative imaging techniques, such as fluorescence guidance, with the three-dimensional insights provided by preoperative imaging remains a challenge. Navigation technologies may provide a solution, especially when directly linked to both the robotic setup and the fluorescence laparoscope. We evaluated the feasibility of such a setup. Preoperative single-photon emission computed tomography/X-ray computed tomography (SPECT/CT) or intraoperative freehand SPECT (fhSPECT) scans were used to navigate an optically tracked robot-integrated fluorescence laparoscope via an augmented-reality overlay in the laparoscopic video feed. Navigation accuracy was evaluated first in soft-tissue phantoms and then in a human-like torso phantom. SPECT/CT-based navigation yielded accuracies of 2.25 mm (coronal) and 2.08 mm (sagittal); fhSPECT-based navigation yielded 1.92 mm (coronal) and 2.83 mm (sagittal). All errors remained below the 1-cm detection limit of fluorescence imaging, allowing the navigation process to be refined using the fluorescence findings. These phantom experiments suggest that SPECT-based navigation of a robot-integrated fluorescence laparoscope is feasible and may aid fluorescence-guided surgery procedures.
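The augmented-reality overlay described above amounts to projecting a SPECT-derived target, expressed in the optical tracker's coordinate frame, into the image of the tracked laparoscope. A minimal sketch of that projection step, assuming a pinhole camera model; the frame and variable names (`T_tracker_cam`, `K`) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def project_target(T_tracker_cam, K, target_tracker):
    """Project a 3-D target (tracker frame) into the laparoscope image.

    T_tracker_cam  : 4x4 pose of the laparoscope camera in the tracker frame
                     (from optical tracking of the laparoscope)
    K              : 3x3 camera intrinsic matrix (from camera calibration)
    target_tracker : (3,) target position in the tracker frame (mm),
                     e.g. a registered SPECT/CT or fhSPECT hotspot

    Returns the (u, v) pixel for the overlay, or None if the target
    lies behind the camera.
    """
    p = np.append(target_tracker, 1.0)          # homogeneous coordinates
    p_cam = np.linalg.inv(T_tracker_cam) @ p    # target in the camera frame
    if p_cam[2] <= 0:
        return None                             # behind the image plane
    uv = K @ (p_cam[:3] / p_cam[2])             # pinhole projection
    return uv[:2]
```

With an identity camera pose and a target 100 mm in front of the lens, the target projects to the principal point; in practice the tracker-to-camera and SPECT-to-tracker registrations dominate the millimetre-level navigation errors reported above.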
Cutaneous T-cell lymphoma (CTCL) is a type of cancer that is externally characterized by alterations in skin coloring. Optical spectroscopy has been proposed for quantifying minimal changes in the skin, making it an interesting tool for real-time monitoring of CTCL. However, for the measurements to be valid, they must be taken on the lesions at the same position and with the same orientation in each session. By combining a hand-held optical spectroscopy device with tracking and synchronously acquiring spectral information together with position and orientation, we introduce a novel computer-assisted scheme for valid spectral quantification of disease progression. We further present an implementation of an augmented-reality guidance system that allows a previously analyzed point to be relocated with an accuracy of 0.8 mm and 5.0° (vs. 1.6 mm and 6.6° without guidance). The intuitive guidance and the preliminary results show that the presented approach has great potential for innovative computer-assistance methods for quantifying disease progression.
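The scheme above pairs every spectrum with the probe pose recorded at acquisition time, so that a lesion point can be revisited in a later session. A minimal sketch of such a tracked record and of the positional and angular deviations a guidance system could display while the user re-approaches the point; all names and data layouts here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class TrackedSpectrum:
    """One measurement: a spectrum tagged with the tracked probe pose."""
    position: np.ndarray     # (3,) probe-tip position in tracker frame, mm
    orientation: np.ndarray  # (3,) unit vector along the probe axis
    spectrum: np.ndarray     # acquired reflectance spectrum

def guidance_errors(reference, current_pos, current_axis):
    """Positional (mm) and angular (deg) deviation from a stored point.

    A guidance display would drive these two values toward zero before
    re-acquiring a spectrum at the previously analyzed position.
    """
    d_pos = np.linalg.norm(current_pos - reference.position)
    cos_a = np.clip(np.dot(current_axis, reference.orientation), -1.0, 1.0)
    d_ang = np.degrees(np.arccos(cos_a))
    return d_pos, d_ang
```

Repeating the measurement only once both deviations fall below the session-to-session tolerances (on the order of the 0.8 mm and 5.0° reported above) is what makes spectra from different sessions comparable.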