Purpose: This study investigates the accuracy of a cross-platform augmented reality (AR) system for percutaneous needle interventions, independent of operator error. In particular, we study the effect of the relative position and orientation of the AR device and the marker, the location of the target, and the needle angle on the overlay accuracy. Method: A needle guidance AR platform developed with Unity and the Vuforia SDK was used to display planned needle trajectories to targets on mobile and wearable devices. To evaluate system accuracy, a custom phantom embedded with metal fiducial markers and an adjustable needle guide was designed to mimic different relative position and orientation scenarios of the smart device and the marker. After segmenting the CT-visible fiducial markers and the different needle trajectories from the images, error was computed by comparing them to the corresponding augmented target/needle trajectory projected by the smartphone and smart-glasses devices. Results: The augmentation error for targets and needle trajectories was reported as a function of marker position and orientation, as well as target location. Overall, the image overlay error for needle trajectories was 0.28±0.32° (max 0.856°) and 0.41±0.23° (max 0.805°) using the iPhone and HoloLens glasses, respectively. The overall image overlay error for targets was 1.75±0.59 mm for the iPhone and 1.74±0.86 mm for the HoloLens. Conclusions: The image overlay error caused by different sources can be quantified for different AR devices.
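The trajectory overlay error reported above is an angle between the segmented needle direction and the projected (augmented) needle direction. A minimal sketch of that comparison, with illustrative function and variable names rather than anything from the described system:

```python
import numpy as np

def angular_error_deg(v_planned, v_actual):
    """Angle in degrees between two needle direction vectors."""
    v1 = np.asarray(v_planned, dtype=float)
    v2 = np.asarray(v_actual, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip to guard against floating-point values just outside [-1, 1]
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Example: a trajectory tilted 0.5 degrees from the planned one
tilt = np.radians(0.5)
err = angular_error_deg([0.0, 0.0, 1.0], [0.0, np.sin(tilt), np.cos(tilt)])
```

The target overlay error in millimeters is analogously the Euclidean distance between segmented and projected target points.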
In transperineal prostate biopsy or ablation, a grid template is typically used to guide the needle. This guidance method has limited positioning resolution and offers no choice of needle angulation toward targets referenced to ultrasound imaging or TRUS-MRI fusion. To overcome these limitations, we developed a novel augmented reality (AR) system that uses smart see-through glasses and a smartphone as needle guidance devices for transperineal prostate procedures. The AR system comprises an MRI/CT scanner, pre-procedural image analysis and visualization software, AR devices (smart glasses, smartphone), a newly developed AR app, and a local network. The AR app displays the lesion and the planned needle trajectory, derived from the pre-procedural images, on the AR devices. A specially designed marker frame affixed to the patient’s perineum registers the pre-procedural images to the AR devices. The displayed needle plan is always referenced to the patient and remains independent of the position and orientation of the devices. Multiple devices can be used simultaneously and communicate via the local network. We evaluated the AR system accuracy with an iPhone and R-7 glasses in a phantom study. The image overlay accuracy was 0.58±0.43° for the iPhone and 1.62±1.52° for the R-7 glasses. The guidance accuracy was 1.9±0.97 mm (lateral) and 1.0±0.5 mm (in-direction) for the iPhone, and 2.8±1.4 mm (lateral) and 2.3±1.5 mm (in-direction) for the R-7 glasses. An AR system using smart glasses and a smartphone can provide accurate needle guidance and a see-through-the-skin display for needle-based transperineal prostate interventions such as biopsy and ablation.
Purpose: A new version of a grid-template-mimicking MR-compatible robot was developed to assist during in-gantry MRI-guided focal laser ablation of prostate cancer. The robot replaces the grid template, provides higher positioning resolution, and allows autonomous needle alignment directly from the targeting and navigation software, with needle insertion performed manually for safety. Method: A substantially more compact design was prototyped to fit comfortably between the patient’s legs inside the MRI bore. The controller software was reconfigured and embedded into the custom navigation and multi-focal ablation software, OncoNav (NIH). OncoNav performs robot-to-image registration, target planning, robot control, ablation planning, and 3D temperature analysis for monitoring. For the free-space accuracy study, 5 targets were selected across the workspace and the robot was commanded 5 times to each target. A thermochromic phantom study, using a phantom of acrylamide gel and color-changing ink, was then designed to test the overall workflow. Four spherical metal fiducials were embedded in the phantom at different locations. After each targeting, laser ablation was applied at two of the targets. Finally, the phantom was sliced for gross observation of guidance and treatment accuracy. Results: Free-space accuracy was 0.38±0.27 mm. The overall targeting accuracy in the phantom, including robot, registration, and insertion errors, was 2.17±0.47 mm. Ablation successfully covered ellipsoids around the targets, and the workflow was acceptably smooth. Conclusions: The new robot can accurately assist in targeting small targets for focal laser ablation. Sterility and regulatory hurdles will be addressed with specific design approaches as the next step.
Accurate needle placement in CT-guided interventions largely depends on the physician’s visuospatial skills. To reduce the reliance on operator experience and enhance accuracy, we developed an augmented reality system using smart see-through glasses to facilitate and assist bedside needle angle guidance. The AR system was developed using Unity and the Vuforia SDK. It displays the planned needle angle on the glasses’ see-through screens in real time based on the glasses’ orientation. The displayed angle is always referenced to the CT table and independent of the physical orientation of the glasses. The see-through feature allows the operator to continuously compare the actual needle against the planned needle angle. The glasses’ orientation is tracked by the built-in gyroscope, and the offset between the embedded gyroscope and the glasses’ display frame was pre-calibrated. A quick one-touch calibration between the glasses and the CT frame was implemented. Hardware accuracy and guidance accuracy were evaluated in phantom studies. In the first test, a needle was inserted in a phantom and scanned with CT; the angle measured in the CT scan was set on the glasses, a snapshot was taken through the lens, and the needle vector was compared against the guideline in the saved snapshot. The hardware accuracy was within 0.98 ± 0.85 degrees. In the second test, after each insertion guided by the glasses, a CT scan was taken to validate the insertion angle error. The guidance accuracy was within 1.33 ± 0.73 degrees. Smart glasses can provide accurate guidance for needle-based interventions with minimal disturbance to the standard clinical workflow.
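The "always referenced to the CT table" behavior amounts to subtracting the glasses' tracked orientation, plus the pre-calibrated gyroscope-to-display offset, from the planned angle before drawing the guideline. A toy in-plane sketch under an assumed sign convention (all names are illustrative, not from the described system):

```python
def guideline_angle_on_screen(planned_angle_deg, glasses_roll_deg,
                              gyro_display_offset_deg=0.0):
    """Screen angle at which to draw the needle guideline so that it stays
    fixed relative to the CT table: compensate for the glasses' current roll
    (from the gyroscope) and the pre-calibrated gyro-to-display offset."""
    return (planned_angle_deg - glasses_roll_deg - gyro_display_offset_deg) % 360.0
```

With this compensation, rolling the glasses by +10° rotates the drawn guideline by −10° on screen, so the line appears stationary in the room.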
Monitoring temperature during a cone-beam CT (CBCT) guided ablation procedure is important for preventing over-treatment and under-treatment. Ideal temperature monitoring requires generating a thermometry map. Previously, this was attempted using CBCT scans of a pig shoulder undergoing ablation [1]. We extend that work by using CBCT scans of real patients and incorporating more processing steps. Because organs move and deform between acquisitions, we register the scans before comparing them. We then automatically locate the needle tip and the ablation zone. To cope with image noise and artifacts, we employ a robust change metric: it takes windows around each pixel and uses an equation inspired by Time Delay Analysis to calculate the error between windows, under the assumption that an ideal spatial offset exists. Once the change map is generated, we correlate the change data with measured temperature data at key points in the region, which lets us transform the change map into a thermal map. The thermal map then provides an estimate of the size and temperature of the ablation zone. We evaluated our procedure on a data set of 12 patients who underwent a total of 24 ablation procedures. We were able to generate reasonable thermal maps with varying degrees of accuracy; the average error ranged from 2.7 to 16.2 degrees Celsius. In addition to estimating the size of the ablation zone for surgical guidance, 3D visualizations of the ablation zone and needle are also produced.
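One plausible reading of the windowed change metric described above is: for each pixel, compare the window around it in the baseline image against windows at several small spatial offsets in the follow-up image, and keep the smallest mismatch, so that residual misalignment does not register as change. This is only an illustrative sketch; the study's actual equation and parameters may differ:

```python
import numpy as np

def window_change_map(baseline, follow_up, half_win=2, max_shift=1):
    """For each interior pixel, the minimal mean-squared difference between
    the window around it in `baseline` and spatially offset windows in
    `follow_up`. Searching over offsets makes the metric robust to small
    residual misalignments."""
    h, w = baseline.shape
    m = half_win + max_shift  # margin so all windows stay in bounds
    change = np.zeros_like(baseline, dtype=float)
    for y in range(m, h - m):
        for x in range(m, w - m):
            ref = baseline[y - half_win:y + half_win + 1,
                           x - half_win:x + half_win + 1]
            best = np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    cand = follow_up[y + dy - half_win:y + dy + half_win + 1,
                                     x + dx - half_win:x + dx + half_win + 1]
                    best = min(best, float(np.mean((ref - cand) ** 2)))
            change[y, x] = best
    return change

# Identical images should yield zero change everywhere
img = np.random.default_rng(0).normal(size=(16, 16))
cmap = window_change_map(img, img.copy())
```

Correlating such a change map against thermocouple readings at known locations would then supply the change-to-temperature calibration the abstract describes.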
Monitoring temperature, and thereby the final treatment zone achieved, during a cone-beam CT (CBCT) guided ablation can prevent overtreatment and undertreatment. A novel method is proposed to detect changes in consecutive CBCT images obtained from projection reconstructions during an ablation procedure, and the possibility of using this method to generate thermometry maps from CBCT images, which can serve as an input function for ablation treatment planning, is explored. The method uses a baseline and an intermittent CBCT scan, which are routinely acquired to confirm the needle position and monitor progress of the ablation. Accurate registration is required, and is assumed in vitro and ex vivo. A Wronskian change detector algorithm is applied to the compensated images to obtain a difference image between the intermittent and baseline scans. Finally, an experimentally determined calibration is applied to obtain the corresponding temperature at each pixel or voxel, yielding a thermal map. We applied the Wronskian change detector to detect the difference between two CBCT images, which have a low signal-to-noise ratio, and calibrated the Wronskian change model to temperature data using a gel phantom. We tested the temperature mapping with water and gel phantoms as well as a pig shoulder. The experimental results show that this method can detect temperature changes within 5 °C for a voxel size of 1 mm³ (within clinical relevance) and, by consequence, delineate the ablation zone. These preliminary results show that CBCT thermometry is possible and promising, but may require pre-processing, such as registration for motion compensation between the baseline and intermittent scans. Further quantitative evaluations must be conducted for validation prior to clinical assessment and translation. CBCT is a widely available technology that could make thermometry clinically practical as an enabling component of iterative ablation treatment planning.
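A common pixelwise form of the Wronskian change detector uses the intensity ratio r = I₂/I₁ and the quantity r² − r, which vanishes where the two images agree (r = 1) and grows with change. The study's windowed and calibrated variant may differ; this is a minimal sketch of the pixelwise form only:

```python
import numpy as np

def wronskian_change(baseline, intermittent, eps=1e-6):
    """Pixelwise Wronskian change detector: for intensity ratio r = I2/I1,
    |r**2 - r| is zero where the images agree and nonzero where they differ.
    `eps` guards against division by zero in dark regions."""
    r = (np.asarray(intermittent, float) + eps) / (np.asarray(baseline, float) + eps)
    return np.abs(r * r - r)

# Toy example: a uniform baseline with a small "heated" region in the follow-up
base = np.full((4, 4), 100.0)
warm = base.copy()
warm[1:3, 1:3] = 120.0
det = wronskian_change(base, warm)
```

The detector map would then be passed through the experimentally determined change-to-temperature calibration to produce the thermal map.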
We present a method towards optimization of multiple ablation probe placement to provide efficient coverage
of a tumor for thermal therapy while respecting clinical needs such as limiting the sites of probe insertions at
the pleura/liver surface, choosing secure probe trajectories and locations, avoiding ablation of critical structures,
reducing ablation of healthy tissue and overlap of ablation zones. The ablation optimizer treats each ablation
location independently, and the number of ablation probe placements itself is treated as a variable to be optimized.
This allows us to incorporate feedback after each ablation is deployed and re-optimize the remaining steps of the
plan. The optimization method uses a new class of derivative-free algorithms for solving a non-linear mixed
variable problem with hard and soft constraints derived from clinical images. Our methods use discretization
of the ablation volume, which can accommodate irregular ablation-zone shapes. The non-gradient-based
strategy produces new candidates that yield a feasible solution within a few iterations. In our simulation experiments,
this strategy typically reduced the ablation-zone overlap and the volume of ablated healthy tissue by 46% and 29%,
respectively, in a single iteration, allowing a feasible solution to be found within 35 iterations. Our method
for optimization provides efficient implementation for planning the coverage of a tumor while respecting clinical
constraints. The ablation planning can be combined with navigation assistance to enable accurate translation
and feedback of the plan.
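The optimizer above is a derivative-free mixed-variable method with hard and soft constraints; reproducing it is beyond a short example, but a much simpler greedy sketch can still illustrate the discretized-coverage idea and how the number of ablations emerges from the plan rather than being fixed up front (all names and parameters are illustrative):

```python
import numpy as np

def plan_ablations(tumor_mask, radius, coverage=0.98):
    """Greedy sketch of discretized ablation planning: repeatedly place a
    spherical ablation zone at the tumor voxel that covers the most
    still-uncovered tumor voxels, until the target coverage fraction is
    reached. The number of ablations falls out of the loop."""
    covered = np.zeros_like(tumor_mask, dtype=bool)
    centers = []
    all_idx = np.argwhere(np.ones_like(tumor_mask, dtype=bool))
    tumor_total = tumor_mask.sum()
    while covered[tumor_mask].sum() < coverage * tumor_total:
        best_gain, best_ball, best_c = -1, None, None
        for c in np.argwhere(tumor_mask):  # candidate centers
            dist2 = ((all_idx - c) ** 2).sum(axis=1).reshape(tumor_mask.shape)
            ball = dist2 <= radius ** 2
            gain = (tumor_mask & ball & ~covered).sum()
            if gain > best_gain:
                best_gain, best_ball, best_c = gain, ball, tuple(int(v) for v in c)
        covered |= best_ball
        centers.append(best_c)
    return centers, covered

# Toy 3D example: a small spherical "tumor" covered by one larger ablation zone
grid_shape = (9, 9, 9)
zz, yy, xx = np.indices(grid_shape)
tumor = (zz - 4) ** 2 + (yy - 4) ** 2 + (xx - 4) ** 2 <= 4
centers, covered = plan_ablations(tumor, radius=2.5, coverage=1.0)
```

Unlike this greedy loop, the paper's method also penalizes ablation-zone overlap, healthy-tissue ablation, and insertion-site constraints, and can handle irregular (non-spherical) ablation shapes through the same voxel discretization.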
In this paper we present a surgical assistant system for implanting a prosthetic aortic valve transapically, under MRI
guidance, in a beating heart. The system integrates an MR imaging system, a robotic system, and user interfaces
for a surgeon to plan the procedure and manipulate the robot. A compact robotic delivery module mounted on a robotic
arm is used for delivering both balloon-expandable and self-expanding prosthesis. The system provides different user
interfaces at different stages of the procedure. A compact fiducial pattern close to the volume of interest is proposed for
robot registration. The image processing and the transformation recovery methods using this fiducial in MRI are
presented. The registration accuracy obtained by using this compact fiducial is comparable to the larger multi-spherical
marker registration method. The registration accuracy using these two methods is less than 0.62±0.50 deg (mean ± std.
dev.) and 0.63±0.72 deg (mean ± std. dev.), respectively. We evaluated each of the components and show that they can
work together to form a complete system for transapical aortic valve replacement.
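The transformation-recovery details above are specific to the proposed compact fiducial, but fiducial-based registration of this kind typically reduces to least-squares rigid alignment of matched point sets. A generic Kabsch/SVD sketch of that step, not the paper's algorithm:

```python
import numpy as np

def rigid_register(fixed, moving):
    """Least-squares rigid registration (Kabsch/SVD) between matched fiducial
    point sets: returns R, t such that R @ moving_i + t best fits fixed_i."""
    f, m = np.asarray(fixed, float), np.asarray(moving, float)
    fc, mc = f.mean(axis=0), m.mean(axis=0)
    H = (m - mc).T @ (f - fc)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = fc - R @ mc
    return R, t

# Example: recover a known rotation about z and a translation
rng = np.random.default_rng(1)
pts = rng.normal(size=(6, 3))
ang = np.radians(10)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([1.0, -2.0, 0.5])
fixed = pts @ R_true.T + t_true
R_est, t_est = rigid_register(fixed, pts)
```

The residual fiducial registration error after such an alignment is what angular accuracy figures like those above quantify.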