With optical cameras, many interventional navigation tasks previously relying on EM, optical, or mechanical guidance
can be performed robustly, quickly, and conveniently. We developed a family of novel guidance systems based on wide-spectrum
cameras and vision algorithms for real-time tracking of interventional instruments and multi-modality markers.
These navigation systems support the localization of anatomical targets and the placement of imaging probes and
instruments, and provide fusion imaging. The unique architecture – low-cost, miniature, in-hand stereo vision cameras
fitted directly to imaging probes – allows for an intuitive workflow that fits a wide variety of specialties such as
anesthesiology, interventional radiology, interventional oncology, emergency medicine, urology, and others, many of
which see increasing pressure to utilize medical imaging and especially ultrasound, but have yet to develop the requisite
skills for reliable success. We developed a modular system, consisting of hardware (the Optical Head containing the
mini cameras) and software (components for visual instrument tracking with or without specialized visual features, fully automated
marker segmentation from a variety of 3D imaging modalities, visual observation of meshes of widely separated
markers, instant automatic registration, and target tracking and guidance on real-time multi-modality fusion
views). From these components, we implemented a family of distinct clinical and pre-clinical systems (for combinations
of ultrasound, CT, CBCT, and MRI), most of which have international regulatory clearance for clinical use. We present
technical and clinical results on phantoms, ex- and in-vivo animals, and patients.
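At the core of the "instant automatic registration" between camera-observed markers and the same markers segmented from a 3D modality lies a point-based rigid alignment. As a minimal illustrative sketch (the classical Kabsch/Procrustes solution, not necessarily these systems' actual implementation; function and variable names are ours):

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) with R @ src_i + t ~= dst_i
    (Kabsch / orthogonal Procrustes), the standard closed-form solution
    for point-based registration of corresponding marker positions."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    s = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, s]) @ U.T
    t = dc - R @ sc
    return R, t
```

Given at least three non-collinear corresponding markers, this recovers the camera-to-image transform exactly in the noise-free case and in the least-squares sense otherwise.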
Interest in interstitial ablative therapy for the treatment of hepatic tumors has been growing. Despite advances in
these therapies, several technical challenges remain due to tissue deformation and target motion: localizing the
tumor, and monitoring both the ablator tip and the thermal dose delivered to heated tissue. In previous work, a steerable acoustic
ablator called the ACUSITT was developed for accurately targeting the ablation tip into the tumor. However, real-time
monitoring techniques that provide image feedback on ablation tip positioning and on the thermal dose deposited in the
tissue by heating are still needed. In this paper, a new software framework for real-time monitoring of ablative therapy
during the pre- and intra-operative phases is presented. The framework provides ultrasound Brightness Mode (B-Mode)
images and elastography simultaneously and in real time. The position of the ablator tip and the region of heated tissue are
monitored on the B-Mode image, since it represents tissue morphology. Furthermore, ultrasound elasticity images
are used to delineate the tumor boundary and region before ablation, and to monitor the thermal dose in tissue during ablation.
By providing B-Mode imaging and elastography at the same time, reliable information for monitoring thermal therapy can be obtained.
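The elastography half of such a framework is typically driven by tracking axial displacement between pre- and post-compression RF frames. A minimal sketch of that idea for a single A-line, assuming window-based normalized cross-correlation (window size, step, and search range here are illustrative, not the framework's actual parameters):

```python
import numpy as np

def axial_displacement(rf_pre, rf_post, win=64, step=32, search=16):
    """Per-window axial shift (in samples) along one RF A-line, found by
    maximizing normalized cross-correlation between pre- and
    post-compression windows."""
    shifts = []
    for start in range(0, len(rf_pre) - win - search, step):
        ref = rf_pre[start:start + win]
        best_ncc, best_lag = -np.inf, 0
        for lag in range(-search, search + 1):
            s = start + lag
            if s < 0 or s + win > len(rf_post):
                continue
            cand = rf_post[s:s + win]
            denom = np.linalg.norm(ref) * np.linalg.norm(cand)
            ncc = (ref @ cand) / denom if denom > 0 else 0.0
            if ncc > best_ncc:
                best_ncc, best_lag = ncc, lag
        shifts.append(best_lag)
    return np.asarray(shifts, dtype=float)

def axial_strain(shifts, step=32):
    """Strain = spatial gradient of the displacement field along depth."""
    return np.gradient(shifts) / step
```

Soft tissue deforms more under compression than stiff (or heated, coagulated) tissue, so the strain image highlights exactly the boundaries and thermal lesions the abstract describes.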
This work explores the suitability of low-cost sensors for "serious" medical applications, such as tracking of
interventional tools in the OR, for simulation, and for education. Although such tracking - i.e. the acquisition
of pose data e.g. for ultrasound probes, tissue manipulation tools, needles, but also tissue, bone etc. - is well
established, it relies mostly on external devices such as optical or electromagnetic trackers, both of which
mandate the use of special markers or sensors attached to each single entity whose pose is to be recorded, and
also require their calibration to the tracked entity, i.e. the determination of the geometric relationship between
the marker's and the object's intrinsic coordinate frames. The Microsoft Kinect sensor is a recently introduced
device for full-body tracking in the gaming market, but it was quickly hacked - due to its wide range of tightly
integrated sensors (RGB camera, IR depth and greyscale camera, microphones, accelerometers, and basic
actuation) - and used beyond this area. As its field of view and accuracy are within reasonable usability
limits, we describe a medical needle-tracking system for interventional applications based on the Kinect
sensor and standard biopsy needles, requiring no attachments and thus saving both cost and time. Its twin
cameras are used as a stereo pair to detect needle-shaped objects, reconstruct their pose in four degrees of
freedom, and provide information about the most likely candidate.
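Once candidate 3D points along a needle-shaped object have been reconstructed from the stereo pair, recovering its 4-DOF pose amounts to a robust line fit. A sketch of one common approach (RANSAC with a PCA refinement; this is an assumption for illustration, not the paper's published algorithm):

```python
import numpy as np

def ransac_line(points, iters=200, tol=1.0, seed=None):
    """Robustly fit a 3D line (point p0 + unit direction d, i.e. a 4-DOF
    pose) to a noisy point cloud: sample point pairs, count inliers by
    point-to-line distance, refine the best model by PCA on its inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        d = points[j] - points[i]
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        v = points - points[i]
        dist = np.linalg.norm(v - np.outer(v @ d, d), axis=1)
        inliers = dist < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    pts = points[best_inliers]
    p0 = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - p0)   # principal axis = needle direction
    return p0, vt[0]
```

The inlier count of the winning model also gives a natural score for ranking the "most likely candidate" among several detected needle-shaped objects.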
Handheld ultrasound is useful for intra-operative imaging, but requires additional tracking hardware to be useful
in navigated intervention settings, such as biopsies, ablation therapy, injections etc. Unlike common probe-and-needle
tracking approaches involving global or local tracking, we propose to use a bracket with a combination of
very low-cost local sensors - cameras with projectors, optical mice and accelerometers - to reconstruct patient
surfaces, needle poses, and the probe trajectory with multiple degrees of freedom, but no global tracking overhead.
We report our experiences from a first series of benchtop and in-vivo human volunteer experiments.
The low-cost and minimum health risks associated with ultrasound (US) have made ultrasonic imaging a widely
accepted method to perform diagnostic and image-guided procedures. Despite the existence of 3D ultrasound probes,
most analysis and diagnostic procedures are done by studying the B-mode images. Currently, several ultrasound
probes include 6-DOF sensors that provide positioning information. Such tracking information can be used to
reconstruct a 3D volume from a set of 2D US images. Recent advances in ultrasound imaging have also shown that,
directly from the streaming radio frequency (RF) data, it is possible to obtain additional information of the anatomical
region under consideration including the elasticity properties.
This paper presents a generic framework that takes advantage of current graphics hardware to create a low-latency
system to visualize streaming US data while combining multiple tissue attributes into a single illustration. In particular,
we introduce a framework that enables real-time reconstruction and interactive visualization of streaming data while
enhancing the illustration with elasticity information. The visualization module uses two-dimensional transfer functions
(2D TFs) to more effectively fuse and map B-mode and strain values into specific opacity and color values. On
commodity hardware, our framework can simultaneously reconstruct, render, and provide user interaction at over 15
fps. Results with phantom and real-world medical datasets show the advantages and effectiveness of our technique with
ultrasound data. In particular, our results show how two-dimensional transfer functions can be used to more effectively
identify, analyze and visualize lesions in ultrasound images.
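A 2D transfer function is, at heart, a lookup table indexed by two per-pixel attributes. As a minimal sketch of how B-mode and strain might jointly drive opacity and color (the table contents here are a toy example, not the paper's tuned transfer functions):

```python
import numpy as np

def make_tf_table(nb=64, ns=64):
    """A toy 2D transfer function table of shape (nb, ns, 4): grey value
    follows B-mode intensity, while opacity rises as strain falls, so
    stiff regions (e.g. lesions) stand out in the rendering."""
    b = np.linspace(0.0, 1.0, nb)[:, None]   # B-mode axis
    s = np.linspace(0.0, 1.0, ns)[None, :]   # strain axis
    tf = np.zeros((nb, ns, 4))
    tf[..., :3] = b[..., None]               # RGB greyscale from B-mode
    tf[..., 3] = 1.0 - s                     # opacity from (inverse) strain
    return tf

def apply_2d_tf(bmode, strain, tf):
    """Classify each pixel by its (B-mode, strain) pair through the table,
    yielding a per-pixel RGBA image."""
    nb, ns, _ = tf.shape
    bi = np.clip((bmode * (nb - 1)).round().astype(int), 0, nb - 1)
    si = np.clip((strain * (ns - 1)).round().astype(int), 0, ns - 1)
    return tf[bi, si]
```

On a GPU the same table lives in a small 2D texture, so the per-pixel classification is a single dependent texture fetch - which is what makes the interactive frame rates reported above feasible.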
Many recent studies have demonstrated the efficacy of interstitial ablative approaches for the treatment of hepatic tumors. Despite these promising results, current systems remain highly dependent on operator skill, and cannot treat many tumors because there is little control of the size and shape of the zone of necrosis, and no control over ablator trajectory within tissue once insertion has taken place. Additionally, tissue deformation and target motion make it extremely difficult to place the ablator device precisely into the target. Irregularly shaped target volumes typically require multiple insertions and several overlapping (thermal) lesions, which are even more challenging to accomplish in a precise, predictable, and timely manner without causing excessive damage to surrounding normal tissues.
In answer to these problems, we have developed a steerable acoustic ablator called the ACUSITT, capable of directional energy delivery to precisely shape the applied thermal dose. In this paper, we address image guidance for this device, proposing an innovative method for accurate tracking and tool registration with spatially-registered intra-operative three-dimensional US volumes, without relying on an external tracking device. This method is applied to guidance of the flexible, snake-like, lightweight, and inexpensive ACUSITT to facilitate precise placement of its ablator tip within the liver, with ablation monitoring via strain imaging. Recent advancements in interstitial high-power ultrasound applicators enable controllable and penetrating heating patterns which can be dynamically altered. This paper summarizes the design and development of the first synergistic system that integrates a novel steerable interstitial acoustic ablation device with a novel trackerless 3DUS guidance strategy.
Out-of-plane motion in freehand 3D ultrasound can be estimated using the correlation of corresponding patches,
leading to sensorless freehand 3D ultrasound systems. The correlation between two images is related to their
separation through a calibration of the ultrasound probe: the probe is moved with an accurate stage (or, in
this work, with a robot) and images of a phantom are collected such that the position of each image is known. Since parts
of the calibration curve with a higher derivative give lower displacement estimation error, previous work limits
displacement estimation to the parts with maximum derivative. In this paper, we first propose a novel method for
exploiting the entire calibration curve by using a maximum likelihood estimator (MLE). We then propose, for
the first time, using constraints inside the image to enhance the accuracy of out-of-plane motion estimation. We
specifically use the continuity constraint of a needle to reduce the variance of the estimated out-of-plane motion.
Simulation and real tissue experimental results are presented.
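The appeal of the MLE formulation is that flat regions of the calibration curve are down-weighted rather than discarded. A minimal sketch under simple assumptions (a monotonically decreasing calibration curve, first-order error propagation, and an illustrative correlation noise level; names and parameters are ours, not the paper's):

```python
import numpy as np

def mle_distance(rhos, calib_z, calib_rho, sigma_rho=0.01):
    """Fuse per-patch correlation measurements into one out-of-plane
    distance estimate. Each correlation is inverted through the
    (monotonically decreasing) calibration curve rho(z); its variance is
    propagated via the curve's local slope, so patches on flat parts of
    the curve receive low weight in the maximum likelihood combination."""
    rhos = np.asarray(rhos, dtype=float)
    # invert rho(z); np.interp needs an increasing abscissa, hence [::-1]
    z_i = np.interp(rhos, calib_rho[::-1], calib_z[::-1])
    # sigma_z ~= sigma_rho / |d rho / d z|  (first-order error propagation)
    slope = np.interp(z_i, calib_z, np.gradient(calib_rho, calib_z))
    var = (sigma_rho / np.maximum(np.abs(slope), 1e-12)) ** 2
    w = 1.0 / var
    return float(np.sum(w * z_i) / np.sum(w))
```

The inverse-variance weighted mean is exactly the ML estimate under independent Gaussian errors, which is what lets the whole curve contribute instead of only its steepest segment.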
Steerability in percutaneous medical devices is highly desirable, enabling a needle or needle-like instrument to avoid
sensitive structures (e.g. nerves or blood vessels), access obstructed anatomical targets, and compensate for the
inevitable errors induced by registration accuracy thresholds and tissue deformation during insertion. Thus, mechanisms
for needle steering have been of great interest in the engineering community in the past few years, and several have been
proposed. While many interventional applications have been hypothesized for steerable needles (essentially anything
deliverable via a regular needle), none have yet been demonstrated as far as the authors are aware. Instead, prior studies
have focused on model validation, control, and accuracy assessment. In this paper, we present the first integrated
steerable needle-interventional device. The ACUSITT integrates a multi-tube steerable Active Cannula (AC) with an
Ultrasonic Interstitial Thermal Therapy ablator (USITT) to create a steerable percutaneous device that can deliver a
spatially and temporally controllable (both mechanically and electronically) thermal dose profile. We present our initial
experiments toward applying the ACUSITT to treat large liver tumors through a single entry point. This involves
repositioning the ablator tip to several different locations, without withdrawing it from the liver capsule, under 3D
Ultrasound image guidance. In our experiments, the ACUSITT was deployed to three positions, each 2 cm apart in a conical pattern, to demonstrate the feasibility of ablating large liver tumors 7 cm in diameter without multiple parenchyma punctures.
We present an image-guided intervention system based on tracked 3D elasticity imaging (EI) to provide a novel
interventional modality for registration with pre-operative CT. The system can be integrated into both laparoscopic and
robotic partial nephrectomy scenarios, where this new use of EI makes exact intra-operative execution of pre-operative
planning possible. Quick acquisition and registration of 3D-B-Mode and 3D-EI volume data allows intra-operative
registration with CT and thus with pre-defined target and critical regions (e.g. tumors and vasculature). Their real-time
location information is then overlaid onto a tracked endoscopic video stream to help the surgeon avoid vessel damage
and still completely resect tumors including safety boundaries.
The presented system promises to increase the success rate for partial nephrectomies and potentially for a wide range of
other laparoscopic and robotic soft tissue interventions. This is enabled by the three components of robust real-time
elastography, fast 3D-EI/CT registration, and intra-operative tracking. With high quality, robust strain imaging (through
a combination of parallelized 2D-EI, optimal frame pair selection, and optimized palpation motions), kidney tumors that
were previously unregistrable or sometimes even considered isoechoic with conventional B-mode ultrasound can now be
imaged reliably in interventional settings. Furthermore, this allows the transformation of planning CT data of kidney
ROIs to the intra-operative setting with a markerless mutual-information-based registration, using EM sensors for intra-operative tracking.
Overall, we present a complete procedure and its development, including new phantom models - both ex vivo and
synthetic - to validate image-guided technology and training, tracked elasticity imaging, real-time EI frame selection,
registration of CT with EI, and finally a real-time, distributed software architecture. Altogether, the system allows the
surgeon to concentrate on completing the intervention with less time pressure.
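The similarity metric behind the markerless EI/CT alignment, mutual information, can be computed from a joint intensity histogram. A minimal sketch of the metric itself (not the full registration pipeline, whose optimizer and sampling strategy are beyond this example):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information I(A;B) of two co-registered images, estimated
    from their joint intensity histogram. This is the quantity an
    intensity-based registration maximizes over candidate transforms."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()                    # joint distribution
    px = p.sum(axis=1, keepdims=True)        # marginal over A's bins
    py = p.sum(axis=0, keepdims=True)        # marginal over B's bins
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px * py)[nz])))
```

Because the metric depends only on the statistical relationship between intensities, not on their absolute values, it tolerates the very different contrast mechanisms of elasticity imaging and CT.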