The pigtail catheter is a catheter inserted into the body during interventional procedures such as transcatheter aortic valve implantation (TAVI). It is characterized by a tightly curled tip that keeps it anchored in a valve pocket during the intervention, and it is used to inject contrast agent for vessel visualization under fluoroscopy. Image-based detection of this catheter during TAVI allows a model of the aorta to be overlaid on the fluoroscopic image, enhancing visibility during the procedure. Because of the different possible projection angles in fluoroscopy, the pigtail tip can appear in a variety of shapes, ranging from a pure circle to an ellipse or even a line. Furthermore, the appearance of the catheter tip changes radically when contrast agent is injected during the intervention or when the tip is occluded by other devices. All of these factors make robust, real-time detection and tracking of the pigtail catheter a challenging task. To address these challenges,
this paper proposes a new tree-structured, hierarchical detection scheme based on a shape categorization of the pigtail catheter tip and a combination of novel Haar features. The proposed framework demonstrates improved detection performance in a validation on a data set of 272 sequences comprising more than 20,000 images. The detection framework presented in this paper is not limited to pigtail catheter detection; it can also be applied successfully to any other shape-varying object with similar characteristics.
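Haar-like features of the kind mentioned above are typically evaluated in constant time per feature using an integral image. The sketch below shows this standard mechanism with a basic two-rectangle (top-minus-bottom) response; the specific novel feature layouts proposed in the paper are not reproduced here, and all function names are illustrative.

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] = sum of img[:y, :x]."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, y, x, h, w):
    """Sum of pixels in the h-by-w rectangle with top-left corner (y, x),
    computed from four lookups in the integral image."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two_rect_vertical(ii, y, x, h, w):
    """Two-rectangle Haar-like response: top half minus bottom half."""
    half = h // 2
    return rect_sum(ii, y, x, half, w) - rect_sum(ii, y + half, x, half, w)
```

Because each response costs only a handful of array lookups regardless of rectangle size, such features are well suited to the real-time constraint stated above.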
In this paper we present a learning-based guidewire localization algorithm that can be constrained by user inputs. The proposed algorithm automatically localizes guidewires in fluoroscopic images. When the result is unsatisfactory, the user can constrain the algorithm by clicking on a guidewire segment missed by the detection algorithm. The algorithm then re-localizes the guidewire and updates the result in less than 0.3 seconds. In extreme cases, further constraints can be provided until a satisfactory result is reached. The proposed algorithm can serve not only as an efficient initialization tool for guidewire tracking, but also as an efficient annotation tool, either for cardiologists to mark the guidewire or to build a labeled database for evaluation. By improving the initialization of guidewire tracking, it also helps improve the visibility of the guidewire during interventional procedures. Our study shows that even highly complicated guidewires can, in most cases, be localized within 5 seconds using fewer than 6 clicks.
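One simple way to realize the click-based constraint described above is to snap each user click to the nearest detected candidate segment and force that segment into the re-localization. The sketch below illustrates this under that assumption; `nearest_segment`, `relocalize`, and the score-bonus mechanism are hypothetical stand-ins, not the paper's actual formulation.

```python
import numpy as np

def nearest_segment(click, centers):
    """Index of the candidate segment whose centre is closest to a click.
    `centers` is an (N, 2) array of segment centre coordinates."""
    d = np.linalg.norm(centers - np.asarray(click, dtype=float), axis=1)
    return int(np.argmin(d))

def relocalize(centers, scores, clicks, bonus=10.0):
    """Boost the scores of user-constrained segments so they dominate
    the next localization pass (illustrative mechanism only)."""
    scores = np.asarray(scores, dtype=float).copy()
    for click in clicks:
        scores[nearest_segment(click, centers)] += bonus
    return scores
```

Each click only re-weights candidates rather than re-running detection, which is consistent with the sub-0.3-second update time reported above.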
In this paper, we present a novel hierarchical framework for guidewire tracking in image-guided interventions. Our method automatically and robustly tracks a guidewire in fluoroscopy sequences during interventional procedures. The method consists of three main components: learning-based guidewire segment detection, robust and fast rigid tracking, and non-rigid guidewire tracking. Each component handles guidewire motion at a specific level. The learning-based segment detection identifies small segments of a guidewire in individual frames and provides unique primitive features for subsequent tracking. Based on the identified guidewire segments, the rigid tracking method robustly tracks the guidewire across successive frames, under the assumption that the dominant guidewire motion is rigid, caused mainly by breathing and table movement. Finally, a non-rigid tracking algorithm finely deforms the guidewire to recover an accurate shape. The presented guidewire tracking method has been evaluated on a test set of 47 sequences with more than 1,000 frames. Quantitative evaluation demonstrates that the mean tracking error along the guidewire body is less than 2 pixels. Therefore, the presented guidewire tracking method has great potential for applications in image-guided interventions.
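The rigid-tracking stage above assumes the dominant frame-to-frame motion is a rotation plus a translation. A standard way to estimate such a transform from corresponding guidewire points is the Kabsch least-squares alignment, sketched below; this is a generic stand-in for illustration, not the paper's exact tracking procedure.

```python
import numpy as np

def estimate_rigid(prev_pts, curr_pts):
    """Least-squares rigid (rotation + translation) alignment of two
    corresponding 2-D point sets via the Kabsch algorithm.
    Returns (R, t) such that curr ~ prev @ R.T + t."""
    p_mean = prev_pts.mean(axis=0)
    c_mean = curr_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (prev_pts - p_mean).T @ (curr_pts - c_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_mean - R @ p_mean
    return R, t
```

Applying the recovered transform to the previous frame's guidewire gives the coarse prediction that the non-rigid stage would then refine.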