Automated interpretation of CT scans is an important, clinically relevant area, as the number of such scans is increasing rapidly and their interpretation is time-consuming. Anatomy localization is an important prerequisite for any such interpretation task. It can be done by image-to-atlas registration, where the atlas serves as a reference space for annotations such as organ probability maps. Tissue-type-based atlases allow fast and robust processing of arbitrary CT scans. Here we present two methods which significantly improve organ localization based on tissue types. A first problem is the definition of the tissue types, which has so far been done heuristically, based on experience. We present a method to determine suitable tissue types from sample images automatically. A second problem is the restriction of the transformation space: all prior approaches use global affine maps. We present a hierarchical strategy to refine this global affine map. For each organ or region of interest, a localized tissue-type atlas is computed and used for a subsequent local affine registration step. A three-fold cross-validation on 311 CT images with different fields of view demonstrates a reduction of the organ localization error by 33%.
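The automatic determination of tissue types could, for instance, be approached by clustering image intensities. The following is a minimal sketch using a 1-D k-means on synthetic Hounsfield-like values; the function name, quantile initialization, and sample distributions are illustrative assumptions, not the actual algorithm of the paper:

```python
import numpy as np

def tissue_type_centers(intensities, k, iters=50):
    """Cluster scalar CT intensities (Hounsfield-like units) into k tissue
    types with a simple 1-D k-means. Quantile-based initialization keeps
    this toy example deterministic."""
    centers = np.quantile(intensities, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        # Assign each sample to its nearest current center.
        labels = np.argmin(np.abs(intensities[:, None] - centers[None, :]),
                           axis=1)
        # Recompute each center as the mean of its assigned samples.
        for j in range(k):
            members = intensities[labels == j]
            if members.size:
                centers[j] = members.mean()
    return np.sort(centers)

# Synthetic sample with three intensity modes: air, soft tissue, bone.
rng = np.random.default_rng(0)
samples = np.concatenate([
    rng.normal(-1000, 30, 500),  # air
    rng.normal(40, 20, 500),     # soft tissue
    rng.normal(700, 100, 500),   # bone
])
centers = tissue_type_centers(samples, k=3)
```

On such well-separated synthetic modes the three centers recover the air, soft-tissue, and bone intensities; real CT histograms overlap far more, which is why a dedicated method is needed.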
A fully automatic method for generating a whole-body atlas from CT images is presented. The atlas serves as a reference space for annotations. It is built from a large collection of partially overlapping medical images by a registration scheme. The atlas itself consists of probabilistic tissue-type maps and can represent anatomical variations. The registration scheme is based on an entropy-like measure of these maps and is robust with respect to field-of-view variations. In contrast to other atlas generation methods, which typically rely on a sufficiently large set of annotations on training cases, the presented method requires only the images. An iterative refinement strategy is used to automatically stitch the images together to build the atlas.
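An entropy-like measure on probabilistic tissue maps can be sketched as the mean per-voxel Shannon entropy: well-aligned images average into sharp, low-entropy maps, while misaligned ones blur into high-entropy maps. This is an assumption-level illustration, not the exact formula of the paper:

```python
import numpy as np

def mean_voxel_entropy(prob_maps, eps=1e-12):
    """Mean per-voxel Shannon entropy of probabilistic tissue maps.

    prob_maps has shape (T, ...) and holds, per voxel, a probability
    distribution over T tissue types (summing to 1 along axis 0).
    Lower values indicate sharper, better-aligned maps."""
    p = np.clip(prob_maps, eps, 1.0)  # avoid log(0)
    return float(-(p * np.log(p)).sum(axis=0).mean())

# A sharp (one-hot) map versus a maximally blurred one, 3 tissue types:
sharp = np.zeros((3, 4, 4)); sharp[0] = 1.0
blurred = np.full((3, 4, 4), 1.0 / 3.0)
```

Here `mean_voxel_entropy(sharp)` is (numerically) zero and `mean_voxel_entropy(blurred)` equals ln 3, so minimizing the measure during registration favors alignments that keep the tissue maps sharp.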
Affine registration of unseen CT images to the probabilistic atlas can be used to transfer reference annotations, e.g. organ models for segmentation initialization or reference bounding boxes for field-of-view selection. The robustness and generality of the method are shown using a three-fold cross-validation of the registration on a set of 316 CT images of unknown content and large anatomical variability. As an example, 17 organs are annotated in the atlas reference space and their localization in the test images is evaluated. The method yields a recall (sensitivity), specificity and precision of at least 96% and thus compares very favorably with competing methods.
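Transferring a reference bounding box with the estimated affine can be sketched by mapping the eight box corners into image space and re-boxing them. The helper below is hypothetical and uses homogeneous coordinates with a 4x4 affine matrix:

```python
import numpy as np

def transfer_bbox(affine, bbox_min, bbox_max):
    """Map an axis-aligned bounding box from atlas space to image space
    with a 4x4 affine matrix: transform the 8 corners, then take the
    axis-aligned hull of the mapped points."""
    corners = np.array([[x, y, z, 1.0]
                        for x in (bbox_min[0], bbox_max[0])
                        for y in (bbox_min[1], bbox_max[1])
                        for z in (bbox_min[2], bbox_max[2])])
    pts = (corners @ affine.T)[:, :3]  # drop homogeneous coordinate
    return pts.min(axis=0), pts.max(axis=0)

# Example: a pure translation by (10, 0, -5)
A = np.eye(4); A[:3, 3] = [10.0, 0.0, -5.0]
lo, hi = transfer_bbox(A, (0, 0, 0), (1, 2, 3))
```

Taking the hull of the mapped corners keeps the transferred box axis-aligned even when the affine includes rotation or shear, at the cost of slightly enlarging it in those cases.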
Diagnosis and treatment planning for prostate and cervix cancer based on MR images benefit from the superior soft-tissue contrast of MR compared to CT. For these images, an automatic delineation of the prostate or cervix and of the organs at risk, such as the bladder, is highly desirable. This paper describes a method for bladder segmentation based on a watershed transform applied to high image-gradient values and gray-value valleys, followed by the classification of the watershed regions into bladder contents and tissue with a graph-cut algorithm. The obtained results are superior to those of a simple region-after-region classification.