Echocardiography (echo) is the most common test for the diagnosis and management of patients with cardiac conditions. While most medical imaging modalities benefit from a relatively automated acquisition procedure, this is not the case for echo, and the quality of the final echo view depends on the competency and experience of the sonographer. It is not uncommon for the sonographer to lack adequate experience in adjusting the transducer to acquire a high-quality echo, which may in turn affect the clinical diagnosis. In this work, we aim to aid the operator during image acquisition by automatically assessing the quality of the echo and generating the Automatic Echo Score (AES). This quality assessment method is based on a deep convolutional neural network, trained end-to-end on a large dataset of apical four-chamber (A4C) echo images. For this project, an expert cardiologist reviewed 2,904 A4C images obtained from independent studies and assessed their condition on a 6-level grading system, with scores ranging from 0 to 5. The distribution of scores among the 6 levels was almost uniform. The network was then trained on 80% of the data (2,345 samples). The average absolute error of the trained model in calculating the AES was 0.8 ± 0.72. The computation time of the GPU implementation of the neural network was estimated at 5 ms per frame, which is sufficient for real-time use.
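As an illustration of the scoring pipeline, the sketch below runs a single frame through a toy convolutional feature extractor and a linear regression head clipped to the 0-5 grading range. This is a minimal numpy stand-in with random weights, not the trained network described above; `conv2d_valid`, `aes_score`, and all parameters are hypothetical names introduced here.

```python
import numpy as np

def conv2d_valid(img, kern):
    """Naive valid-mode 2D cross-correlation (illustrative, not optimized)."""
    H, W = img.shape
    kh, kw = kern.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kern).sum()
    return out

def aes_score(frame, kernels, w, b):
    """One conv layer -> ReLU -> global average pool -> linear head,
    clipped to the 0-5 expert grading range (hypothetical toy model)."""
    feats = np.array([np.maximum(conv2d_valid(frame, k), 0.0).mean()
                      for k in kernels])
    return float(np.clip(feats @ w + b, 0.0, 5.0))
```

With random weights the score is meaningless but stays in the valid 0-5 range; a real model would learn `kernels`, `w`, and `b` from the expert-graded images.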
In low-dose prostate brachytherapy treatment, a large number of radioactive seeds are implanted in and adjacent to the prostate gland. Planning of this treatment involves the determination of a Planning Target Volume (PTV), followed by defining the optimal number of seeds and needles and their coordinates for implantation. The two major planning tasks, i.e. PTV determination and seed definition, are associated with inter- and intra-expert variability. Moreover, since these two steps are performed in sequence, the variability is accumulated in the overall treatment plan. In this paper, we introduce a model based on a data fusion technique that enables joint determination of the PTV and the minimum Prescribed Isodose (mPD) map. The model captures the correlation between different information modalities consisting of transrectal ultrasound (TRUS) volumes, PTV and isodose contours. We take advantage of joint Independent Component Analysis (jICA) as a linear decomposition technique to obtain a set of joint components that optimally describe such correlation. We perform a component stability analysis to generate a model with stable parameters that predicts the PTV and isodose contours solely based on a new patient's TRUS volume. We propose a framework for both the modeling and prediction processes and evaluate it on a dataset of 60 brachytherapy treatment records. We show a PTV prediction error of 10.02 ± 4.5% and a V100 isodose overlap of 97 ± 3.55% with respect to the clinical gold standard.
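The joint decomposition at the heart of this model can be sketched as follows: the two modalities are concatenated feature-wise per subject, PCA-whitened, and unmixed with a symmetric FastICA iteration. This is a toy numpy implementation under simplifying assumptions (a tanh-nonlinearity FastICA standing in for the jICA estimator used in the paper); `joint_ica` and its arguments are illustrative names.

```python
import numpy as np

def joint_ica(X1, X2, n_components, n_iter=200, seed=0):
    """Toy joint ICA: stack two modalities feature-wise per subject,
    whiten, then run symmetric FastICA (tanh contrast)."""
    X = np.hstack([X1, X2])                     # (subjects, f1 + f2)
    X = X - X.mean(axis=0)
    # PCA whitening via SVD
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    K = (Vt[:n_components] / s[:n_components, None]) * np.sqrt(X.shape[0])
    Z = X @ K.T                                  # whitened (subjects, n_components)
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_components, n_components))
    for _ in range(n_iter):
        g = np.tanh(Z @ W.T)
        g_prime = 1.0 - g ** 2
        W_new = (g.T @ Z) / Z.shape[0] - np.diag(g_prime.mean(axis=0)) @ W
        # symmetric decorrelation: project onto the nearest orthogonal matrix
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt
    S = Z @ W.T                                  # joint sources (subjects, n_components)
    A = np.linalg.pinv(W @ K)                    # each column is a joint map over both modalities
    return S, A
```

Each column of `A` spans the concatenated feature space, so a single joint component links structure in the TRUS features to structure in the contour features, which is what allows prediction of one modality from the other.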
Brachytherapy, one of the treatment methods for prostate cancer, is performed by implanting radioactive seeds inside the gland. The standard of care for this treatment procedure is to acquire transrectal ultrasound images of the prostate, which are segmented in order to plan the appropriate seed placement. The segmentation process is usually performed either manually or semi-automatically and is associated with subjective errors because prostate visibility is limited in ultrasound images. The current segmentation process also limits the possibility of intra-operative delineation of the prostate to perform real-time dosimetry. In this paper, we propose a computationally inexpensive and fully automatic segmentation approach that takes advantage of previously segmented images to form a joint space of images and their segmentations. We utilize the joint Independent Component Analysis method to generate a model which is further employed to produce a probability map of the target segmentation. We evaluate this approach on the transrectal ultrasound volume images of 60 patients using a leave-one-out cross-validation approach. The results are compared with the manually segmented prostate contours that were used by clinicians to plan brachytherapy procedures. We show that the proposed approach is fast, with accuracy and precision comparable to those found in previous studies on TRUS segmentation.
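The leave-one-out evaluation described above can be sketched as follows, using the Dice similarity coefficient as an example overlap measure. The `segment_fn` callback and all names here are hypothetical; the trivial majority-vote baseline in the usage note merely stands in for the jICA probability-map model.

```python
import numpy as np

def dice(pred, ref):
    """Dice similarity coefficient between two binary masks."""
    inter = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * inter / denom if denom else 1.0

def leave_one_out(volumes, masks, segment_fn):
    """Hold out each case, build the model on the rest, and score
    the held-out prediction against its reference mask."""
    scores = []
    for i in range(len(volumes)):
        train_v = volumes[:i] + volumes[i + 1:]
        train_m = masks[:i] + masks[i + 1:]
        pred = segment_fn(volumes[i], train_v, train_m)
        scores.append(dice(pred, masks[i]))
    return scores
```

A placeholder segmenter that ignores the image and votes over the training masks, e.g. `lambda vol, tv, tm: np.mean(tm, axis=0) > 0.5`, is enough to exercise the loop; a real run would plug in the model-based probability map thresholded to a contour.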
In this paper, we present a registration pipeline to compensate for prostate motion and deformation during targeted freehand prostate biopsies. We perform 2D-3D registration by reconstructing a thin volume around the real-time 2D ultrasound imaging plane. Constrained Sum of Squared Differences (SSD) and gradient descent optimization are used to rigidly align the moving volume to the fixed thin volume. Subsequently, B-spline deformable registration is performed to compensate for remaining non-linear deformations. SSD and a zero-bounded Limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimizer are used to find the optimal B-spline parameters. Registration results are validated on five prostate biopsy patients. Initial experiments suggest thin-volume-to-volume registration to be more effective than slice-to-volume registration. Also, a consistent improvement of at least 2 mm in Target Registration Error (TRE) is achieved following the deformable registration.
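The rigid alignment step can be illustrated with a simplified stand-in: an exhaustive SSD search over integer in-plane translations. The paper's pipeline optimizes a full rigid transform by gradient descent; this toy version only recovers a 2D shift, and `register_translation` is an illustrative name.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared intensity differences between two images."""
    return float(((a - b) ** 2).sum())

def register_translation(fixed, moving, search=5):
    """Brute-force SSD minimization over integer translations
    (toy stand-in for gradient-descent rigid registration).
    np.roll wraps around at the borders, a simplification that is
    acceptable only for small shifts of a padded field of view."""
    best, best_cost = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            cost = ssd(fixed, cand)
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best, best_cost
```

In the full pipeline this rigid result would initialize the B-spline deformable stage, whose control-point displacements are then optimized under the same SSD metric.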
One of the commonly used treatment methods for early-stage prostate cancer is brachytherapy. The standard of care for planning this procedure is segmentation of contours from transrectal ultrasound (TRUS) images, which closely follow the prostate boundary. This process is currently performed either manually or using semi-automatic techniques. This paper introduces a fully automatic segmentation algorithm which uses a priori knowledge of contours in a reference data set of TRUS volumes. A non-parametric deformable registration method is employed to transform the atlas prostate contours to the target image coordinates. All atlas images are sorted based on their registration results, and the highest-ranked registrations are selected for decision fusion. A Simultaneous Truth and Performance Level Estimation (STAPLE) algorithm is utilized to fuse labels from registered atlases and produce a segmented target volume. In this experiment, 50 patient TRUS volumes are obtained and a leave-one-out study is reported. We also compare our results with a state-of-the-art semi-automatic prostate segmentation method that has been clinically used for planning prostate brachytherapy procedures, and we show comparable accuracy and precision within clinically acceptable runtime.
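The label-fusion step can be sketched with a simplified binary STAPLE EM iteration: alternate between estimating the posterior foreground probability of each voxel (E-step) and re-estimating each registered atlas's sensitivity and specificity (M-step). The sketch below holds the foreground prior fixed, a common simplification; `staple_binary` and its arguments are illustrative names, not the implementation used in the paper.

```python
import numpy as np

def staple_binary(D, n_iter=50, prior=None):
    """Simplified binary STAPLE. D is (raters, voxels) with entries in {0, 1}.
    Returns the per-voxel foreground posterior W and the estimated
    sensitivity p and specificity q of each rater (atlas)."""
    R, N = D.shape
    f = D.mean() if prior is None else prior     # fixed foreground prior
    p = np.full(R, 0.9)                           # initial sensitivities
    q = np.full(R, 0.9)                           # initial specificities
    for _ in range(n_iter):
        # E-step: posterior that each voxel is foreground
        a = f * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - f) * np.prod(np.where(D == 0, q[:, None], 1 - q[:, None]), axis=0)
        W = a / np.maximum(a + b, 1e-12)
        # M-step: re-estimate rater performance from the soft labels
        p = (D * W).sum(axis=1) / np.maximum(W.sum(), 1e-12)
        q = ((1 - D) * (1 - W)).sum(axis=1) / np.maximum((1 - W).sum(), 1e-12)
    return W, p, q
```

Thresholding `W` at 0.5 yields the fused segmentation; the estimated `p` and `q` down-weight atlases whose registered contours disagree with the consensus, which is the rationale for fusing only the highest-ranked registrations.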