We present a new wavelet-based strategy for autonomous feature extraction and segmentation of cardiac structures in dynamic ultrasound images. Image sequences subjected to a multidimensional (2D plus time) wavelet transform yield a large number of individual subbands, each coding for partial structural and motion information of the ultrasound sequence. We exploited this fact to create a strategy for autonomous analysis of cardiac ultrasound that builds on shape- and motion-specific wavelet subband filters. Subbands were selected automatically based on subband statistics. Such a collection of predefined subbands corresponds to the so-called footprint of the target structure and can be used as a multidimensional multiscale filter to detect and localize the target structure in the original ultrasound sequence. Unequivocal localization is then achieved by the algorithm with a peak-finding step, which allows the findings to be compared with a reference standard. Image segmentation is then possible using standard region-growing operations. To test the feasibility of this multiscale footprint algorithm, we tried to localize, enhance, and segment the mitral valve autonomously in 182 unselected clinical cardiac ultrasound sequences. Correct autonomous localization was achieved in 165 of 182 reconstructed ultrasound sequences, using an experienced echocardiographer as the reference. This corresponds to 91% accuracy of the proposed method in unselected clinical data. Thus, multidimensional multiscale wavelet footprints allow successful autonomous detection and segmentation of the mitral valve with good accuracy in dynamic cardiac ultrasound sequences, which are otherwise difficult to analyse because of their high noise level.
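The subband decomposition and statistics-based selection described above can be sketched in simplified form with a one-level separable Haar transform over (time, y, x); this is only an illustration of the idea, not the paper's actual multiscale footprint construction, and all function names are ours:

```python
import numpy as np

def haar_split(x, axis):
    """One-level Haar split along `axis`: returns (low, high) subbands."""
    n = x.shape[axis] // 2 * 2  # truncate to even length
    x = np.take(x, range(n), axis=axis)
    a = np.take(x, range(0, n, 2), axis=axis)
    b = np.take(x, range(1, n, 2), axis=axis)
    return (a + b) / np.sqrt(2), (a - b) / np.sqrt(2)

def subbands_2d_t(seq):
    """One-level separable Haar transform of a (t, y, x) sequence.
    Returns all 8 subbands keyed by 'L'/'H' along (t, y, x)."""
    out = {}
    for kt, st in zip('LH', haar_split(seq, 0)):
        for ky, sy in zip('LH', haar_split(st, 1)):
            for kx, sx in zip('LH', haar_split(sy, 2)):
                out[kt + ky + kx] = sx
    return out

def footprint(seq, k=3):
    """Select the k highest-energy detail subbands (excluding the
    all-lowpass band) as a crude 'footprint' of the moving structure."""
    bands = subbands_2d_t(seq)
    detail = {key: b for key, b in bands.items() if key != 'LLL'}
    energy = {key: float(np.sum(b ** 2)) for key, b in detail.items()}
    return sorted(energy, key=energy.get, reverse=True)[:k]
```

Because the normalized Haar transform is orthonormal, the subband energies partition the total signal energy, which makes a simple energy statistic a natural (if crude) selection criterion.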
We present a new framework to estimate and visualize heart motion from echocardiograms. For velocity estimation, we have developed a novel multiresolution optical-flow algorithm. To account for typical heart motions such as contraction/expansion and shear, we use a local affine model for the velocity in space and time. The motion parameters are estimated in the least-squares sense inside a sliding spatio-temporal window.
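The least-squares estimation of a local affine motion model can be illustrated as follows. This is a minimal single-window sketch under the brightness-constancy assumption (Ix*u + Iy*v + It = 0), without the multiresolution machinery of the actual framework; the function name and interface are ours:

```python
import numpy as np

def affine_flow(Ix, Iy, It):
    """Least-squares affine motion estimate inside one window.
    Ix, Iy, It: image gradients in the window (2D arrays, same shape).
    Returns (A, b) such that v(x) = A @ x + b, with coordinates
    taken relative to the window centre."""
    h, w = Ix.shape
    y, x = np.mgrid[0:h, 0:w]
    x = (x - w / 2).ravel()
    y = (y - h / 2).ravel()
    ix, iy, it = Ix.ravel(), Iy.ravel(), It.ravel()
    # Brightness constancy Ix*u + Iy*v = -It with
    # u = a11*x + a12*y + b1, v = a21*x + a22*y + b2:
    M = np.column_stack([ix * x, ix * y, ix, iy * x, iy * y, iy])
    p, *_ = np.linalg.lstsq(M, -it, rcond=None)
    A = np.array([[p[0], p[1]], [p[3], p[4]]])
    b = np.array([p[2], p[5]])
    return A, b
```

Each pixel in the window contributes one row to the overdetermined system, so six affine parameters are recovered from many constraints, which is what makes the estimate robust to noise in individual gradients.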
The estimated velocity field is used to track a region of interest which is represented by spline curves. In each frame, a set of sample points on the curves is displaced according to the estimated motion field. The contour in the subsequent frame is obtained by a least-squares spline fit to the displaced sample points. This ensures robustness of the contour tracking. From the estimated velocity, we compute a radial velocity field with respect to a reference point. Inside the time-varying region of interest, the radial velocity is color-coded and superimposed on the original image sequence in a semi-transparent fashion. In contrast to conventional Tissue Doppler methods, this approach is independent of the incident angle of the ultrasound beam.
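The radial-velocity projection with respect to a reference point is straightforward to sketch; this is a generic implementation of that projection (names and grid convention are ours, not taken from the paper):

```python
import numpy as np

def radial_velocity(vx, vy, ref):
    """Project a 2D velocity field onto the radial direction w.r.t. `ref`.
    vx, vy: velocity components on an (h, w) grid; ref = (x0, y0).
    Positive values indicate motion away from the reference point."""
    h, w = vx.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    rx, ry = x - ref[0], y - ref[1]
    r = np.hypot(rx, ry)
    r[r == 0] = 1.0  # avoid division by zero at the reference point
    return (vx * rx + vy * ry) / r
```

Because the projection uses only the estimated velocity field and a chosen reference point, it does not depend on the ultrasound beam direction, which is the key contrast with angle-dependent Tissue Doppler measurements.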
Together, the motion analysis and visualization provide an objective and robust method for detecting and quantifying myocardial dysfunction. Promising results are obtained from both synthetic and clinical echocardiographic sequences.