Ovarian volume assessment is the measurement of ovary size during an ultrasound (US) examination in order to estimate the ovarian reserve. Since the ovarian reserve is used to calculate a woman's reproductive age and is also a diagnostic criterion for polycystic ovary syndrome (PCOS), it must be measured accurately. Furthermore, 3D rendering of the ovary is clinically significant for assessing ovarian anomalies (e.g., of the ovarian surface epithelium). When the slice spacing of the US volume is large along one direction, reducing that spacing would greatly help both accurate measurement of the ovarian volume and surface assessment. In this paper, we address this problem by developing a deep learning method for super-resolving 3D US data along the axial direction. On the collected dataset, our method achieves high PSNR (Peak Signal-to-Noise Ratio) and SSIM (Structural Similarity Index) values and yields a 54% improvement in ovarian volume computation accuracy. Furthermore, our solution improves the quality of the 3D rendering of the ovary and reduces the problem of fused follicles in segmentation. These results demonstrate the viability of our approach for clinical diagnostic assessment.
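The abstract does not specify how the reported metrics were computed; below is a minimal sketch of how PSNR and SSIM are conventionally evaluated on a super-resolved slice against its high-resolution reference. The arrays `sr_slice` and `hr_slice` are hypothetical placeholders, and the use of scikit-image for SSIM is an assumption, not the authors' pipeline.

```python
# Hedged sketch: conventional PSNR/SSIM evaluation of a super-resolved
# US slice against a reference. Data here is random placeholder content.
import numpy as np
from skimage.metrics import structural_similarity

def psnr(sr, hr, data_range=1.0):
    """Peak Signal-to-Noise Ratio between super-resolved and reference slices."""
    mse = np.mean((sr.astype(np.float64) - hr.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((data_range ** 2) / mse)

rng = np.random.default_rng(0)
hr_slice = rng.random((256, 256))                                   # reference
sr_slice = np.clip(hr_slice + 0.01 * rng.standard_normal((256, 256)), 0, 1)
print("PSNR:", psnr(sr_slice, hr_slice))
print("SSIM:", structural_similarity(sr_slice, hr_slice, data_range=1.0))
```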
Fusion of pre-operative Magnetic Resonance Imaging (MRI) and Trans-Rectal Ultrasound (TRUS) guided biopsy (fusion biopsy) has proven more effective than cognitive biopsy for the detection of prostate cancer. Detecting the biopsy needle used during the ultrasound procedure has multiple applications, such as reporting, repeat-biopsy planning, and therapy planning. Earlier methods for this problem used only image processing techniques such as the Hough transform or graph cuts. These techniques lack robustness because a purely image-based solution cannot handle the large variability in the data or the problem of the needle going out of plane. Recent deep learning (DL) based solutions for needle detection have high latency and do not exploit the temporal information present in TRUS imaging. In this paper, we propose a method to automatically detect the short-lived needle trigger events and the needle position using temporal context incorporated into a DL model termed the Samsung Multi-Decoder Network (S-MDNet). The proposed solution has been tested on 8 patients and yields high sensitivity (96%) and specificity (95%) for the detection of the needle trigger event.
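The published S-MDNet architecture is not detailed in this abstract; the PyTorch sketch below only illustrates the general idea it names, i.e. feeding a short temporal window of TRUS frames to a shared encoder with multiple decoder heads (one for the trigger event, one for needle position). All layer sizes, names, and shapes are illustrative assumptions.

```python
# Hedged sketch (NOT the published S-MDNet): temporal context via stacked
# consecutive TRUS frames, with two task-specific decoder heads.
import torch
import torch.nn as nn

class TemporalNeedleNet(nn.Module):
    def __init__(self, n_frames: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(n_frames, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Head 1: frame-level needle-trigger probability (logit).
        self.trigger_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1)
        )
        # Head 2: coarse needle-position heatmap.
        self.position_head = nn.Conv2d(64, 1, 1)

    def forward(self, frames):  # frames: (B, n_frames, H, W)
        feats = self.encoder(frames)
        return self.trigger_head(feats), self.position_head(feats)

window = torch.randn(1, 4, 256, 256)  # 4 consecutive TRUS frames
trigger_logit, needle_map = TemporalNeedleNet()(window)
```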
Fusion biopsy reduces false-negative rates in prostate cancer detection compared to systematic biopsy. However, the accuracy of biopsy sampling depends on the quality of alignment between the pre-operative 3D MR and intra-operative 2D US images. During a live biopsy, the US-MR alignment may be disturbed by rigid motion of the prostate or the patient. Further, the prostate gland deforms under probe pressure, which adds error to biopsy sampling. In this paper, we describe a deep learning based method for real-time 2D-3D multimodal registration that corrects for both rigid and deformable errors. Our method does not require an intermediate 3D US volume and works in real time, with an average runtime of 112 ms for both rigid and deformable corrections. On data from 12 patients, our method reduces the mean target registration error (TRE) from 8.890±5.106 mm to 2.988±1.513 mm, comparable in accuracy to other state-of-the-art methods.
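As a reference point for the TRE numbers above, here is a minimal sketch of how target registration error is conventionally computed: the mean Euclidean distance between fixed-image landmarks and moving-image landmarks mapped through the estimated transform. The rigid transform `(R, t)` and the landmark coordinates are placeholder assumptions, not the authors' data.

```python
# Hedged sketch: target registration error (TRE) over paired landmarks,
# using a rigid transform (rotation R, translation t) as an example.
import numpy as np

def tre(fixed_pts, moving_pts, R, t):
    """Mean distance (mm) between fixed landmarks and transformed moving ones."""
    mapped = moving_pts @ R.T + t
    return float(np.mean(np.linalg.norm(mapped - fixed_pts, axis=1)))

fixed = np.array([[10.0, 12.0, 8.0], [20.0, 5.0, 14.0]])   # placeholder mm
moving = fixed + np.array([2.0, -1.0, 0.5])                # simulated offset
print(tre(fixed, moving, np.eye(3), np.zeros(3)))          # error pre-correction
```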
Follicle quantification refers to the computation of the number and size of follicles in 3D ultrasound volumes of the ovary. This is one of the key factors in determining hormonal dosage during female infertility treatments. In this paper, we propose an automated algorithm to detect and segment follicles in 3D ultrasound volumes of the ovary for quantification. In a first-of-its-kind attempt, we employ noise-robust phase symmetry feature maps as a likelihood function to perform mean-shift based follicle center detection. A max-flow algorithm is used for segmentation, and a gray-weighted distance transform is employed for post-processing the results. We have obtained state-of-the-art results with a true positive detection rate of >90% on 26 3D volumes containing 323 follicles.
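The abstract pairs a phase symmetry likelihood with mean-shift mode seeking; the sketch below shows that coupling in a simplified 2D form, where the likelihood map weights the mean-shift kernel. The Gaussian blob standing in for a real phase symmetry map, and all parameters, are illustrative assumptions (a real pipeline would operate in 3D with many seed points).

```python
# Hedged sketch: mean-shift center seeking on a likelihood map, in the
# spirit of using a phase-symmetry map as the weight function.
import numpy as np

def mean_shift_2d(likelihood, seed, bandwidth=5.0, iters=50, tol=1e-3):
    """Shift `seed` toward a local mode of a 2D `likelihood` array."""
    h, w = likelihood.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pos = np.asarray(seed, dtype=float)
    for _ in range(iters):
        d2 = (ys - pos[0]) ** 2 + (xs - pos[1]) ** 2
        kernel = np.exp(-d2 / (2 * bandwidth ** 2)) * likelihood
        total = kernel.sum()
        if total == 0:
            break
        new_pos = np.array([(kernel * ys).sum(), (kernel * xs).sum()]) / total
        if np.linalg.norm(new_pos - pos) < tol:
            return new_pos
        pos = new_pos
    return pos

# Toy likelihood: one Gaussian blob standing in for a follicle response.
yy, xx = np.mgrid[0:64, 0:64]
blob = np.exp(-((yy - 40) ** 2 + (xx - 25) ** 2) / 50.0)
print(mean_shift_2d(blob, seed=(30, 30)))  # converges near (40, 25)
```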