PURPOSE: Prostate cancer is the second most common cancer diagnosed in men. The rate is disproportionately high among men in sub-Saharan Africa, where, unlike in North America and Western Europe, screening for prostate cancer has historically not been routine. As awareness of prostate health increases, more patients in this region are being referred for trans-rectal ultrasound (TRUS)-guided prostate biopsy, a diagnostic procedure that requires a strong understanding of prostate zonal anatomy. To aid in the instruction of this procedure, prostate biopsy training programs need to be implemented. Unfortunately, current TRUS-guided training tools are not readily reproducible in these West African countries. To address this challenge, we are developing an affordable, open-source training simulator for TRUS-guided prostate biopsy, for use in Senegal. In this paper, we present the implementation of the training simulator's virtual interface, highlighting the generation and evaluation of its critical training component: zonal anatomy overlaid on TRUS.
METHODS: For the simulator's dataset, we registered TRUS and MRI volumes to transfer the zonal segmentation from the MRI volumes onto the TRUS. After generating ten pairings of TRUS overlaid with zonal segmentation, we designed and implemented a virtual TRUS training system in open-source software. The objective of our simulator is to teach trainees to accurately identify the prostate's anatomical zones in TRUS. To confirm the system's usability for training zonal identification, we conducted a two-part survey on the quality of the zonal overlays with 7 urology experts. In the first part, they assessed the zonal overlay for visual correctness by rating 10 images from one patient's TRUS, with registered overlay, on a 5-point Likert scale. In the second part, they labelled 10 plain TRUS volumes with zonal anatomy, and their labels were compared to those of our overlay.
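Label-comparison steps like the one above are commonly quantified with an overlap metric such as the Dice coefficient. A minimal sketch in plain Python, with a hypothetical `dice_coefficient` helper and made-up mask data (not the study's implementation):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two binary masks (flat sequences of 0/1)."""
    if len(mask_a) != len(mask_b):
        raise ValueError("masks must have the same number of voxels")
    intersection = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    if total == 0:
        return 1.0  # both masks empty: vacuous perfect agreement
    return 2.0 * intersection / total

# Hypothetical flattened zone masks (1 = voxel inside the zone)
overlay_pz = [1, 1, 1, 0, 0, 1, 0, 0]
expert_pz = [1, 1, 0, 0, 0, 1, 1, 0]
print(dice_coefficient(overlay_pz, expert_pz))  # → 0.75
```

A score of 1.0 means perfect overlap between an expert's zone label and the overlay; values near 0 indicate the zone was placed in a different region.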
RESULTS: On average, experts rated the zonal overlay's visual accuracy at 4 out of 5. Furthermore, all 7 experts labelled the peripheral, anterior, and transitional zones in the same regions where we overlaid them, and 5 of 7 labelled the central zone in the same region as our overlay.
CONCLUSION: We created a prototype TRUS imaging simulator in open-source software. A vital training component, the zonal overlay, was generated using publicly accessible data and validated by expert urologists for prostate zone identification, providing a proof of concept.
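The survey analysis behind results like these can be sketched in plain Python. The helper names and ratings below are hypothetical illustrations, not the study's data or code:

```python
from statistics import mean, median

def summarize_likert(ratings):
    """Per-image mean/median for an experts-x-images Likert rating matrix."""
    per_image = list(zip(*ratings))  # transpose to images-x-experts
    return [{"mean": mean(img), "median": median(img)} for img in per_image]

def zone_agreement(expert_labels, overlay_label):
    """Count experts whose zone label matches the overlay's label."""
    return sum(1 for lbl in expert_labels if lbl == overlay_label)

# Hypothetical data: 3 experts rating 4 images on a 5-point scale
ratings = [[4, 5, 4, 3], [4, 4, 5, 4], [5, 4, 4, 4]]
summary = summarize_likert(ratings)
print([round(s["mean"], 2) for s in summary])

# Hypothetical peripheral-zone labels from 3 experts vs. the overlay
print(zone_agreement(["PZ", "PZ", "TZ"], "PZ"))  # → 2
```

The same pattern extends to 7 experts and 10 images: one mean per image for visual accuracy, and one agreement count per zone.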
PURPOSE: Virtual reality (VR) simulation is an effective training system for medical residents, allowing them to gain and improve surgical skills in a realistic environment while receiving feedback on their performance. Percutaneous nephrolithotomy is the most common surgical treatment for the removal of renal stones. We propose a workflow to generate 3D soft-tissue and bone models from computed tomography (CT) images, to be used and validated in a VR nephrolithotomy simulator.
METHODS: Venous-phase, delayed-phase, non-contrast, and full-body CT scans were registered and segmented to generate 3D models of the abdominal organs, skin, and bone. These models were decimated and re-meshed into low-polygon versions while maintaining anatomical accuracy. The models were integrated into a nephrolithotomy simulator with haptic feedback and scoring metrics. Urology surgical experts assessed the simulator and its validity through a questionnaire based on a 5-point Likert scale.
RESULTS: The workflow produced soft-tissue and bone models from patient CT scans, which were integrated into the simulator. Surgeon responses indicated level 3 and above for face validity and level 4 and above for all other aspects of medical simulation validity: content, construct, and criterion.
CONCLUSION: We designed an effective workflow to generate 3D models from CT scans using open-source and modelling software. The models' low polygon count allowed integration into a VR simulator for visualization and haptic feedback, while anatomical accuracy was maintained.
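The decimation step described above is usually done with library filters such as quadric decimation; as a sketch of the underlying idea, here is a simple vertex-clustering simplification in pure Python. The function name and the tiny test mesh are hypothetical, not the workflow's actual code:

```python
def decimate_by_clustering(vertices, faces, cell_size):
    """Simplify a triangle mesh by snapping vertices to a uniform grid and
    collapsing all vertices in each cell to their average position.
    vertices: list of (x, y, z); faces: list of (i, j, k) index triples."""
    clusters = {}  # grid cell -> vertex indices that fall inside it
    for idx, (x, y, z) in enumerate(vertices):
        cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        clusters.setdefault(cell, []).append(idx)

    new_vertices = []
    remap = {}  # old vertex index -> new (merged) vertex index
    for members in clusters.values():
        avg = tuple(
            sum(vertices[i][d] for i in members) / len(members) for d in range(3)
        )
        remap.update({i: len(new_vertices) for i in members})
        new_vertices.append(avg)

    new_faces, seen = [], set()
    for i, j, k in faces:
        tri = (remap[i], remap[j], remap[k])
        if len(set(tri)) < 3:
            continue  # degenerate: two corners collapsed into one vertex
        key = tuple(sorted(tri))
        if key in seen:
            continue  # duplicate face produced by the collapse
        seen.add(key)
        new_faces.append(tri)
    return new_vertices, new_faces

# Hypothetical 4-vertex mesh: the two nearby vertices merge at cell_size=0.5
verts = [(0, 0, 0), (0.1, 0, 0), (1, 0, 0), (0, 1, 0)]
faces = [(0, 1, 2), (0, 2, 3), (1, 2, 3)]
new_verts, new_faces = decimate_by_clustering(verts, faces, 0.5)
print(len(new_verts), len(new_faces))  # → 3 1
```

A larger `cell_size` merges more vertices and yields a coarser mesh, trading anatomical detail for the low polygon counts that real-time VR rendering and haptics require.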