Towards an advanced virtual ultrasound-guided renal biopsy trainer
12 March 2019
Ultrasound (US)-guided renal biopsy is a critically important tool in the evaluation and management of non-malignant renal pathologies with diagnostic and prognostic significance. It requires good technique and skill to safely and consistently obtain high-yield biopsy samples for tissue analysis. This project aims to develop a virtual trainer that helps clinicians improve procedural competence in real-time ultrasound-guided renal biopsy. This paper presents a cost-effective, high-fidelity trainer built from low-cost hardware components and open-source visualization and interactive simulation libraries: the interactive medical simulation toolkit (iMSTK) and 3D Slicer. A physical mannequin simulates the tactile feedback that trainees experience while scanning a real patient and gives them spatial awareness of the US scanning plane with respect to the patient's anatomy. The ultrasound probe and biopsy needle were modeled on commonly used clinical tools and were instrumented to communicate with the simulator. 3D Slicer visualizes an image sliced from a pre-acquired 3-D ultrasound volume based on the probe's location, together with a realistic rendering of the needle. The simulation engine in iMSTK models the interaction between the needle and the virtual tissue, generating visual deformations of the tissue and tactile forces that are transmitted to the needle held by the user. Initial testing has shown promising results with respect to the quality of the simulated images and system responsiveness. Further evaluation by clinicians is planned for the next stage.
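The simulated US image described above is obtained by resampling a pre-acquired 3-D volume along the plane defined by the tracked probe pose. The following is a minimal sketch of such oblique-plane resampling in NumPy/SciPy, not the trainer's actual 3D Slicer pipeline; the function name, axis conventions, and parameters are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def slice_from_volume(volume, origin, x_axis, y_axis, out_shape=(128, 128)):
    """Resample a 2-D image from a 3-D volume along an arbitrary plane.

    volume : 3-D array of voxel intensities (axis order z, y, x here).
    origin : voxel coordinates of the slice's top-left corner.
    x_axis : step vector along the slice columns (in voxel units).
    y_axis : step vector along the slice rows (in voxel units).
    """
    rows, cols = out_shape
    # Build the grid of sample points: origin + i * y_axis + j * x_axis.
    jj, ii = np.meshgrid(np.arange(cols), np.arange(rows))
    pts = (np.asarray(origin, float)[:, None, None]
           + np.asarray(y_axis, float)[:, None, None] * ii
           + np.asarray(x_axis, float)[:, None, None] * jj)
    # Trilinear interpolation; points outside the volume map to 0.
    return map_coordinates(volume, pts.reshape(3, -1), order=1,
                           mode="constant", cval=0.0).reshape(out_shape)

# Example: extract the bright plane embedded in a synthetic volume.
vol = np.zeros((64, 64, 64))
vol[32, :, :] = 1.0  # bright axial plane at z-index 32
img = slice_from_volume(vol, origin=(32, 0, 0),
                        x_axis=(0, 0, 1), y_axis=(0, 1, 0),
                        out_shape=(64, 64))
```

In the trainer itself the probe's tracked position and orientation would supply `origin`, `x_axis`, and `y_axis` each frame, so the displayed image follows the scanning plane in real time.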