Direct three-dimensional segmentation of prostate glands with nnU-Net
Abstract

Significance

In recent years, we and others have developed non-destructive methods to obtain three-dimensional (3D) pathology datasets of clinical biopsies and surgical specimens. For prostate cancer risk stratification (prognostication), standard-of-care Gleason grading is based on examining the morphology of prostate glands in thin 2D sections. This motivates us to segment prostate glands in our 3D pathology datasets to enable computational analysis of 3D glandular features that could offer improved prognostic performance.

Aim

To facilitate prostate cancer risk assessment, we developed a computationally efficient and accurate deep learning model for 3D gland segmentation based on open-top light-sheet microscopy datasets of human prostate biopsies stained with a fluorescent analog of hematoxylin and eosin (H&E).

Approach

For 3D gland segmentation based on our H&E-analog 3D pathology datasets, we previously developed a hybrid deep learning and computer vision-based pipeline, called image translation-assisted segmentation in 3D (ITAS3D), which required a complex two-stage procedure and tedious manual parameter optimization. To simplify this procedure, we use the 3D gland-segmentation masks previously generated by ITAS3D as training data for a direct, end-to-end deep learning segmentation model, nnU-Net. The inputs to this model are 3D pathology datasets of prostate biopsies rapidly stained with an inexpensive fluorescent analog of H&E, and the outputs are 3D semantic segmentation masks of the gland epithelium, gland lumen, and surrounding stromal compartments within the tissue.
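The segmentation masks described above are 3D label volumes, which downstream analysis reduces to quantitative glandular features. A minimal sketch of that reduction step, not taken from the paper: the class indices and voxel size below are assumed purely for illustration, and the actual encoding used by the trained model may differ.

```python
import numpy as np

# Hypothetical label encoding for a 3D semantic segmentation mask
# (assumed for illustration; the real class indices may differ):
STROMA, EPITHELIUM, LUMEN = 0, 1, 2

def compartment_volumes(mask: np.ndarray, voxel_volume_um3: float) -> dict:
    """Compute the physical volume of each tissue compartment from a
    3D label mask (z, y, x) of integer class indices."""
    return {
        name: int(np.count_nonzero(mask == label)) * voxel_volume_um3
        for name, label in [
            ("stroma", STROMA),
            ("epithelium", EPITHELIUM),
            ("lumen", LUMEN),
        ]
    }

# Toy example: an 8-voxel volume with all three compartments.
mask = np.zeros((2, 2, 2), dtype=np.uint8)
mask[0] = EPITHELIUM      # 4 epithelium voxels
mask[1, 0] = LUMEN        # 2 lumen voxels
vols = compartment_volumes(mask, voxel_volume_um3=1.0)
```

Per-compartment volumes (and ratios such as lumen-to-epithelium) are examples of the kinds of 3D glandular features that such masks make available for prognostic analysis.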

Results

nnU-Net achieves high accuracy in 3D gland segmentation even with limited training data. Moreover, compared with the previous ITAS3D pipeline, nnU-Net is simpler and faster to operate, and it maintains good accuracy even with lower-resolution inputs.

Conclusions

Our trained deep learning-based 3D segmentation model will facilitate future studies to demonstrate the value of computational 3D pathology for guiding critical treatment decisions for patients with prostate cancer.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 International License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Rui Wang, Sarah S. L. Chow, Robert B. Serafin, Weisi Xie, Qinghua Han, Elena Baraznenok, Lydia Lan, Kevin W. Bishop, and Jonathan T. C. Liu "Direct three-dimensional segmentation of prostate glands with nnU-Net," Journal of Biomedical Optics 29(3), 036001 (1 March 2024). https://doi.org/10.1117/1.JBO.29.3.036001
Received: 1 December 2023; Accepted: 9 February 2024; Published: 1 March 2024
KEYWORDS: Image segmentation; 3D modeling; Education and training; 3D image processing; Prostate; Data modeling; Biopsy
