The superior soft-tissue contrast of MRI allows targets and organs-at-risk (OARs) to be contoured accurately and reliably in head-and-neck (HN) cancer radiotherapy. However, manual delineation of OARs is labor-intensive and subjective. We propose a mask region-based convolutional neural network (R-CNN) for fully automated OAR delineation in HN MRI. A deep attention feature pyramid network (DAFPN) was used as the backbone to extract pyramid features, and the location and size of each region-of-interest (ROI) were estimated by feeding the features extracted from the DAFPN into a region proposal network (RPN). The final OAR contours were predicted by cropping the features within these ROIs and feeding them into a subnetwork that performs segmentation within each ROI. A retrospective study was performed on 45 HN cancer patients who underwent radiation therapy. Their MR images and manual contours were collected for training and testing the proposed method, which was evaluated with five-fold cross-validation. The mean Dice similarity coefficients of the brain stem, esophagus, larynx, mandible, optic chiasm, left optic nerve, right optic nerve, oral cavity, left parotid, right parotid, pharynx, and spinal cord were 0.88, 0.52, 0.88, 0.77, 0.56, 0.59, 0.57, 0.90, 0.81, 0.81, 0.76, and 0.69, respectively. After training, all OARs can be segmented within 10 seconds. Accurate HN OAR delineation enables further development of an MRI-only radiotherapy workflow for HN cancer.
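For reference, the Dice similarity coefficient reported above measures the overlap between a predicted mask A and a manual contour B as DSC = 2|A ∩ B| / (|A| + |B|). A minimal sketch of this metric on toy binary masks (a generic illustration, not the paper's implementation or data):

```python
import numpy as np

def dice_coefficient(pred, gt):
    """Dice similarity coefficient between two binary masks:
    DSC = 2 * |A & B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # convention: two empty masks agree perfectly
    return 2.0 * np.logical_and(pred, gt).sum() / denom

# Toy 4x4 masks (hypothetical, for illustration only):
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt = np.array([[1, 1, 0, 0],
               [1, 0, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
print(dice_coefficient(pred, gt))  # 2*3 / (4+3) ≈ 0.857
```

A DSC of 1.0 indicates perfect overlap and 0.0 indicates none, which is why scores near 0.9 (e.g., oral cavity) indicate close agreement with manual contours while small, thin structures such as the optic nerves score lower.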