9 March 2018 Pseudo dual energy CT imaging using deep learning-based framework: basic material estimation
Abstract
Dual energy computed tomography (DECT) usually scans the object twice with different energy spectra, and can then obtain two basic material decompositions by directly performing signal decomposition. In general, one is the water-equivalent fraction and the other is the bone-equivalent fraction. It is noted that material decomposition often depends on two or more different energy spectra. In this study, we present a deep learning-based framework to obtain basic material images directly from single energy CT images via cascaded deep convolutional neural networks (CD-ConvNet). We denote this imaging procedure as pseudo DECT imaging. The CD-ConvNet is designed to learn the non-linear mapping from the measured energy-specific CT images to the desired basic material decomposition images. Specifically, the output of each earlier convolutional neural network (ConvNet) in the CD-ConvNet is used as part of the input to the following ConvNet to produce high-quality material decomposition images. Clinical patient data were used to validate and evaluate the performance of the presented CD-ConvNet. Experimental results demonstrate that the presented CD-ConvNet can yield qualitatively and quantitatively accurate results when compared against the gold standard. We conclude that the presented CD-ConvNet can help to improve the research utility of CT in quantitative imaging, especially in single energy CT.
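The cascading idea described above (each later network receiving the measured single-energy CT image together with the previous stage's output) can be sketched as follows. This is a minimal toy illustration, not the paper's architecture: the stage function, kernel choices, and the way the previous output is combined with the input are all illustrative assumptions.

```python
import numpy as np

def conv_stage(x, kernel):
    """A toy stand-in for one ConvNet stage: a single zero-padded
    2-D convolution (hypothetical; the real stages are deep CNNs)."""
    pad = kernel.shape[0] // 2
    xp = np.pad(x, pad)
    k = kernel.shape[0]
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + k, j:j + k] * kernel)
    return out

def cascade(ct_image, kernels):
    """Cascade sketch: every stage sees both the measured CT image and
    the previous stage's output (averaged here as a simple stand-in
    for channel concatenation in a real CD-ConvNet)."""
    prev = ct_image
    for kern in kernels:
        stage_in = 0.5 * (ct_image + prev)
        prev = conv_stage(stage_in, kern)
    return prev  # final basic-material estimate

rng = np.random.default_rng(0)
ct = rng.random((8, 8))            # toy single-energy CT "image"
smooth = np.ones((3, 3)) / 9.0     # illustrative kernel
material = cascade(ct, [smooth, smooth])
print(material.shape)  # (8, 8)
```

The key point the sketch preserves is that the measured energy-specific image is reinjected at every stage, so later stages can refine the earlier material estimate rather than work from it alone.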
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yuting Liao, Yongbo Wang, Sui Li, Ji He, Dong Zeng, Zhaoying Bian, Jianhua Ma, "Pseudo dual energy CT imaging using deep learning-based framework: basic material estimation", Proc. SPIE 10573, Medical Imaging 2018: Physics of Medical Imaging, 105734N (9 March 2018); doi: 10.1117/12.2293237; https://doi.org/10.1117/12.2293237