Concerns over the risks of radiation dose from diagnostic CT have motivated the use of low dose CT (LdCT). However, due to the extremely low X-ray photon statistics in LdCT, the reconstruction problem is ill-posed and noise-contaminated. Conventional Compressed Sensing (CS) methods have been investigated to enhance the signal-to-noise ratio of LdCT, at the cost of image resolution and low-contrast object visibility. In this work, we adapted a flexible, iterative reconstruction framework, termed Plug-and-Play (PnP) alternating direction method of multipliers (ADMM), that incorporates state-of-the-art denoising algorithms into model-based image reconstruction. The PnP ADMM framework combines a least-squares data fidelity term with a regularization term for image smoothness and is solved through the ADMM, with an off-the-shelf image denoiser, the Block-Matching 3D-transform shrinkage (BM3D) filter, plugged in to substitute for an ADMM module. PnP ADMM was evaluated on low dose scans of the ACR 464 phantom and two lung screening data sets and was compared with Filtered Back Projection (FBP), Total Variation (TV), the BM3D post-processing method, and the BM3D regularization method. The proposed framework distinguished the line pairs at 9 lp/cm resolution on the ACR phantom and the fissure line in the left lung, resolving the same or better image detail than FBP reconstruction of higher-dose scans with up to 18 times less dose. Compared with conventional iterative reconstruction methods yielding comparable image noise, the proposed method is significantly better at recovering image details and improving low-contrast conspicuity.
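The PnP ADMM iteration described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a small dense matrix in place of a real CT system matrix, a direct solve for the data-fidelity step, and accepts any denoiser callable where the paper plugs in BM3D.

```python
import numpy as np

def pnp_admm(A, y, denoise, rho=1.0, n_iters=50):
    """Minimal Plug-and-Play ADMM sketch for a least-squares fidelity term.

    A       : system matrix (dense ndarray here, for illustration only)
    y       : measured sinogram data
    denoise : any off-the-shelf denoiser standing in for the prior step
              (BM3D in the paper; a generic callable here)
    """
    n = A.shape[1]
    x = np.zeros(n)          # image estimate
    v = np.zeros(n)          # denoised auxiliary variable
    u = np.zeros(n)          # scaled dual variable
    # Precompute the regularized normal-equation matrix for the x-update
    H = A.T @ A + rho * np.eye(n)
    for _ in range(n_iters):
        # Data-fidelity step: argmin_x ||Ax - y||^2 + (rho/2)||x - (v - u)||^2
        x = np.linalg.solve(H, A.T @ y + rho * (v - u))
        # Prior step: the plugged-in denoiser replaces the proximal operator
        v = denoise(x + u)
        # Dual update
        u = u + x - v
    return x
```

The key design point is the second step: because the denoiser only needs to map a noisy image to a cleaner one, any state-of-the-art filter can be substituted without deriving its proximal operator.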
Iterative coordinate descent (ICD) is an optimization strategy for iterative reconstruction that is sometimes considered incompatible with parallel compute architectures such as graphics processing units (GPUs). We present a series of modifications that render ICD compatible with GPUs and demonstrate the code on a diagnostic, helical CT dataset. Our reference code is an open-source package, FreeCT ICD, which requires several hours for convergence. Three modifications are used. First, as with our reference code FreeCT ICD, the reconstruction is performed on a rotating coordinate grid, enabling the use of a stored system matrix. Second, every other voxel in the z-direction is updated simultaneously, and the sinogram data is shuffled to coalesce memory access. This increases the parallelism available to the GPU. Third, NS voxels in the xy-plane are updated simultaneously. This introduces possible crosstalk between updated voxels, but because the interaction between non-adjacent voxels is small, small values of NS still converge effectively. We find NS = 16 enables faster reconstruction via greater parallelism, and NS = 256 remains stable but provides no additional computational benefit. When tested on a pediatric dataset of size 736x16x14000 reconstructed to a matrix size of 512x512x128 on a single GPU, our implementation of ICD can converge within 10 HU RMS in less than 5 minutes. This suggests that ICD could be competitive with simultaneous update algorithms on modern, parallel compute architectures.
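For readers unfamiliar with ICD, the core sequential update can be sketched as below. This is a simplified, unregularized least-squares version with a small dense matrix, not the FreeCT ICD code: it shows why ICD is naturally serial (each voxel update depends on the residual left by the previous one), which is exactly the property the GPU modifications above work around by updating weakly interacting voxels together.

```python
import numpy as np

def icd_least_squares(A, y, n_sweeps=100):
    """Sketch of sequential ICD for min_x ||Ax - y||^2 (no regularizer).

    One voxel (coordinate) is updated at a time, keeping the residual
    r = y - Ax consistent after every update.
    """
    n = A.shape[1]
    x = np.zeros(n)
    r = y.copy()                      # residual for the current estimate
    col_norms = (A ** 2).sum(axis=0)  # a_j^T a_j, precomputable like a stored system matrix
    for _ in range(n_sweeps):
        for j in range(n):
            # Exact 1-D minimization along coordinate j
            delta = A[:, j] @ r / col_norms[j]
            x[j] += delta
            r -= delta * A[:, j]      # keep the residual current
    return x
```

In the GPU strategy described above, groups of voxels whose system-matrix columns barely overlap (non-adjacent voxels, alternating z-slices) compute their `delta` values in parallel, accepting a small, controlled amount of crosstalk in the shared residual.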
Translation of radiomics into clinical practice requires confidence in its interpretations. This may be obtained via understanding and overcoming the limitations in current radiomic approaches. Currently there is a lack of standardization in radiomic feature extraction. In this study we examined several factors that are potential sources of inconsistency in characterizing lung nodules: 1) different choices of parameters and algorithms in feature calculation, 2) two CT image dose levels, and 3) different CT reconstruction algorithms (WFBP, denoised WFBP, and iterative). We investigated the effect of varying these factors on entropy textural features of lung nodules. CT images of 19 lung nodules from our lung cancer screening program were identified by a CAD tool, which also provided contours. The radiomic features were extracted by calculating 36 GLCM-based and 4 histogram-based entropy features, in addition to 2 intensity-based features. A robustness index was calculated across different image acquisition parameters to illustrate the reproducibility of the features. Most GLCM-based and all histogram-based entropy features were robust across the two CT image dose levels. Denoising of images slightly improved the robustness of some entropy features at WFBP. Iterative reconstruction improved robustness in fewer cases and caused more variation in entropy feature values and their robustness. Across different choices of parameters and algorithms, texture features showed a wide range of variation, as much as 75% for individual nodules. The results indicate the need for harmonization of feature calculations and identification of optimal parameters and algorithms in a radiomics study.
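The parameter sensitivity described above is easy to see in a minimal GLCM entropy calculation. The sketch below is illustrative only, assuming one common definition of GLCM entropy; the `levels` (gray-level quantization) and `offset` (pixel-pair geometry) arguments are exactly the kinds of choices that vary between implementations and change the resulting feature value.

```python
import numpy as np

def glcm_entropy(img, levels=8, offset=(0, 1)):
    """Illustrative GLCM entropy for a 2-D ROI (one distance/angle choice).

    Different choices of `levels` and `offset` yield different feature
    values, which is the source of inconsistency discussed above.
    """
    # Quantize intensities into `levels` gray-level bins
    q = np.floor((img - img.min()) / (np.ptp(img) + 1e-12) * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    dr, dc = offset
    glcm = np.zeros((levels, levels))
    rows, cols = q.shape
    # Accumulate co-occurrence counts for the chosen pixel-pair offset
    for r in range(max(0, -dr), min(rows, rows - dr)):
        for c in range(max(0, -dc), min(cols, cols - dc)):
            glcm[q[r, c], q[r + dr, c + dc]] += 1
    p = glcm / glcm.sum()
    nz = p[p > 0]
    return -(nz * np.log2(nz)).sum()
```

Running this with, say, `levels=8` versus `levels=32`, or a different offset angle, on the same nodule ROI produces different entropy values, illustrating why harmonized parameter choices are needed before features can be compared across studies.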
Quantitative imaging in lung cancer CT seeks to characterize nodules through quantitative features, usually extracted from a region of interest delineating the nodule. The segmentation, however, can vary depending on the segmentation approach and image quality, which can affect the extracted feature values. In this study, we use a fully-automated nodule segmentation method, avoiding reader-influenced inconsistencies, to explore the effects of varied dose levels and reconstruction parameters on segmentation.
Raw projection CT images from a low-dose screening patient cohort (N=59) were reconstructed at multiple dose levels (100%, 50%, 25%, 10%), two slice thicknesses (1.0mm, 0.6mm), and a medium kernel. Fully-automated nodule detection and segmentation was then applied, from which 12 nodules were selected. Dice similarity coefficient (DSC) was used to assess the similarity of the segmentation ROIs of the same nodule across different reconstruction and dose conditions.
Nodules at 1.0mm slice thickness and dose levels of 25% and 50% resulted in DSC values greater than 0.85 when compared to 100% dose, with lower dose leading to a lower average and wider spread of DSC values. At 0.6mm, the increased bias and wider spread of DSC values from lowering dose were more pronounced. The effects of dose reduction on DSC for CAD-segmented nodules were similar in magnitude to those of reducing the slice thickness from 1.0mm to 0.6mm. In conclusion, varying dose and slice thickness can result in very different segmentations because of noise and image quality. However, there exists some stability in segmentation overlap: even at 1.0mm, an image at 25% of the low-dose scan's dose still yields segmentations similar to those from a full-dose scan.
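The Dice similarity coefficient used above has a simple definition: twice the overlap of two binary masks divided by their total volume. A minimal sketch:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks.

    DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means identical masks, 0.0 disjoint.
    Two empty masks are treated as identical (DSC = 1.0) by convention.
    """
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    inter = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    return 2.0 * inter / total if total else 1.0
```

In the study above, `mask_a` and `mask_b` would be the CAD segmentations of the same nodule under two different dose/reconstruction conditions, so a DSC above 0.85 indicates substantial agreement despite the changed acquisition.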
Lung cancer screening using low dose CT has been shown to reduce lung cancer related mortality and has been approved for widespread use in the US. These scans keep radiation doses low while maximizing the detection of suspicious lung lesions. Tube current modulation (TCM) is one technique used to optimize dose; however, limited work has been done to assess TCM's effect on detection tasks. In this work, the effect of TCM on detection is investigated throughout the lung using several different model observers (MOs). 131 lung nodules were simulated at 1mm intervals in each lung of the XCAT phantom. A Sensation 64 TCM profile was generated for the XCAT phantom, and 2500 noise realizations were created using both TCM and a fixed TC. All nodules and noise realizations were reconstructed for a total of 262 (left and right lungs) nodule reconstructions and 10,000 XCAT lung reconstructions. Single-slice Hotelling (HO) and channelized Hotelling (CHO) observers, as well as a multislice CHO, were used to assess area-under-the-curve (AUC) as a function of nodule location in both the fixed TC and TCM cases. As expected with fixed TC, nodule detectability was lowest through the shoulders and leveled off below mid-lung; with TCM, detectability was unexpectedly highest through the shoulders, dropping sharply near the mid-lung and then increasing into the abdomen. Trends were the same for all model observers. These results suggest that TCM could be further optimized for detection and that detectability maps present exciting new opportunities for TCM optimization on a patient-specific level.
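A channelized Hotelling observer of the kind used above can be sketched in a few lines. This is a generic CHO illustration, not the authors' implementation: the channel matrix is left abstract (real studies typically use Gabor or Laguerre-Gauss channels), and the AUC is estimated nonparametrically from the decision statistics.

```python
import numpy as np

def cho_auc(signal_imgs, noise_imgs, channels):
    """Sketch of a channelized Hotelling observer (CHO) detectability estimate.

    signal_imgs : (N, P) flattened signal-present image realizations
    noise_imgs  : (N, P) flattened signal-absent image realizations
    channels    : (P, C) channel matrix (e.g., Gabor or Laguerre-Gauss)
    Returns the AUC of the Hotelling template applied to channel outputs.
    """
    vs = signal_imgs @ channels            # channelized responses, signal present
    vn = noise_imgs @ channels             # channelized responses, signal absent
    # Hotelling template: pooled covariance inverse times the mean difference
    dmean = vs.mean(axis=0) - vn.mean(axis=0)
    cov = 0.5 * (np.cov(vs.T) + np.cov(vn.T))
    w = np.linalg.solve(cov, dmean)
    ts, tn = vs @ w, vn @ w                # scalar decision statistics
    # Nonparametric AUC: probability a signal score exceeds a noise score
    return (ts[:, None] > tn[None, :]).mean()
```

In the study above, an AUC map is built by repeating this computation for nodules at each location, with `noise_imgs` drawn from the TCM or fixed-TC noise realizations at that location.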
Lung cancer screening CT is already performed at low dose. There are many techniques to reduce the dose even further, but it is not clear how such techniques will affect nodule detectability. In this work, we used an in-house CAD algorithm to evaluate detectability. 90348 patients and their raw CT data files were drawn from the National Lung Screening Trial (NLST) database. All scans were acquired at ~2 mGy CTDIvol with fixed tube current, 1 mm slice thickness, and B50 reconstruction kernel on a Sensation 64 scanner (Siemens Healthcare). We used the raw CT data to simulate two additional reduced-dose scans for each patient, corresponding to 1 mGy (50%) and 0.5 mGy (25%). Radiologists' findings on the NLST reader forms indicated 65 nodules in the cohort, which we subdivided based on LungRADS criteria. For larger category 4 nodules, median sensitivities were 100% at all three dose levels, and mean sensitivity decreased with decreasing dose. For smaller nodules meeting the category 2 or 3 criteria, the dose dependence was less obvious. Overall, mean patient-level sensitivity varied from 38.5% at 100% dose to 40.4% at 50% dose, a difference of only 1.9%. However, the false-positive rate quadrupled from 1 per case at 100% dose to 4 per case at 25% dose. Dose reduction affected lung-nodule detectability differently depending on the LungRADS category, and the false-positive rate was very sensitive at sub-screening dose levels. Thus, care should be taken to adapt CAD for the very challenging noise characteristics of screening.