Accurate classification and precise quantification of interstitial lung disease (ILD) types on CT images remain important challenges in clinical diagnosis. Multi-modality image information is typically required to assist in diagnosing these diseases. To build scalable deep-learning solutions to this problem, taking full advantage of the existing large-scale datasets in modern hospitals has become a critical task. In this paper, we present <i>DeepILD</i>, a novel computer-aided diagnostic framework that addresses the ILD classification task from a single modality (CT images) using a deep neural network. More specifically, we propose integrating semi-supervised spherical K-means clustering and convolutional neural networks for ILD classification and disease quantification. We first use semi-supervised spherical K-means to divide the CT lung area into normal and abnormal sub-regions. A convolutional neural network (CNN) is subsequently trained on image patches extracted from the abnormal regions. Here, we focus on the classification of three chronic fibrosing ILD types: idiopathic pulmonary fibrosis (IPF), idiopathic non-specific interstitial pneumonia (iNSIP), and chronic hypersensitivity pneumonia (CHP). High classification accuracy was achieved on a dataset of 188 CT scans; in particular, IPF classification reached approximately 88% accuracy.
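The semi-supervised spherical K-means step could be sketched as follows. This is a minimal illustration under assumptions not stated in the abstract: feature vectors are projected onto the unit sphere and clustered by cosine similarity, and the semi-supervision is modeled by pinning one centroid to the mean direction of samples known to be normal (the function name and seeding strategy are hypothetical, not the authors' implementation).

```python
import numpy as np

def spherical_kmeans(X, k, normal_seeds=None, iters=50, seed=0):
    """Semi-supervised spherical K-means: cluster unit-normalized feature
    vectors by cosine similarity, pinning centroid 0 to the mean direction
    of known-normal samples (a hypothetical form of the semi-supervision)."""
    rng = np.random.default_rng(seed)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)   # project onto the unit sphere

    if normal_seeds is not None:
        # centroid 0 comes from labeled "normal" samples;
        # remaining centroids are initialized from unlabeled samples
        pool = np.setdiff1d(np.arange(len(X)), normal_seeds)
        c0 = X[normal_seeds].mean(axis=0)
        C = np.vstack([c0 / np.linalg.norm(c0),
                       X[rng.choice(pool, k - 1, replace=False)]])
    else:
        C = X[rng.choice(len(X), k, replace=False)].copy()

    for _ in range(iters):
        labels = np.argmax(X @ C.T, axis=1)            # cosine-similarity assignment
        for j in range(k):
            members = X[labels == j]
            if len(members):
                m = members.mean(axis=0)
                C[j] = m / np.linalg.norm(m)           # renormalize each centroid
        if normal_seeds is not None:                   # keep the "normal" centroid pinned
            c0 = X[normal_seeds].mean(axis=0)
            C[0] = c0 / np.linalg.norm(c0)
    return labels, C
```

Cosine similarity is a natural fit when patch features are intensity histograms or texture descriptors, since only the direction of the feature vector matters after normalization; samples falling outside the pinned "normal" cluster would then supply the abnormal patches for CNN training.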
In this paper, we present an efficient trainable conditional random field (CRF) model with a newly proposed scale-targeted loss function to improve segmentation accuracy on tiny blood vessels in 3D medical images. Blood vessel segmentation remains a major challenge in medical image processing because of vessels' elongated structure and low contrast. The conventional local-neighborhood CRF model performs poorly on tiny elongated structures owing to its limited ability to capture pairwise potentials. To overcome this drawback, we use a fully-connected CRF model to capture the pairwise potentials. This paper also introduces a new scale-targeted loss function aimed at improving segmentation accuracy on tiny blood vessels. Experimental results on both phantom data and clinical CT data show that the proposed approach improves segmentation accuracy on tiny blood vessels. Compared with the previous loss function, our proposed loss improved sensitivity by about 10% on phantom data and 14% on clinical CT data.
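The scale-targeted idea, preventing thick vessels from dominating the training signal, might be sketched as a radius-weighted loss. This is a simplified numpy version under assumptions of our own (a weighted binary cross-entropy, with per-voxel radius estimated e.g. from a distance transform of the ground-truth mask); the paper's exact loss formulation is not reproduced here.

```python
import numpy as np

def scale_weighted_bce(pred, target, radius, eps=1e-7):
    """Binary cross-entropy in which each foreground voxel is weighted
    inversely to its local vessel radius, so errors on thin vessels cost
    more than equally large errors on thick ones (illustrative only)."""
    w = np.ones_like(pred, dtype=float)
    fg = target > 0
    # thin vessel (radius ~1 voxel) -> weight ~1; thick vessel -> small weight
    w[fg] = 1.0 / np.maximum(radius[fg], 1.0)
    bce = -(target * np.log(pred + eps) + (1.0 - target) * np.log(1.0 - pred + eps))
    return float(np.sum(w * bce) / np.sum(w))
```

With such a weighting, a classifier that sacrifices a one-voxel-wide branch to fit a large trunk is penalized more than the unweighted loss would penalize it, which is the behavior a scale-targeted objective is after.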
This paper presents a novel renal artery segmentation method that combines graph-cut and template-based tracking, and its application to estimating renal vascular dominant regions. To provide computer-assisted diagnosis for kidney surgery planning, it is important to obtain the correct topological structure of the renal artery for estimating renal vascular dominant regions. The renal artery has low contrast, and its precise extraction is a difficult task. Previous methods utilizing a vesselness measure based on Hessian analysis still cannot extract tiny blood vessels in low-contrast areas. Although model-based methods, such as the superellipsoid model and the cylindrical intensity model, are sensitive to tiny low-contrast blood vessels, problems including over-segmentation and poor bifurcation detection remain. In this paper, we propose a novel blood vessel segmentation method combining a new Hessian-based graph-cut with a template-model tracking method. First, a graph-cut algorithm is used to obtain a rough segmentation result. Then, a template-model tracking method is used to improve the accuracy of tiny blood vessel segmentation. The rough graph-cut segmentation effectively solves the bifurcation-detection problem, while the precise template-model tracking focuses on segmenting tiny blood vessels. By combining these two approaches, our proposed method segmented 70% of renal artery branches 1 mm in diameter or larger. In addition, we demonstrate that such precise segmentation can help divide the kidney into a set of blood-vessel dominant regions using a Voronoi diagram method.
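The Voronoi-based dominant-region step can be illustrated with a brute-force nearest-branch assignment: every organ voxel is labeled with the branch of the segmented artery closest to it. The function and its inputs are hypothetical helpers for illustration; a practical implementation would use a distance transform or KD-tree rather than the quadratic loop below.

```python
import numpy as np

def vascular_dominant_regions(mask, branch_points, branch_labels):
    """Discrete Voronoi partition of an organ mask: each voxel inside
    `mask` is assigned the branch id of its nearest artery point.

    mask          : boolean array of the organ region (e.g. kidney parenchyma)
    branch_points : (n, 3) voxel coordinates on the segmented artery branches
    branch_labels : (n,) branch id for each artery point
    """
    coords = np.argwhere(mask)                                   # (m, 3) voxel coords
    # squared Euclidean distance from every organ voxel to every artery point
    d2 = ((coords[:, None, :] - branch_points[None, :, :]) ** 2).sum(axis=-1)
    nearest = branch_labels[np.argmin(d2, axis=1)]               # nearest branch per voxel
    out = np.full(mask.shape, -1, dtype=int)                     # -1 = outside the organ
    out[tuple(coords.T)] = nearest
    return out
```

The resulting label map partitions the organ into one dominant region per arterial branch, which is the information a surgeon needs when planning which branch to clamp.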