Multiparameter classifications of optical tomographic images
1 September 2008
This study explores the combined use of multiple parameters derived from optical tomographic images to increase diagnostic accuracy, measured in terms of sensitivity and specificity. Parameters considered include, for example, the smallest or largest absorption or scattering coefficients, or the ratios thereof, in an image region of interest. These parameters were used individually in a previous study to determine whether a finger joint is affected by rheumatoid arthritis. To combine these parameters in the analysis, we employ a vector-quantization-based classification method called self-organizing mapping (SOM). This method allows multivariate ROC curves to be produced, from which sensitivities and specificities can be determined. We found that some parameter combinations lead to higher sensitivities, whereas others lead to higher specificities, when compared to the single-parameter classifications employed in previous studies. The best diagnostic accuracy, in terms of the highest Youden index, was achieved by combining three absorption parameters [max(µa), min(µa), and the ratio min(µa)/max(µa)], which results in a sensitivity of 0.78, a specificity of 0.76, a Youden index of 0.54, and an area under the curve (AUC) of 0.72. These values are higher than for the previously reported single-parameter classifications, whose best case achieved a sensitivity and specificity of 0.71, a Youden index of 0.41, and an AUC of 0.66.



Sagittal laser optical tomography (SLOT) has recently been employed to detect rheumatoid arthritis (RA) in finger joints.2 The authors investigated μa and μs images for 78 different finger joints, including data from patients diagnosed with RA as well as from healthy volunteers. They extracted various single parameters, such as the minimum or maximum values of μa and μs in an area of interest within a given image, to determine which parameter provides the best distinction between affected and not affected joints. Statistical analysis of the data revealed that the minimum μa value yielded the best differentiation between affected and not affected joints. In this case, sensitivity and specificity values of approximately 0.71 were obtained with a statistical significance of p=0.01. However, no attempt had been made so far to combine two or more parameters in the process of computer-aided diagnostics.1

This comparative study explores whether the use of more than one parameter can yield better classification results. We hypothesize that combining multiple parameters, for example, the minimum and maximum μa in an image, increases sensitivity and specificity. To test this hypothesis, the same data set is used as described by Scheel et al.2



Scheel et al. described the acquisition of the sagittal laser optical tomography (SLOT) data.2 Data were based on tomographic reconstructions of optical properties in 2-D sagittal cross sections through the proximal interphalangeal (PIP) finger joint.3, 4, 5 Generally, SLOT images show the spatial distribution of two different optical properties: the absorption coefficient μa and the scattering coefficient μs. A region of interest (ROI) was consistently defined within each image to extract parameters for classification purposes [Fig. 1a]. A search algorithm eliminated data in the first 4 mm at the top and bottom of each image and 7 mm on the left and right. In this way, the chosen ROI did not contain potential image artifacts, which are often encountered near source and detector positions. Within the ROI, six different parameters were extracted: the smallest absorption coefficient min(μa), the largest absorption coefficient max(μa), the smallest scattering coefficient min(μs), the largest scattering coefficient max(μs), and the ratios min(μa)/max(μa) and min(μs)/max(μs).
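As an illustration, the ROI selection and parameter extraction described above could be sketched as follows; the function name, the assumed pixel size, and the array layout are hypothetical, since the paper does not specify the image resolution:

```python
import numpy as np

def extract_roi_parameters(mu_a, mu_s, pixel_mm=1.0):
    """Trim 4 mm top/bottom and 7 mm left/right, then extract six parameters.

    mu_a, mu_s: 2-D arrays of absorption and scattering coefficients.
    pixel_mm:   assumed pixel size in mm (not stated in the paper).
    """
    tb = int(round(4.0 / pixel_mm))   # rows removed at top and bottom
    lr = int(round(7.0 / pixel_mm))   # columns removed at left and right
    roi_a = mu_a[tb:-tb, lr:-lr]
    roi_s = mu_s[tb:-tb, lr:-lr]
    return {
        "min_mua": float(roi_a.min()),
        "max_mua": float(roi_a.max()),
        "ratio_mua": float(roi_a.min() / roi_a.max()),
        "min_mus": float(roi_s.min()),
        "max_mus": float(roi_s.max()),
        "ratio_mus": float(roi_s.min() / roi_s.max()),
    }
```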

Fig. 1

Example of a SLOT tomographic image of the scattering coefficient, with the region of interest, for an RA-affected finger joint.


In total, image data from 78 PIP joints were evaluated. Using ultrasound imaging (US) as a gold standard, 37 joints were identified as not affected and 41 joints were identified as affected by RA. For both groups of fingers, the mean values and standard deviations were calculated for all six optical parameters. Student t tests were performed to determine whether there was a statistically significant difference in the mean of each parameter between the affected and not affected groups. In addition, a receiver operating characteristic (ROC) analysis was performed for each parameter, and sensitivity and specificity values were determined.6
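For concreteness, the threshold-sweep ROC analysis for a single parameter can be sketched in pure NumPy (an illustrative helper, not the authors' code; the p-values in the study come from Student t tests, which any standard statistics routine provides). The convention assumed here is that larger scores indicate "affected"; negate a parameter first if the opposite holds:

```python
import numpy as np

def roc_analysis(scores_affected, scores_healthy):
    """Empirical ROC curve, AUC, and Youden index for one parameter (sketch).

    Assumes larger scores indicate 'affected'.
    """
    scores = np.concatenate([scores_affected, scores_healthy])
    labels = np.concatenate([np.ones(len(scores_affected), bool),
                             np.zeros(len(scores_healthy), bool)])
    # Sweep thresholds from above the largest score down to below the smallest
    thresholds = np.r_[np.inf, np.sort(np.unique(scores))[::-1], -np.inf]
    tpr = np.array([(scores[labels] >= th).mean() for th in thresholds])
    fpr = np.array([(scores[~labels] >= th).mean() for th in thresholds])
    # Trapezoidal area under the (fpr, tpr) curve
    auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))
    youden = float((tpr - fpr).max())
    return fpr, tpr, auc, youden
```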

Overall, the authors found that the minimal absorption coefficient min(μa) in the ROI of each image showed the most significant difference between the affected and not affected groups.2 The difference between the mean values showed the smallest p-value of 0.01, and the ROC analysis yielded sensitivity and specificity values of 0.71. All other parameters showed lower sensitivities and specificities and higher p-values.



To address the problem of multiparameter classification, this work employed the neural-network-based method of self-organizing mapping (SOM). This method has been used in other scientific fields for similar classification problems7, 8, 9 and has been shown to produce significantly better results than approaches such as discriminant analysis and logistic regression.10, 11 The technique was originally developed as a physical-mathematical model to mimic the human visual system.12, 13 Since medical images are in general interpreted visually by experts, the SOM method appears particularly suited for this task.

SOM is an unsupervised learning method. Its main purpose is the transformation of feature vectors of arbitrary dimension drawn from a given feature space [e.g., min(μa), max(μa), and the ratio min(μa)/max(μa)] into a simplified, generally 2-D, discrete map. A practical overview of the general structure and the learning/training process is given by Klose9 and, in greater detail, by Kohonen.13
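A minimal version of Kohonen's training loop is sketched below, purely to illustrate the method; the grid size, learning-rate schedule, and neighborhood width are arbitrary choices and not the settings used in this study:

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=40, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal SOM training (sketch of Kohonen's algorithm).

    data: (n_samples, n_features) feature vectors,
          e.g. [min_mua, max_mua, ratio_mua].
    Returns the trained weight grid of shape (rows, cols, n_features).
    """
    rng = np.random.default_rng(seed)
    rows, cols = grid
    # Initialize weights uniformly within the data range
    w = rng.uniform(data.min(0), data.max(0), (rows, cols, data.shape[1]))
    yy, xx = np.mgrid[0:rows, 0:cols]
    n_iter = epochs * len(data)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        frac = t / n_iter
        lr = lr0 * (1 - frac)                 # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5     # shrinking neighborhood
        # Best-matching unit: neuron whose weights are closest to x
        d = np.linalg.norm(w - x, axis=2)
        bi, bj = np.unravel_index(d.argmin(), d.shape)
        # Gaussian neighborhood pulls nearby neurons toward x
        h = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * sigma ** 2))
        w += lr * h[:, :, None] * (x - w)
    return w

def best_matching_unit(w, x):
    """Grid coordinates of the neuron activated by feature vector x."""
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(d.argmin(), d.shape)
```

After training, each feature vector is assigned to its best-matching neuron, and similar vectors land on nearby neurons of the map.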

Generally, each n-dimensional feature vector is presented to all neurons of the input layer and typically activates (stimulates) one neuron (classifier) in the Kohonen layer [Fig. 2a]. Similar input data are represented by neighboring neurons after classification, and dissimilar input data by distant neurons [Fig. 2b]. One neuron can even classify several input vectors if these vectors are very similar compared with the other input vectors. Each neuron in the Kohonen layer is associated with a certain number, or frequency F, of target-class members of a given gold standard, such as the classes affected and not affected [Fig. 2c]. Varying a frequency threshold FT between 0 and 1 makes it possible to carry out ROC-curve analyses, determining sensitivities, specificities, and the Youden index as classification performance measures.14 In detail, the sensitivity is the number of correctly identified affected fingers relative to the number of all affected fingers, i.e., those correctly identified as affected plus those falsely identified as not affected. The specificity is the number of correctly identified not affected fingers relative to the number of all not affected fingers, i.e., those correctly identified as not affected plus those falsely identified as affected. Sensitivity and specificity can also be expressed as the true positive rate (TPR=sensitivity) and false positive rate (FPR=1−specificity).
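These definitions reduce to simple ratios of confusion-matrix counts; a small helper (hypothetical names, not the authors' code) makes the relations explicit:

```python
def diagnostic_measures(tp, fn, tn, fp):
    """Sensitivity, specificity, and Youden index from confusion-matrix counts.

    tp: affected joints correctly classified as affected
    fn: affected joints missed (classified as not affected)
    tn: not affected joints correctly classified
    fp: not affected joints wrongly classified as affected
    """
    sensitivity = tp / (tp + fn)      # true positive rate (TPR)
    specificity = tn / (tn + fp)      # 1 - false positive rate (FPR)
    youden = sensitivity + specificity - 1
    return sensitivity, specificity, youden

# Hypothetical counts for the 41 affected / 37 not affected joints that would
# reproduce the reported best-case values (the actual confusion matrix is not
# given in the paper): diagnostic_measures(32, 9, 28, 9)
```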

Fig. 2

Scheme for multiparameter classifications based on self-organizing maps (SOM): (a) Structure of a SOM neural network, (b) image of active neurons representing the class affected within the Kohonen layer after discrimination of the given input vectors, and (c) frequency determination and final classification of the classes affected (black) and not affected (gray). A frequency threshold FT varying between 0 and 100% can be used to determine multiparameter ROC curves (Fig. 3).


Fig. 3

ROC curves for the best single-parameter analysis [min(μa)] with specificity=0.71, sensitivity=0.71, Youden index=0.41, AUC=0.66, and p-value=0.01; and for the best multiparameter analysis [max(μa), min(μa), min(μa)/max(μa)] with specificity=0.78, sensitivity=0.76, Youden index=0.54, AUC=0.72, and p-value=0.016.



Results and Discussion

The classification problem for rheumatoid arthritis in finger joints was addressed using all possible combinations of two or more of the six original parameters [min(μa), max(μa), min(μa)/max(μa), min(μs), max(μs), min(μs)/max(μs)]. The parameter combinations that lead to the eight highest sensitivities and specificities are shown in Table 1. Combinations involving only μs-derived parameters yield, in general, lower sensitivity and specificity than the best-case single-parameter classification. An exception is the combination [max(μs), min(μs), min(μs)/max(μs)], which yields a sensitivity of 0.68 and a specificity of 0.82. In contrast, combinations involving only μa-derived parameters yield a higher specificity than the single-parameter classification. The combination [max(μa), min(μa)] results in the highest specificity (0.87) found in the entire study. When three μa-derived parameters are combined [max(μa), min(μa), min(μa)/max(μa)], the best overall classification is achieved. This combination yielded an area under the curve of AUC=0.72 and a Youden index of J=0.54, clearly higher than the single-parameter classification [using only min(μa)] with AUC=0.66 and J=0.41 reported earlier by Scheel et al. using the same SLOT images. This study achieved sensitivity and specificity values of 0.78 and 0.76, with a p-value of 0.016. A comparison of this case with the best single-parameter case is shown in Fig. 3. Clearly visible is how the combination of these three parameters shifts the curve toward the upper left corner.
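The multiparameter ROC construction via the frequency threshold FT can be sketched as follows, assuming each joint has been assigned the "affected" frequency F of its best-matching neuron (the helper name and the 101-point threshold grid are illustrative assumptions):

```python
import numpy as np

def som_roc(neuron_freq, labels):
    """ROC curve, AUC, and Youden index from a sweep of the threshold FT.

    neuron_freq: per-sample fraction of gold-standard 'affected' members
                 mapped to that sample's best-matching neuron (0..1).
    labels:      1 = affected (ultrasound gold standard), 0 = not affected.
    """
    f = np.asarray(neuron_freq, float)
    y = np.asarray(labels).astype(bool)
    fts = np.linspace(0.0, 1.0, 101)             # FT from 0 to 100%
    tpr = np.array([(f[y] >= ft).mean() for ft in fts])
    fpr = np.array([(f[~y] >= ft).mean() for ft in fts])
    # Sort by (fpr, tpr) before integrating, since the sweep runs upward in FT
    order = np.lexsort((tpr, fpr))
    fo, to = fpr[order], tpr[order]
    auc = float(np.sum(np.diff(fo) * (to[1:] + to[:-1]) / 2))
    youden = float((tpr - fpr).max())
    return fpr, tpr, auc, youden
```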

Table 1

Combinations of optical parameters used in the multiparameter SOM classification that lead to the eight highest sensitivities and specificities. The first row shows the best single-parameter classification reported by Scheel et al.2 Sensitivity and specificity are determined against ultrasound diagnosis (US) as the gold standard.

Parameter combination                          Sensitivity   Specificity
[min(μa)] (Scheel et al.)                      0.71          0.71
[max(μa), min(μa), min(μa)/max(μa)]            0.76          0.78
[max(μa), min(μa)]                             0.64          0.87
[min(μa), min(μa)/max(μa)]                     0.60          0.84
[max(μa), min(μa)/max(μa)]                     0.52          0.84
[max(μs), min(μs), min(μs)/max(μs)]            0.68          0.82
[max(μs), min(μs)]                             0.56          0.66
[max(μs), min(μs)/max(μs)]                     0.69          0.60
[min(μs), min(μs)/max(μs)]                     0.69          0.64

Moreover, it appears that, independent of the size of the SOM neural network, the classification quality is not simply correlated with the dimensionality of the parameter space. Combinations of more than three parameters that mix absorption- and scattering-derived quantities can lead to much smaller sensitivities and specificities. Nevertheless, this study shows that computer-aided diagnosis of optical tomographic images for detecting rheumatoid arthritis in finger joints becomes more attractive for clinical use when multiple parameters are combined. Further investigation is needed to determine the relation between classification quality and the dimensionality of the parameter space.


Acknowledgments

The authors thank A. K. Scheel, University of Göttingen, Germany, for providing the experimental data used in this analysis. This work was supported in part by grant number 2R01-AR46255 from the National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS), which is part of the National Institutes of Health.



References

1. C. D. Klose, A. D. Klose, U. Netz, J. Beuthan, and A. Hielscher, "Multi-parameter optical image interpretations based on self-organizing mapping," Proc. SPIE 6850, 68500G (2008).
2. A. K. Scheel, M. Backhaus, A. D. Klose, B. Moa-Anderson, U. Netz, K. G. A. Hermann, J. Beuthan, G. A. Müller, G. R. Burmester, and A. H. Hielscher, "First clinical evaluation of sagittal laser optical tomography for detection of synovitis in arthritic finger joints," Ann. Rheum. Dis. 64, 239–245 (2005).
3. A. D. Klose, J. Beuthan, and G. Mueller, "Investigation of RA-diagnostics applying optical tomography in frequency domain," Proc. SPIE 3196, 194–204 (1997).
4. A. D. Klose, "Optical Tomography Based on the Equation of Radiative Transfer," Freie Universität Berlin (2002).
5. A. H. Hielscher, A. D. Klose, A. Scheel, B. Moa-Anderson, M. Backhaus, U. Netz, and J. Beuthan, "Sagittal laser optical tomography for imaging of rheumatoid finger joints," Phys. Med. Biol. 49(7), 1147–1163 (2004).
6. C. E. Metz and X. C. Pan, "Proper binormal ROC curves: theory and maximum likelihood estimation," J. Math. Psychol. 43, 1–33 (1999).
7. A. Pascual-Montano, K. H. Taylor, H. Winkler, R. D. Pascual-Marqui, and J. M. Carazo, "Quantitative self-organizing maps for clustering electron tomograms," J. Struct. Biol. 138, 114–122 (2002).
8. T. W. Nattkemper and A. Wismüller, "Tumor feature visualization with unsupervised learning," Med. Image Anal. 9, 344–351 (2005).
9. C. D. Klose, "Self-organising maps for geoscientific data analysis: geological interpretation of multi-dimensional geophysical data," Comput. Geosci. 10(3), 265–277 (2006).
10. R. Schönweiler, P. Wübbelt, R. Tolloczko, C. Rose, and M. Ptok, "Classification of passive auditory event-related potentials using discriminant analysis and self-organizing feature maps," Audiol. Neuro-Otol. 5, 69–82 (2000).
11. R. W. Veltri, M. Chaudhari, M. C. Miller, E. C. Poole, G. J. O'Dowd, and A. W. Partin, "Comparison of logistic regression and neural net modeling for prediction of prostate cancer pathologic stage," Clin. Chem. 48(10), 1828–1834 (2002).
12. T. Kohonen, "Self-organized formation of topologically correct feature maps," Biol. Cybern. 43(1), 59–69 (1982).
13. T. Kohonen, Self-Organizing Maps, 3rd ed., Springer, Berlin (2001).
14. W. J. Youden, "Index for rating diagnostic tests," Cancer 3, 32–35 (1950).
©(2008) Society of Photo-Optical Instrumentation Engineers (SPIE)
Christian D. Klose, Alexander D. Klose, Uwe J. Netz, Juergen Beuthan, and Andreas H. Hielscher "Multiparameter classifications of optical tomographic images," Journal of Biomedical Optics 13(5), 050503 (1 September 2008).
