Machine learning for mobile wound assessment
6 March 2018
Chronic wounds affect millions of people around the world. In particular, elderly persons in home care may develop decubitus ulcers (pressure sores). Here, mobile image acquisition and analysis can provide valuable assistance. We develop a system for mobile wound capture using devices such as smartphones. Photographs are acquired with the device's integrated camera and then calibrated and processed to determine the size of the various tissue types present in a wound, i.e., necrotic, sloughy, and granular tissue. A random forest classifier based on various color and texture features is used for this task. The features are Sobel, Hessian, membrane projections, variance, mean, median, anisotropic diffusion, and bilateral as well as Kuwahara filters. The resulting probability output is thresholded using Otsu's method. The similarity between the manual ground-truth labeling and the classification is measured. The results are compared to those achieved with a basic color-thresholding technique, as well as those produced by an SVM classifier. The fast random forest was found to produce better results. Its performance improves further when the method is applied only to wound regions with the background subtracted. Mean similarity is 0.89, 0.39, and 0.44 for necrotic, sloughy, and granular tissue, respectively. Although the training phase is time consuming, the trained classifier runs fast enough to be implemented on a mobile device. This will allow comprehensive monitoring of skin lesions and wounds.
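The core pipeline the abstract describes (per-pixel features, a random forest producing class probabilities, Otsu thresholding of the probability map, and a similarity score against ground truth) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature vectors here are synthetic stand-ins for the Sobel/Hessian/filter-bank features named above, the random forest comes from scikit-learn rather than the "fast random forest" the paper uses, and the Dice coefficient is assumed as the similarity measure (the abstract does not name it).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def otsu_threshold(values, bins=256):
    """Otsu's method on a 1-D array of probabilities in [0, 1]:
    pick the cut that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=bins, range=(0.0, 1.0))
    prob = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(prob)                       # weight of class below cut
    w1 = 1.0 - w0                              # weight of class above cut
    cum_mean = np.cumsum(prob * centers)
    mu0 = cum_mean / np.maximum(w0, 1e-12)     # mean below cut
    mu1 = (cum_mean[-1] - cum_mean) / np.maximum(w1, 1e-12)  # mean above
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between)]

def dice(a, b):
    """Dice similarity between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Synthetic per-pixel feature vectors for two well-separated classes
# (0 = other tissue, 1 = e.g. necrotic tissue). In the real system each
# pixel's vector would hold the color/texture filter responses.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(3, 1, (200, 5))])
y_train = np.array([0] * 200 + [1] * 200)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

X_test = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(3, 1, (100, 5))])
truth = np.array([0] * 100 + [1] * 100).astype(bool)

proba = clf.predict_proba(X_test)[:, 1]   # per-pixel class probability
t = otsu_threshold(proba)                 # Otsu cut on the probability map
mask = proba > t                          # final tissue segmentation
print("Dice similarity:", round(dice(mask, truth), 2))
```

In practice the training step (fitting the forest on labeled wound pixels) is the slow part, while `predict_proba` on new images is fast, which is what makes on-device classification feasible.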
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Sanmathi Kamath, Ekaterina Sirazitdinova, Thomas M. Deserno, "Machine learning for mobile wound assessment", Proc. SPIE 10579, Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications, 1057917 (6 March 2018); https://doi.org/10.1117/12.2293704
