Significance: Melanoma is a deadly cancer that physicians struggle to diagnose early because benign and malignant lesions can be difficult to differentiate visually. Deep machine learning approaches to image analysis offer promise but lack the transparency to be widely adopted as stand-alone diagnostics.
Aim: We aimed to create a transparent machine learning technology (i.e., not deep learning) to discriminate melanomas from nevi in dermoscopy images and an interface for sensory cue integration.
Approach: Imaging biomarker cues (IBCs) were used to train an ensemble machine learning classifier (Eclass), while raw images were used to train a deep learning classifier. We compared the areas under the diagnostic receiver operating characteristic (ROC) curves.
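The comparison described above can be sketched in a minimal, illustrative form: an ensemble classifier trained on tabular biomarker features, scored by ROC AUC. This is not the authors' code; the feature matrix, labels, and random-forest choice are hypothetical placeholders standing in for the IBCs and Eclass.

```python
# Illustrative sketch (assumed, not the authors' pipeline): train an ensemble
# classifier on tabular imaging-biomarker features and score it with ROC AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
# Hypothetical IBC feature matrix (e.g., asymmetry, border irregularity, color variance)
X = rng.normal(size=(n, 3))
# Synthetic labels: 1 = melanoma, 0 = nevus, with signal in the first two features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"ROC AUC: {auc:.2f}")
```

A deep learning baseline trained on raw images would be scored with the same `roc_auc_score` call, making the two areas under the curve directly comparable.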
Results: Our interpretable machine learning algorithm outperformed the leading deep learning approach 75% of the time. The user interface displayed only the diagnostic imaging biomarkers, presented as IBCs.
Conclusions: From a translational perspective, Eclass offers an advantage over convolutional neural network diagnosis in that physicians can embrace its transparent outputs faster than black-box outputs. Imaging biomarker cues may be used during sensory cue integration in clinical screening. Our method may be applied to other image-based diagnostic analyses, including pathology and radiology.
Early diagnosis of melanoma is the most effective means of improving melanoma prognosis. Artificial intelligence can arm non-expert screeners, but most artificial intelligence methods are impractical in a clinical setting because they lack transparency. To provide a quantitative, algorithmic approach to lesion diagnosis while maintaining transparency, and to supplement the clinician rather than replace them, our digital analysis extracts visual features, or "imaging biomarkers," that can both drive machine learning and be visualized directly.