A deep (learning) dive into visual search behaviour of breast radiologists
7 March 2018
Abstract
Visual search, the process of detecting and identifying objects using eye movements (saccades) and foveal vision, has been studied to identify the root causes of errors in the interpretation of mammography. The aim of this study is to model the visual search behaviour of radiologists and their interpretation of mammograms using deep learning approaches. Our model is based on a deep convolutional neural network, a biologically inspired multilayer architecture that simulates the visual cortex, and is reinforced with transfer learning techniques.

Eye tracking data obtained from 8 radiologists (of varying experience levels in reading mammograms) reviewing 120 two-view digital mammography cases (59 cancers) were used to train the model, which was pre-trained on the ImageNet dataset for transfer learning. Areas of the mammogram that received direct (foveally fixated), indirect (peripherally fixated) or no (never fixated) visual attention were extracted from the radiologists' visual search maps (obtained with a head-mounted eye tracking device). These areas, along with the radiologists' assessment of suspected malignancy (including their confidence in that assessment), were used to model: 1) the radiologists' decision; 2) the radiologists' confidence in that decision; and 3) the attentional level (i.e. foveal, peripheral or none) received by an area of the mammogram. Our results indicate high accuracy and low misclassification in modelling these behaviours.
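The transfer-learning setup described above (a CNN pre-trained on ImageNet, re-headed to classify mammogram regions into three attentional levels) can be sketched as follows. This is not the authors' implementation; it is a minimal PyTorch sketch in which a small randomly initialised backbone stands in for the ImageNet-pretrained network, and the names `backbone` and `head` are illustrative.

```python
import torch
import torch.nn as nn

# Stand-in feature extractor; in the study this would be a CNN whose
# weights come from ImageNet pre-training rather than random init.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to 1x1
    nn.Flatten(),             # -> (batch, 32) feature vector
)

# Transfer learning: freeze the pretrained features so only the new
# task-specific head is updated during fine-tuning.
for p in backbone.parameters():
    p.requires_grad = False

# New 3-way head for the attentional level of a mammogram region:
# foveal / peripheral / never fixated.
head = nn.Linear(32, 3)
model = nn.Sequential(backbone, head)

# Forward pass on a dummy batch of two 224x224 RGB patches.
x = torch.randn(2, 3, 224, 224)
logits = model(x)  # shape (2, 3): one score per attentional level
```

The same frozen-backbone pattern would apply to the other two targets (the radiologist's decision and their confidence), swapping in a head with the appropriate number of output classes.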
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Suneeta Mall, Patrick C. Brennan, Claudia Mello-Thoms, "A deep (learning) dive into visual search behaviour of breast radiologists", Proc. SPIE 10577, Medical Imaging 2018: Image Perception, Observer Performance, and Technology Assessment, 1057708 (7 March 2018); https://doi.org/10.1117/12.2293366
Proceedings, 11 pages

