Traditional diagnostic modalities have been, for the most part, static two-dimensional images displayed on film
or computer screen. More recent diagnostic modalities are solely computer-based and consist of large datasets
of multiple images. Image perception and visual search using these new modalities are complicated by the need
to interact with the computer in order to navigate through the data. This paper reports the late-breaking results
from two small studies into visual search within two types of CT Colonography (CTC) visualisations. The twelve
novice observers in the study were taking part in a week-long course in CTC and were tested at the beginning
and end of the course. A number of expert observers were also recorded. The two visualisations used in the
study were 2D axial view and 3D colon fly-through. In both cases, searching was performed by inspecting the
colon wall, but by two distinct mechanisms. The first study recorded observer eye-gaze and image navigation in
a CTC axial view. The search strategy was to follow the lumen of the colon and detect abnormalities in the colon
wall. The observer used the physical computer interface to navigate through the set of axial images to perform
this task. The 3D fly-through study recorded observer eye-gaze whilst watching a recording of a computed flight
through the colon lumen. Unlike the axial view there was no computer control, so inspection of the colon surface
was dictated by the speed of flight through the colon.
Four observer groups with different levels of expertise were tested to determine the effect of feedback on eye movements and accuracy whilst performing a simple radiological task. The observer groups were 8 experts, 9 first year radiography students, 9 third year radiography students, and 10 naive observers (psychology students). The task was fracture detection in the wrist. A test bank of 32 films was compiled with 14 normals, 6 grade 1 fractures (subtle appearance), 6 grade 2 fractures, and 6 grade 3 fractures (obvious appearance). Eye tracking was carried out on all observers to demonstrate differences in visual activity. Observers were asked to rate their confidence in their decision on a ten-point scale. Feedback was then presented in the form of circles displayed on the film where fixations had occurred, with circle size proportional to fixation duration, and observers were asked to repeat their decision rating. Accuracy was determined by ROC analysis and the area under the curve (AUC). In two groups, the naive observers and the first year radiography students, feedback produced no significant difference in AUC. In the other two groups, the experts (p = 0.002) and the third year radiography students (p = 0.031), feedback had a negative effect on performance. Eye-tracking parameters were measured for all subjects and compared. This is work in progress, but initial analysis suggests that in a simple radiological task such as fracture detection, where search is very limited, feedback that encourages observers to look harder at the image can degrade interpretation performance. For novices, however, feedback appears beneficial: post-feedback eye-tracking parameters more closely matched those of the experts.
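The rating-based ROC analysis above can be sketched concisely: for confidence-rating data, the AUC is equivalent to the Mann-Whitney statistic, i.e. the probability that a randomly chosen abnormal film receives a higher rating than a randomly chosen normal film, with ties counted as one half. The sketch below illustrates that equivalence; the ratings are made-up values on a ten-point scale, not data from the study.

```python
def auc_from_ratings(abnormal, normal):
    """Area under the ROC curve from confidence ratings, computed
    as the Mann-Whitney statistic: the fraction of (abnormal, normal)
    case pairs in which the abnormal case is rated higher, with
    tied ratings counted as half a win."""
    wins = 0.0
    for a in abnormal:
        for n in normal:
            if a > n:
                wins += 1.0
            elif a == n:
                wins += 0.5
    return wins / (len(abnormal) * len(normal))

# Illustrative ten-point confidence ratings (not study data):
abnormal = [8, 9, 6, 7, 10, 5]   # ratings given to fracture films
normal = [3, 4, 2, 6, 1, 5]      # ratings given to normal films
print(round(auc_from_ratings(abnormal, normal), 3))  # → 0.944
```

An AUC of 0.5 corresponds to chance performance, and 1.0 to perfect separation of fracture from normal films, which is why a significant pre/post drop in AUC indicates that feedback harmed performance.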
This paper describes a software framework and analysis tool to support the collection and analysis of eye movement and perceptual feedback data for a variety of diagnostic imaging modalities. The framework allows the rapid creation of experiment software that can display a collection of medical images of a particular modality, capture eye trace data, and record marks added to an image by the observer, together with their final decision. There are also a number of visualisation techniques for the display of eye trace information. The analysis tool supports the comparison of individual eye traces for a particular observer, or of traces from multiple observers for a particular image. Saccade and fixation data can be visualised, with user control of fixation identification functions and properties. Observer markings are displayed, and predefined regions of interest are supported. The software also supports some interactive and multi-image modalities. The analysis tool includes a novel visualisation of scan paths across multi-image modalities: using an exploded 3D view of a stack of MRI scan sections, it can show an observer's scan path both inspecting individual sections and traversing between them.
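User-controllable fixation identification of the kind described above is commonly implemented with a dispersion-threshold (I-DT) algorithm. The sketch below is a generic illustration of that approach, not the tool's actual code; the threshold values are assumed defaults.

```python
def _dispersion(window):
    """Spread of a gaze window: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(samples, max_dispersion=30, min_samples=5):
    """Dispersion-threshold (I-DT) fixation identification.

    samples: gaze points (x, y) recorded at a fixed sample rate.
    Returns a list of (start_index, end_index, centroid) tuples,
    one per detected fixation; intervening samples are saccades.
    """
    fixations = []
    i = 0
    while i + min_samples <= len(samples):
        j = i + min_samples
        if _dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while the points stay tightly clustered.
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            window = samples[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((i, j - 1, (cx, cy)))
            i = j
        else:
            i += 1  # no fixation starts here; slide the window on
    return fixations
```

Fixation duration then follows from the index span and the tracker's sample rate; exposing the dispersion and duration thresholds (or swapping the function entirely) corresponds to the "user control of fixation identification functions and properties" described above.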