Saliency location based on color contrast
Published: 16 April 2014
Authors: Suchat Vijanprecha, Pakaket Wattuya
Proceedings Volume 9159, Sixth International Conference on Digital Image Processing (ICDIP 2014); 91591O (2014) https://doi.org/10.1117/12.2064429
Event: Sixth International Conference on Digital Image Processing, 2014, Athens, Greece
Abstract
Generally, the purposes of saliency detection models for salient object detection and for fixation prediction are complementary. Models for salient object detection aim to discover as many true positives as possible, while models for fixation prediction aim to generate as few false positives as possible. In this work, we attempt to combine their strengths. We accomplish this by, first, replacing the high-level features frequently used in fixation prediction models with our new saliency location map, in order to make the model more general. Second, we train a saliency detection model on human eye-tracking data so that the model corresponds well to human eye fixations (without the use of top-down attention). We evaluate the performance of our new saliency location map on both salient object detection and fixation prediction datasets, in comparison with six state-of-the-art saliency detection models. The experimental results show that our proposed method outperforms the other methods on salient object detection on the MSRA dataset [1]. For fixation prediction, the results show that our saliency location map performs comparably to the high-level features while requiring much less computation time.
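The abstract does not give implementation details, but the kind of global color-contrast map the title suggests can be illustrated with a minimal sketch. The sketch below assumes a frequency-tuned-style measure (each pixel's saliency is its Lab-space distance from the image's mean color); the function name, blur kernel, and color space are illustrative assumptions, not the authors' actual method.

# Minimal sketch of a global color-contrast saliency map.
# Assumption: frequency-tuned-style contrast (per-pixel distance from
# the mean Lab color of a slightly blurred image); this is NOT the
# authors' exact algorithm, only an illustration of the general idea.
import numpy as np
import cv2

def color_contrast_saliency(bgr: np.ndarray) -> np.ndarray:
    """Return a saliency map in [0, 1] from global color contrast."""
    # A light blur suppresses high-frequency noise before measuring contrast.
    blurred = cv2.GaussianBlur(bgr, (5, 5), 0)
    lab = cv2.cvtColor(blurred, cv2.COLOR_BGR2LAB).astype(np.float64)
    # Each pixel's saliency is its Euclidean distance from the mean color.
    mean_color = lab.reshape(-1, 3).mean(axis=0)
    saliency = np.linalg.norm(lab - mean_color, axis=2)
    return saliency / (saliency.max() + 1e-12)  # normalize to [0, 1]

if __name__ == "__main__":
    img = cv2.imread("example.jpg")          # hypothetical input image
    sal = color_contrast_saliency(img)
    cv2.imwrite("saliency.png", (sal * 255).astype(np.uint8))

Because this measure uses only low-level color statistics, it is cheap to compute, which is consistent with the abstract's claim that the saliency location map requires much less computation time than high-level features.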
© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Suchat Vijanprecha and Pakaket Wattuya, "Saliency location based on color contrast", Proc. SPIE 9159, Sixth International Conference on Digital Image Processing (ICDIP 2014), 91591O (16 April 2014); https://doi.org/10.1117/12.2064429
Proceedings paper, 5 pages.

