Texture synthesis models and material perception in the visual periphery
Benjamin Balas
Abstract
The feature vocabularies underlying texture synthesis algorithms are increasingly being used to examine various aspects of human visual perception. These algorithms offer both a rich set of features that is typically sufficient to capture the appearance of complex natural inputs and a means of carrying out psychophysical experiments in which synthetic textures serve as a proxy for the summary-statistic transformations ostensibly carried out by the visual system when processing natural images. Texture synthesis algorithms have recently been applied successfully to a wide range of visual tasks, including texture perception, visual crowding, and visual search. Here, we used both nonparametric and parametric texture synthesis models to investigate the nature of material perception in the visual periphery. We asked participants to classify images of four natural materials (metal, stone, water, and wood) briefly presented in the visual periphery and compared the errors made under these viewing conditions to the errors made when judging the material category of synthetic images generated from the original targets. The confusions made in these two scenarios were substantially different, suggesting that these particular models fail to account for material perception in the periphery.
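The comparison described above hinges on contrasting error patterns (confusion matrices) across the two conditions. As a minimal illustrative sketch, not the authors' analysis code, one way to quantify how similar the confusions are between peripheral viewing of original images and judgments of synthesized images is to correlate the off-diagonal (error) entries of the two confusion matrices. The category labels, helper names, and toy trial data below are hypothetical placeholders.

```python
import numpy as np

# Material categories from the study (metal, stone, water, wood).
CATEGORIES = ["metal", "stone", "water", "wood"]

def confusion_matrix(true_labels, responses, categories=CATEGORIES):
    """Row-normalized confusion matrix: rows = true category, cols = response."""
    idx = {c: i for i, c in enumerate(categories)}
    counts = np.zeros((len(categories), len(categories)))
    for t, r in zip(true_labels, responses):
        counts[idx[t], idx[r]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def confusion_similarity(cm_a, cm_b):
    """Pearson correlation of the off-diagonal (error) entries of two matrices."""
    mask = ~np.eye(cm_a.shape[0], dtype=bool)
    return np.corrcoef(cm_a[mask], cm_b[mask])[0, 1]

# Toy usage with made-up trial data (placeholders, not real results):
true_cats       = ["metal", "stone", "water", "wood"] * 5
resp_peripheral = ["metal", "wood", "water", "stone"] * 5  # peripheral viewing of originals
resp_synth      = ["stone", "stone", "metal", "wood"] * 5  # judgments of synthesized images

cm_peripheral = confusion_matrix(true_cats, resp_peripheral)
cm_synth = confusion_matrix(true_cats, resp_synth)
print(confusion_similarity(cm_peripheral, cm_synth))
```

A low or negative correlation under this kind of analysis would indicate, as the abstract reports, that the errors made with synthetic images do not mirror the errors made when viewing the original materials in the periphery.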
© 2015 Society of Photo-Optical Instrumentation Engineers (SPIE).
Benjamin Balas, "Texture synthesis models and material perception in the visual periphery", Proc. SPIE 9394, Human Vision and Electronic Imaging XX, 93940H (17 March 2015); https://doi.org/10.1117/12.2084320