MARR: active vision model (26 September 1997)
Abstract
Earlier, a biologically plausible active vision model for multiresolutional attentional representation and recognition (MARR) was developed. The model is based on the scanpath theory of Noton and Stark and provides invariant recognition of gray-level images. In the present paper, we consider the algorithm for automatic formation of image viewing trajectories in the MARR model, the results of psychophysical experiments, and possible applications of the model. The algorithm for automatic viewing-trajectory formation is based on imitating the scanpath formed by a human operator. Several propositions about possible mechanisms for the consecutive selection of fixation points in human visual perception, inspired by computer simulation results and known psychophysical data, have been tested and confirmed in our psychophysical experiments. In particular, we have found that a gaze switch may be directed (1) to a peripheral part of the visual field that contains an edge oriented orthogonally to the edge at the current fixation point, and (2) to a peripheral part of the visual field containing crossing edges. These experimental results have been used to optimize the automatic image viewing algorithm in the MARR model. The modified model recognizes complex real-world images invariantly with respect to scale, shift, rotation, and illumination conditions, and partly with respect to viewpoint, and can be used to solve some robot vision tasks.
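The fixation-selection mechanism described above can be illustrated with a minimal sketch. This is not the authors' implementation: the candidate representation, scoring function, and weights are assumptions made for this example. It simply prefers peripheral candidates whose edge orientation is orthogonal to the edge at the current fixation point, with an extra bonus for crossing edges.

```python
# Illustrative sketch of next-fixation selection in the spirit of the MARR
# model's viewing-trajectory algorithm. All names, fields, and weights here
# are hypothetical, chosen only to demonstrate the two reported preferences:
# (1) orthogonal edge orientation and (2) crossing edges.

def next_fixation(current_orientation_deg, candidates):
    """Return the peripheral candidate with the highest attention score.

    Each candidate is a dict with:
      "pos"             -- image coordinates of the candidate point
      "orientation_deg" -- local edge orientation in degrees
      "crossing"        -- True if edges cross at this point
    """
    def score(c):
        # Angular difference between the candidate edge and the edge at
        # the current fixation point, folded into [0, 90] degrees.
        diff = abs(c["orientation_deg"] - current_orientation_deg) % 180
        diff = min(diff, 180 - diff)
        orthogonality = diff / 90.0          # 1.0 when exactly orthogonal
        crossing_bonus = 1.0 if c.get("crossing") else 0.0
        return orthogonality + crossing_bonus
    return max(candidates, key=score)

candidates = [
    {"pos": (10, 4), "orientation_deg": 5,  "crossing": False},  # parallel edge
    {"pos": (3, 12), "orientation_deg": 88, "crossing": False},  # near-orthogonal
    {"pos": (7, 7),  "orientation_deg": 45, "crossing": True},   # crossing edges
]
best = next_fixation(current_orientation_deg=0, candidates=candidates)
```

Here the crossing-edge candidate wins (score 0.5 + 1.0 = 1.5) over the near-orthogonal one (score about 0.98), reflecting the idea that both cues attract the gaze but are weighted against each other.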
© 1997 Society of Photo-Optical Instrumentation Engineers (SPIE).
Lubov N. Podladchikova, Valentina I. Gusakova, Dmitry G. Shaposhnikov, Alain Faure, Alexander V. Golovan, Natalia A. Shevtsova, "MARR: active vision model", Proc. SPIE 3208, Intelligent Robots and Computer Vision XVI: Algorithms, Techniques, Active Vision, and Materials Handling (26 September 1997); https://doi.org/10.1117/12.290313
Proceedings paper, 8 pages.

