A robust eye position tracker based on invariant local features, eye motion, and infrared-eye responses
Abstract
Despite the efforts made in the eye tracking community, the eye position tracking problem is not yet completely solved, owing to the large variations in eye appearance. This paper describes a multi-modal eye position tracker operating on dark/bright pupil image sequences. The tracking algorithm consists of detecting meaningful particles that correspond to IR-pupil responses and eye motion, filtering of particles through appearance models in the local invariant descriptor space, and matching of eye neighbors. Experimental validations have shown satisfactory performance in terms of precision of eye position estimation, and robustness to 2D head rotations, translations, closed-eye states, and reasonable out-of-plane rotations.
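The dark/bright pupil cue mentioned in the abstract relies on the fact that the pupil appears bright under on-axis IR illumination and dark under off-axis illumination, so differencing the two frames highlights candidate pupil regions. The following Python sketch illustrates one possible implementation of that particle-detection step only; the function name, thresholds, and blob-area limits are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import ndimage

def detect_pupil_candidates(bright_frame, dark_frame,
                            diff_threshold=40, min_area=10, max_area=400):
    """Hypothetical sketch of dark/bright IR pupil differencing.

    bright_frame, dark_frame: 2D uint8 arrays captured under on-axis and
    off-axis IR illumination. Thresholds are illustrative, not from the paper.
    """
    # Difference image: the pupil is bright only in the on-axis frame.
    diff = bright_frame.astype(np.int16) - dark_frame.astype(np.int16)
    mask = diff > diff_threshold

    # Label connected components; each surviving blob is one candidate
    # "particle" (a possible pupil location) that a later stage could score
    # with appearance models in a local invariant descriptor space.
    labels, n = ndimage.label(mask)
    candidates = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        area = ys.size
        if min_area <= area <= max_area:  # reject noise and large reflections
            candidates.append((xs.mean(), ys.mean(), area))
    return candidates
```

In such a pipeline, the returned candidate centroids would typically seed the particle set that the tracker then filters against the learned eye-appearance model.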
© (2005) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Riad I. Hammoud, "A robust eye position tracker based on invariant local features, eye motion, and infrared-eye responses," Proc. SPIE 5807, Automatic Target Recognition XV, (19 May 2005); https://doi.org/10.1117/12.606117

