13 January 2012 Fusion of inertial and vision data for accurate tracking
We present a sensor fusion framework for real-time tracking applications that combines inertial sensors with a camera. To make clear how to exploit the information in the inertial sensor, two different fusion models, a gyroscopes-only model and an accelerometers model, are presented within an extended Kalman filter framework. The gyroscopes-only model uses gyroscope measurements to support vision-based tracking without considering acceleration measurements. The accelerometers model utilizes measurements from the gyroscopes, the accelerometers, and the vision data to estimate the camera pose, velocity, acceleration, and sensor biases. Experiments on synthetic data and real image sequences show dramatic improvements in tracking stability and in the robustness of the estimated motion parameters for the gyroscopes-only model when the accelerometer measurements exhibit drift.
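To illustrate the kind of gyro-plus-vision fusion the abstract describes, the sketch below runs a minimal Kalman filter on a 1-D orientation state with a gyroscope bias. This is an assumed, simplified illustration, not the paper's actual filter: the paper estimates full 6-DOF camera pose under an extended Kalman filter, while here the state, noise values, and measurement model are all chosen for readability.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's implementation): a Kalman filter
# fusing gyroscope rate integration with lower-rate vision-based orientation
# fixes. State x = [theta, gyro_bias]; all noise parameters are illustrative.

def predict(x, P, gyro_rate, dt, q_theta=1e-4, q_bias=1e-6):
    """Propagate orientation using the bias-corrected gyro rate."""
    F = np.array([[1.0, -dt], [0.0, 1.0]])
    x = np.array([x[0] + (gyro_rate - x[1]) * dt, x[1]])
    P = F @ P @ F.T + np.diag([q_theta, q_bias])
    return x, P

def update(x, P, theta_vision, r_vision=1e-3):
    """Correct the state with a vision-based orientation measurement."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r_vision
    K = (P @ H.T) / S                      # Kalman gain, shape (2, 1)
    x = x + (K * (theta_vision - x[0])).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: constant true rate, constant gyro bias, noisy vision fixes.
rng = np.random.default_rng(0)
dt, true_bias, true_rate = 0.01, 0.05, 0.5
x, P = np.zeros(2), np.eye(2)
theta_true = 0.0
for k in range(2000):
    theta_true += true_rate * dt
    gyro = true_rate + true_bias + rng.normal(0.0, 0.01)
    x, P = predict(x, P, gyro, dt)
    if k % 10 == 0:  # vision fixes arrive at a lower rate than the gyro
        x, P = update(x, P, theta_true + rng.normal(0.0, 0.005))

print("estimated bias:", round(x[1], 3))  # should approach true_bias
```

The vision measurements make the gyro bias observable, so the filter's bias estimate converges toward the true value over the run; this mirrors the abstract's point that fusing vision with inertial data stabilizes the estimated motion parameters when the inertial measurements drift.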
© (2012) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jing Chen, Wei Liu, Yongtian Wang, and Junwei Guo "Fusion of inertial and vision data for accurate tracking", Proc. SPIE 8349, Fourth International Conference on Machine Vision (ICMV 2011): Machine Vision, Image Processing, and Pattern Analysis, 83491D (13 January 2012);
