Real-time visual tracking of less textured three-dimensional objects on mobile platforms
Published: 3 January 2013
Byung-Kuk Seo, Jungsik Park, Hanhoon Park, Jong-Il Park
Optical Engineering, 51(12), 127202 (2013). doi:10.1117/1.OE.51.12.127202
Abstract
Natural feature-based approaches remain challenging for mobile applications (e.g., mobile augmented reality) because they are feasible only in limited environments, such as highly textured and planar scenes/objects, and they require powerful mobile hardware for fast and reliable tracking. In many cases where conventional approaches are not effective, three-dimensional (3-D) knowledge of the target scene is beneficial. We present a framework for real-time visual tracking of less textured 3-D objects on mobile platforms. The framework is based on model-based tracking that efficiently exploits partially known 3-D scene knowledge, such as object models and distinctive geometric or photometric knowledge of the background. We also elaborate on the implementation to make it suitable for real-time vision processing on mobile hardware. The performance of the framework is evaluated on recent commercially available smartphones, and its feasibility is demonstrated in real time.
© 2012 Society of Photo-Optical Instrumentation Engineers (SPIE)
Byung-Kuk Seo, Jungsik Park, Hanhoon Park, Jong-Il Park, "Real-time visual tracking of less textured three-dimensional objects on mobile platforms," Optical Engineering 51(12), 127202 (3 January 2013). https://doi.org/10.1117/1.OE.51.12.127202
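
Illustration of the per-frame pose update. The abstract describes model-based tracking that relies on a known 3-D object model rather than surface texture. The Python sketch below is only an illustration of the kind of pose-update step such trackers typically perform; it is not the authors' implementation. It assumes OpenCV, hypothetical camera intrinsics (K, dist), and that 2-D image correspondences for the 3-D model points are already available (in a real tracker these would come from, e.g., an edge search around the projected model), and it re-estimates the pose from the previous frame's estimate with cv2.solvePnP.

# Minimal sketch (not the authors' code): one pose-update step of a
# model-based 3-D tracker. Camera intrinsics and model/image points are
# hypothetical placeholders for illustration.
import numpy as np
import cv2

# Assumed pinhole camera intrinsics (hypothetical values).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume no lens distortion


def project_model(model_points, rvec, tvec):
    # Project 3-D model points into the image with the current pose,
    # e.g., to predict where the object's contours should appear.
    pts, _ = cv2.projectPoints(np.asarray(model_points, dtype=np.float64),
                               rvec, tvec, K, dist)
    return pts.reshape(-1, 2)


def update_pose(model_points, image_points, rvec, tvec):
    # Re-estimate the object pose from 3-D/2-D correspondences, starting
    # from the previous frame's pose as the initial guess.
    ok, rvec, tvec = cv2.solvePnP(np.asarray(model_points, dtype=np.float64),
                                  np.asarray(image_points, dtype=np.float64),
                                  K, dist, rvec, tvec,
                                  useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    return ok, rvec, tvec


if __name__ == "__main__":
    # Toy example: eight corners of a 10 cm cube, observed under a known
    # (synthetic) pose and re-estimated from a slightly perturbed pose.
    cube = np.array([[x, y, z] for x in (0.0, 0.1)
                     for y in (0.0, 0.1)
                     for z in (0.0, 0.1)], dtype=np.float64)
    true_rvec = np.array([[0.10], [0.20], [0.05]])
    true_tvec = np.array([[0.00], [0.00], [0.50]])
    observed = project_model(cube, true_rvec, true_tvec)

    ok, rvec, tvec = update_pose(cube, observed,
                                 true_rvec + 0.02, true_tvec + 0.01)
    print("converged:", ok, "estimated translation:", tvec.ravel())

Using the previous frame's pose as the initial guess (useExtrinsicGuess=True) keeps the per-frame refinement cheap, which is consistent with the paper's emphasis on real-time performance on mobile hardware; the actual method in the paper may differ.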
Journal article, 12 pages.