Presentation + Paper
Using autofocus data for 3D hand interaction
12 March 2024
Chan-Yuan Huang, Chen-Han Lin, Homer H. Chen
Abstract
Hand tracking algorithms that rely on a single camera as the sensing device can provide only relative depth information, which limits their practicality. This limitation underscores the need for effective and accurate estimation of the real-world absolute distances between hand joints and the camera. We address this need by introducing a methodology that exploits the autofocus functionality of the camera for hand tracking. It taps the unused potential of the camera and removes the need for additional power-hungry and costly depth sensors to accurately estimate the absolute distances of hand joints. Our methodology undergoes rigorous experimental validation and consistently outperforms traditional methods across different lens positions.
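The abstract does not give implementation details, but the core idea, anchoring a single-camera hand tracker's relative depths to an absolute distance derived from the camera's autofocus state, can be illustrated with a small sketch. The snippet below is a hypothetical illustration only, not the authors' published algorithm: the calibration table mapping lens position to in-focus distance, the wrist-anchoring assumption, and all function names are assumptions introduced for clarity.

```python
# Illustrative sketch only -- not the method from the paper.
# Assumptions (hypothetical): the camera exposes its autofocus lens position,
# and a calibration table mapping lens position to in-focus object distance
# has been measured once with targets at known distances.
import numpy as np

# Hypothetical calibration data: lens positions (arbitrary driver units)
# versus the absolute distance (metres) at which objects are in focus.
LENS_POSITIONS = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
FOCUS_DISTANCES_M = np.array([0.20, 0.35, 0.60, 1.10, 2.00])

def lens_position_to_distance(lens_position: float) -> float:
    """Interpolate the calibration curve to get the absolute focus distance."""
    return float(np.interp(lens_position, LENS_POSITIONS, FOCUS_DISTANCES_M))

def to_absolute_depths(relative_depths: np.ndarray,
                       anchor_joint: int,
                       lens_position: float) -> np.ndarray:
    """Shift relative hand-joint depths (e.g. from a single-camera tracker)
    so that the anchor joint lies at the distance implied by the autofocus
    state, assuming the camera has locked focus on the hand."""
    anchor_distance = lens_position_to_distance(lens_position)
    offset = anchor_distance - relative_depths[anchor_joint]
    return relative_depths + offset

if __name__ == "__main__":
    # 21 hand joints with tracker-relative depths centred on the wrist (joint 0).
    rel = np.zeros(21)
    rel[1:] = np.random.uniform(-0.03, 0.05, size=20)  # a few cm of spread
    abs_depths = to_absolute_depths(rel, anchor_joint=0, lens_position=350.0)
    print("Estimated wrist distance: %.3f m" % abs_depths[0])
```

In this sketch the only new information used is the autofocus lens position, which is consistent with the abstract's claim that no additional depth sensor is required; how the paper actually combines autofocus data with the tracker is described in the full text.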
Conference Presentation
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Chan-Yuan Huang, Chen-Han Lin, and Homer H. Chen "Using autofocus data for 3D hand interaction", Proc. SPIE 12913, Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) V, 129130L (12 March 2024); https://doi.org/10.1117/12.3000801
KEYWORDS
Cameras
Distance measurement
Detection and tracking algorithms
3D tracking
Gesture recognition
Calibration
Virtual reality