Combining point context and dynamic time warping for online gesture recognition
Abstract
Previous gesture recognition methods usually focused on recognizing gestures after the entire gesture sequence was obtained. However, in many practical applications, a system must identify gestures before they end in order to give instant feedback. We present an online gesture recognition approach that can realize early recognition of unfinished gestures with low latency. First, a curvature buffer-based point context (CBPC) descriptor is proposed to extract the shape feature of a gesture trajectory. The CBPC descriptor is a complete descriptor with simple computation, and is therefore well suited to online scenarios. Then, we introduce an online windowed dynamic time warping algorithm to realize online matching between the ongoing gesture and the template gestures. In the algorithm, computational complexity is effectively decreased by adding a sliding window to the accumulative distance matrix. Finally, experiments are conducted on the Australian sign language data set and the Kinect hand gesture (KHG) data set. Results show that the proposed method outperforms other state-of-the-art methods, especially when gesture information is incomplete.
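The windowing idea described above can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes 1-D sequences and an absolute-difference local cost, and uses a band constraint so that only cells of the accumulative distance matrix within the window of the diagonal are computed, reducing the cost from O(nm) to roughly O(n·w):

```python
def windowed_dtw(seq_a, seq_b, window=10):
    """Dynamic time warping with a band constraint: only cells with
    |i - j| <= window in the accumulative distance matrix are filled,
    which lowers complexity from O(n*m) to about O(n*window)."""
    n, m = len(seq_a), len(seq_b)
    w = max(window, abs(n - m))  # the window must still cover the diagonal
    INF = float("inf")
    # D[i][j] = minimal accumulated cost of aligning seq_a[:i] with seq_b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])  # local distance
            D[i][j] = cost + min(D[i - 1][j],        # insertion
                                 D[i][j - 1],        # deletion
                                 D[i - 1][j - 1])    # match
    return D[n][m]
```

In an online setting, a new column of this matrix would be appended as each trajectory point arrives, and columns that fall outside the sliding window could be discarded.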
© 2017 SPIE and IS&T
Xia Mao, Chen Li, "Combining point context and dynamic time warping for online gesture recognition," Journal of Electronic Imaging 26(3), 033023 (14 June 2017). https://doi.org/10.1117/1.JEI.26.3.033023
Received: 11 January 2017; Accepted: 23 May 2017; Published: 14 June 2017
Journal article, 9 pages.