Robust hand detection and classification is one of the most essential tasks in sign language recognition. However, the problem is highly challenging due to the complexity of hands in sign language: the performance of existing approaches is easily affected by the numerous variations of sign language gestures, small and unobtrusive hand regions, and constantly changing hand locations. In this paper, to robustly detect and classify hands in sign language, which are small objects that nevertheless carry rich information, we propose an improved Faster R-CNN approach, named Multi-scale Faster R-CNN. Our approach extends the Faster R-CNN framework with a multi-scale strategy that incorporates hierarchical convolutional feature maps. We evaluate our approach on a self-built sign language dataset, and the experimental results demonstrate the effectiveness of the proposed approach.
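The abstract does not specify how the hierarchical convolutional feature maps are combined; a common multi-scale strategy is to upsample the coarser (deeper) maps to the finest resolution, normalize each map, and concatenate them along the channel axis so small objects benefit from high-resolution shallow features. The sketch below illustrates that idea with NumPy; the nearest-neighbor upsampling and per-location L2 normalization are illustrative assumptions, not necessarily the paper's exact method.

```python
import numpy as np

def upsample_nn(fmap, target_hw):
    """Nearest-neighbor upsample a (C, H, W) feature map to target (H, W)."""
    c, h, w = fmap.shape
    th, tw = target_hw
    rows = np.arange(th) * h // th   # source row index for each target row
    cols = np.arange(tw) * w // tw   # source col index for each target col
    return fmap[:, rows[:, None], cols[None, :]]

def fuse_multiscale(feature_maps):
    """Fuse hierarchical feature maps: upsample all maps to the finest
    spatial resolution, L2-normalize each per spatial location, then
    concatenate along the channel axis."""
    target_hw = max((f.shape[1], f.shape[2]) for f in feature_maps)
    fused = []
    for f in feature_maps:
        f = upsample_nn(f, target_hw)
        # Per-location L2 norm over channels keeps scales comparable
        norm = np.linalg.norm(f, axis=0, keepdims=True) + 1e-8
        fused.append(f / norm)
    return np.concatenate(fused, axis=0)

# Toy example: three hypothetical conv stages at decreasing resolution
conv3 = np.random.rand(4, 8, 8)
conv4 = np.random.rand(8, 4, 4)
conv5 = np.random.rand(8, 2, 2)
fused = fuse_multiscale([conv3, conv4, conv5])
print(fused.shape)  # (20, 8, 8): channels stacked at the finest resolution
```

The fused map could then feed a region proposal network as in Faster R-CNN, giving proposals for small hand regions access to finer spatial detail than the deepest layer alone provides.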