Action recognition by mid-level discriminative spatial-temporal volume
27 October 2013
Proceedings Volume 8919, MIPPR 2013: Pattern Recognition and Computer Vision; 89190H (2013) https://doi.org/10.1117/12.2031129
Event: Eighth International Symposium on Multispectral Image Processing and Pattern Recognition, 2013, Wuhan, China
Abstract
Most recent work on action recognition in video employs action parts, attributes, and similar constructs as mid- and high-level features to represent an action. However, such action parts and attributes suffer from weak discriminative power and are difficult to obtain. In this paper, we present an approach that uses mid-level discriminative spatial-temporal volumes to recognize human actions. Each spatial-temporal volume is represented by a feature graph constructed over a local collection of feature points (e.g., cuboids, STIPs) located in the corresponding volume. First, we densely sample spatial-temporal volumes from training videos and construct a feature graph for each volume. Then, all feature graphs are clustered with a spectral clustering method; during clustering, the distance between two feature graphs is computed with an efficient spectral method. We treat the resulting feature graphs as video words and characterize videos within the bag-of-features framework, which we call the bag-of-feature-graphs framework. Final recognition is accomplished with a linear SVM classifier. We evaluate our algorithm on a publicly available human action dataset, and the experimental results show the effectiveness of our method.
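The paper does not specify its spectral distance in the abstract, so the following is only a minimal sketch of one common way to compare two graphs spectrally: represent each feature graph by the sorted eigenvalues of its graph Laplacian and take the Euclidean distance between the (padded or truncated) spectra. All function names and the choice of Laplacian spectrum are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def graph_spectrum(adj, k=3):
    """Sorted eigenvalues of the graph Laplacian L = D - A,
    padded with zeros / truncated to a fixed length k.
    `adj` is a symmetric adjacency matrix over the local feature
    points of one spatial-temporal volume (illustrative)."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    ev = np.sort(np.linalg.eigvalsh(lap))
    out = np.zeros(k)
    n = min(k, len(ev))
    out[:n] = ev[:n]
    return out

def spectral_distance(adj_a, adj_b, k=3):
    """Euclidean distance between the truncated Laplacian spectra
    of two feature graphs (one assumed variant of a spectral
    graph distance; invariant to node relabeling)."""
    return float(np.linalg.norm(graph_spectrum(adj_a, k) - graph_spectrum(adj_b, k)))

# Two toy feature graphs over three feature points:
# g_path is a 3-node path, g_tri is a triangle.
g_path = np.array([[0, 1, 1],
                   [1, 0, 0],
                   [1, 0, 0]], dtype=float)
g_tri = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]], dtype=float)

print(spectral_distance(g_path, g_path))  # identical graphs -> 0.0
print(spectral_distance(g_path, g_tri))
```

In a bag-of-feature-graphs pipeline, a pairwise distance of this kind would feed the affinity matrix for spectral clustering of all sampled feature graphs, and cluster assignments would then play the role of video words in a standard bag-of-features histogram.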
© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Feifei Chen, Nong Sang, "Action recognition by mid-level discriminative spatial-temporal volume", Proc. SPIE 8919, MIPPR 2013: Pattern Recognition and Computer Vision, 89190H (27 October 2013); https://doi.org/10.1117/12.2031129
PROCEEDINGS
6 PAGES