3 January 2020 Research on still image activity recognition based on deep learning
Proceedings Volume 11373, Eleventh International Conference on Graphics and Image Processing (ICGIP 2019); 113730E (2020) https://doi.org/10.1117/12.2557257
Event: Eleventh International Conference on Graphics and Image Processing, 2019, Hangzhou, China
Abstract
The purpose of image activity recognition is to understand the human visual behavior presented in an image and to mark the recognized activity with a pre-defined category label. Still image activity recognition plays an important role in the field of image recognition, but its performance has not yet reached a satisfactory level. In this paper, we construct a hybrid recognition model combining Region-CNN (RCNN), AlexNet, and an SVM (or Random Forest) model, which effectively improves performance over each single method. First, the main object regions in an image are extracted using RCNN; then, for each object region, AlexNet is applied to extract features; finally, all the object features are concatenated and used to train an SVM (or Random Forest) classifier for still image activity recognition. Experimental results on a still image dataset containing 40 activity categories show that the hybrid model outperforms a single CNN model and other traditional methods: AlexNet alone achieves an accuracy of 69.48%, the hybrid RCNN + AlexNet + SVM model achieves 75.48%, and the hybrid RCNN + AlexNet + RF model reaches 78.15%, which verifies the effectiveness of our method.
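The final stage of the pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random vectors stand in for per-region AlexNet features (in the real pipeline these would be, e.g., 4096-dimensional fc7 activations extracted from RCNN-proposed regions), and the feature dimension, region count, and class count are arbitrary toy values. The sketch only shows the concatenate-then-classify step with both an SVM and a Random Forest, as the paper describes.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy stand-in for the paper's setup: each image yields a fixed number of
# object regions (from RCNN), and each region yields a CNN feature vector
# (from AlexNet). Values below are illustrative, not from the paper.
n_classes, n_regions, feat_dim, per_class = 3, 2, 64, 30

X, y = [], []
for c in range(n_classes):
    for _ in range(per_class):
        # One synthetic feature vector per region, shifted by class so the
        # toy problem is separable (a proxy for real AlexNet features).
        region_feats = [rng.normal(loc=c, scale=1.0, size=feat_dim)
                        for _ in range(n_regions)]
        # Concatenate all region features into one image-level descriptor.
        X.append(np.concatenate(region_feats))
        y.append(c)
X, y = np.array(X), np.array(y)

# Train the two classifier heads the paper compares: SVM and Random Forest.
svm = SVC(kernel="linear").fit(X, y)
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(svm.score(X, y), rf.score(X, y))
```

On this separable toy data both classifiers fit the training set almost perfectly; the paper's reported gap between SVM (75.48%) and RF (78.15%) of course only emerges on the real 40-category dataset.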
© (2020) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yue Guo, Jiale Yu, and Wei Wu "Research on still image activity recognition based on deep learning", Proc. SPIE 11373, Eleventh International Conference on Graphics and Image Processing (ICGIP 2019), 113730E (3 January 2020); https://doi.org/10.1117/12.2557257
PROCEEDINGS
8 PAGES

