Utilization-based object recognition in confined spaces
Abstract
Recognizing substantially occluded objects in confined spaces is a very challenging problem for ground-based persistent surveillance systems. In this paper, we discuss the ontology inference of occluded object recognition in the context of in-vehicle group activities (IVGA) and describe an approach that we refer to as the utilization-based object recognition method. We examine the performance of three types of classifiers tailored for the recognition of objects with partial visibility: (1) a Hausdorff Distance classifier, (2) a Hamming Network classifier, and (3) a Recurrent Neural Network classifier. To train these classifiers, we generated multiple imagery datasets containing a mixture of common objects that appear inside a vehicle with full or partial visibility and occlusion. To generate dynamic interactions between multiple people, we model the IVGA scenarios in a virtual simulation environment, in which a number of simulated actors perform a variety of IVGA tasks independently or jointly. This virtual simulation engine produces the imagery datasets needed to verify and validate the efficiency and effectiveness of the selected object recognizers. Finally, we improve the performance of these object recognizers by incorporating human gestural information that differentiates how objects are utilized or handled, through the analysis of dynamic human-object interactions (HOI), human-human interactions (HHI), and human-vehicle interactions (HVI) in the context of IVGA.
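The abstract names a Hausdorff Distance classifier as one of the three recognizers but, being an abstract, gives no implementation details. The sketch below is only a minimal illustration of that general idea, not the authors' method: it assumes an object silhouette is represented as a set of 2-D edge points and assigns the label of the template whose symmetric Hausdorff distance to the query is smallest. All function names and the edge-point representation are assumptions made for illustration.

```python
import numpy as np

def directed_hausdorff(a, b):
    """Directed Hausdorff distance from point set a to point set b.
    a: (N, 2) array of 2-D edge points; b: (M, 2) array of 2-D edge points."""
    # For each point in a, find the distance to its nearest neighbour in b,
    # then take the largest of those nearest-neighbour distances.
    diffs = a[:, None, :] - b[None, :, :]      # shape (N, M, 2)
    dists = np.linalg.norm(diffs, axis=2)      # shape (N, M)
    return dists.min(axis=1).max()

def hausdorff_distance(query_pts, template_pts):
    """Symmetric Hausdorff distance between a query silhouette and a template."""
    return max(directed_hausdorff(query_pts, template_pts),
               directed_hausdorff(template_pts, query_pts))

def classify(query_pts, templates):
    """Assign the query to the template class with the smallest Hausdorff distance.
    templates: dict mapping class label -> (M, 2) array of template edge points."""
    return min(templates, key=lambda label: hausdorff_distance(query_pts, templates[label]))

if __name__ == "__main__":
    # Toy usage: two synthetic templates and a noisy, partially visible query.
    rng = np.random.default_rng(0)
    square = np.array([[x, y] for x in range(10) for y in (0, 9)], dtype=float)
    line = np.array([[x, 5.0] for x in range(10)])
    query = square[: len(square) // 2] + rng.normal(0, 0.1, (len(square) // 2, 2))
    print(classify(query, {"square": square, "line": line}))  # expected: "square"
```

Because the directed distance only requires every query point to lie near some template point, this kind of matcher degrades gracefully when part of the object is hidden, which is consistent with the abstract's focus on partially visible objects; the paper's actual feature extraction and training pipeline may differ.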
Amir Shirkhodaie, Durga Telagamsetti, and Alex L. Chan, "Utilization-based object recognition in confined spaces", Proc. SPIE 10200, Signal Processing, Sensor/Information Fusion, and Target Recognition XXVI, 1020013 (2 May 2017); https://doi.org/10.1117/12.2266223