SAM: an interoperable metadata model for multimodal surveillance applications
Peter Schallauer, Werner Bailer, Albert Hofmann, Roland Mörzinger
Abstract
Metadata interoperability is crucial for many kinds of surveillance applications and systems, e.g. metadata mining in multi-sensor environments, metadata exchange in networked camera systems, or information fusion in multi-sensor and multi-detector environments. Several metadata formats have been proposed to foster interoperability, but they show significant limitations: ViPER, CVML and the MPEG Visual Surveillance MAF support only the visual modality, CVML's frame-based approach leads to inefficient representation, and MPEG-7's comprehensiveness hinders its efficient use in a specific application. To overcome these limitations we propose the Surveillance Application Metadata (SAM) model, which describes online and offline analysis results as a set of time lines containing events. A set of sensors, detectors, recorded media items and object instances is described centrally and linked from the event descriptions. The time lines can be related to a subset of sensors and detectors, for any modality and at different levels of abstraction. Hierarchical classification schemes are used for many purposes, such as property types and their values, event types, object classes and coordinate systems, in order to allow application-specific adaptations without modifying the data model while ensuring the controlled use of terms. The model supports efficient representation of dense spatio-temporal information such as object trajectories. SAM is not bound to a specific serialization but can be mapped to different existing formats, within the limitations imposed by the target format. The SAM specification and examples have been made available.
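To make the structure sketched in the abstract more concrete, the following Python fragment shows one way such a description might be organized in code. It is a minimal illustration only: the class and field names (Sensor, Detector, Event, TimeLine, etc.) are assumptions chosen for this example and do not reflect the actual SAM specification or any of its serializations.

# Illustrative sketch only: class and field names are hypothetical assumptions,
# not the actual SAM schema. It mirrors the structure described in the abstract:
# sensors, detectors and object instances are described once, centrally, and
# events on per-detector time lines refer to them by identifier.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Sensor:
    sensor_id: str
    modality: str            # term from a modality classification scheme, e.g. "visual", "audio"

@dataclass
class Detector:
    detector_id: str
    sensor_ids: List[str]    # a detector may combine several sensors/modalities
    event_type: str          # term from a hierarchical event-type classification scheme

@dataclass
class ObjectInstance:
    object_id: str
    object_class: str        # term from an object-class classification scheme

@dataclass
class Event:
    start_ms: int
    end_ms: int
    object_ids: List[str]    # links into the centrally described object instances
    # dense spatio-temporal data, e.g. a trajectory sampled per frame: (time_ms, x, y)
    trajectory: List[Tuple[int, float, float]] = field(default_factory=list)

@dataclass
class TimeLine:
    detector_id: str         # a time line relates to a subset of sensors/detectors
    events: List[Event] = field(default_factory=list)

@dataclass
class SurveillanceDescription:
    sensors: Dict[str, Sensor]
    detectors: Dict[str, Detector]
    objects: Dict[str, ObjectInstance]
    time_lines: List[TimeLine]

# Minimal usage example: one camera, one person detector, one tracked object.
description = SurveillanceDescription(
    sensors={"cam1": Sensor("cam1", "visual")},
    detectors={"det1": Detector("det1", ["cam1"], "PersonDetected")},
    objects={"obj1": ObjectInstance("obj1", "Person")},
    time_lines=[TimeLine("det1", [
        Event(0, 2000, ["obj1"], trajectory=[(0, 12.0, 34.0), (40, 12.5, 34.2)]),
    ])],
)

Mapping such an in-memory structure to XML or another target format would be a separate serialization step, consistent with the abstract's statement that SAM is not bound to a specific serialization.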
© 2009 Society of Photo-Optical Instrumentation Engineers (SPIE).
Peter Schallauer, Werner Bailer, Albert Hofmann, Roland Mörzinger, "SAM: an interoperable metadata model for multimodal surveillance applications", Proc. SPIE 7344, Data Mining, Intrusion Detection, Information Security and Assurance, and Data Networks Security 2009, 73440C (13 April 2009); https://doi.org/10.1117/12.818481