Paper
23 June 2003
Real-time segmentation of video objects for mixed-reality interactive applications
Proceedings Volume 5150, Visual Communications and Image Processing 2003; (2003) https://doi.org/10.1117/12.509861
Event: Visual Communications and Image Processing 2003, 2003, Lugano, Switzerland
Abstract
The present paper introduces a very specific and pragmatic approach to segmentation, driven by a particular application context: within the framework of mixed reality, Transfiction ("transportation into fictional spaces") is designed to mix synthetic and natural images in real time while allowing users to interact with these input/output screens. Segmentation is therefore used to provide both immersion and interaction capabilities. The former is achieved by composing the image of the user into the projected virtual scenes, while the latter relies on basic body/gesture analysis of the segmented silhouettes. Depending on indoor or outdoor usage, two real-time techniques are developed. Results are analyzed with respect to the overall application, not only in terms of absolute quality but also in terms of perception by the users.
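The abstract describes two steps: extracting the user's silhouette from the camera feed, then compositing that silhouette into a rendered virtual scene. The paper's actual algorithms are not given in the abstract; the following is a minimal sketch of the general idea, assuming a simple background-subtraction segmentation (function names and the threshold value are illustrative, not from the paper):

```python
import numpy as np

def segment_silhouette(frame, background, threshold=30):
    # Pixels differing from a reference background image by more than
    # `threshold` are labeled foreground (the user's silhouette).
    # `frame` and `background` are 2-D grayscale uint8 arrays.
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

def compose(virtual_scene, frame, mask):
    # Immersion step: overlay the segmented user onto the virtual scene.
    out = virtual_scene.copy()
    out[mask] = frame[mask]
    return out

# Toy demo: a bright 4x4 "user" region in front of a dark background.
background = np.zeros((8, 8), dtype=np.uint8)
frame = background.copy()
frame[2:6, 2:6] = 200
mask = segment_silhouette(frame, background)
scene = np.full((8, 8), 50, dtype=np.uint8)   # stand-in virtual scene
result = compose(scene, frame, mask)
```

The resulting binary mask is also the input one would hand to the body/gesture analysis mentioned in the abstract (e.g., bounding box or centroid of the silhouette).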
© (2003) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Xavier Marichal and Toshiyuki Umeda "Real-time segmentation of video objects for mixed-reality interactive applications", Proc. SPIE 5150, Visual Communications and Image Processing 2003, (23 June 2003); https://doi.org/10.1117/12.509861
CITATIONS
Cited by 20 scholarly publications.
RIGHTS & PERMISSIONS
Get copyright permission on Copyright Marketplace
KEYWORDS
Image segmentation, Video, Cameras, Detection and tracking algorithms, Digital filtering, Visualization, Analytical research