Paper
21 May 2004 Real-time capturing and interactive synthesis of 3D scenes using integral photography
Proceedings Volume 5291, Stereoscopic Displays and Virtual Reality Systems XI; (2004) https://doi.org/10.1117/12.529808
Event: Electronic Imaging 2004, 2004, San Jose, California, United States
Abstract
This paper proposes a system that captures a dynamic 3D scene and synthesizes arbitrary views of it in real time. The system consists of four components: a Fresnel lens, a micro-lens array, an IEEE 1394 digital camera, and a PC for rendering. The micro-lens array forms an image composed of a set of elemental images, in other words, multiple-viewpoint images of the scene. The Fresnel lens controls the depth of field by demagnifying the 3D scene. The problem is that the scene demagnified by the Fresnel lens is compressed along its optical axis; we therefore propose a method for recovering the original scene from the compressed one. The IEEE 1394 digital camera captures the multiple-viewpoint images at 15 frames per second and transfers them to the PC, which synthesizes any perspective of the captured scene using image-based rendering techniques. The proposed system synthesizes one perspective of the captured scene within 1/15 second, so a user can interactively move his/her viewpoint and observe even a moving object from various directions.
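The view-extraction step behind the abstract can be illustrated compactly: each micro-lens produces one elemental image, and sampling the same pixel offset from every elemental image assembles one orthographic view of the scene, with the offset selecting the viewing direction. The following is a minimal sketch of that idea, not the paper's implementation; the function name, the assumption of a regular lens grid aligned to the sensor, and the array layout are all hypothetical.

```python
import numpy as np

def extract_view(captured, lens_grid, offset):
    """Extract one orthographic view from an integral photograph.

    captured  : (H, W) array, the image formed behind the micro-lens
                array, tiled into lens_grid[0] x lens_grid[1]
                elemental images (assumed regular and axis-aligned).
    lens_grid : (ny, nx) number of micro-lenses vertically/horizontally.
    offset    : (dy, dx) pixel position inside each elemental image;
                a different offset corresponds to a different
                viewing direction.
    """
    ny, nx = lens_grid
    H, W = captured.shape
    sy, sx = H // ny, W // nx  # elemental image size in pixels
    dy, dx = offset
    # Take pixel (dy, dx) of every elemental image: one pixel per
    # micro-lens, yielding an (ny, nx) view of the scene.
    return captured[dy::sy, dx::sx][:ny, :nx]
```

Real-time operation then amounts to repeating this resampling (or an interpolated variant of it) for the user's current viewpoint on every captured frame.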
© (2004) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Tomoyuki Yamamoto and Takeshi Naemura "Real-time capturing and interactive synthesis of 3D scenes using integral photography", Proc. SPIE 5291, Stereoscopic Displays and Virtual Reality Systems XI, (21 May 2004); https://doi.org/10.1117/12.529808
PROCEEDINGS
12 PAGES