Virtual view generation of natural panorama scenes by setting representation
23 May 2002
This paper proposes a technique for generating virtual views of a natural panorama scene. The scene is captured with an original three-camera system, the images are stitched into a stereo panorama, and depth is estimated. The texture panorama is segmented into regions, each of which can be approximated by a plane. The planar parameter set of each region, which constitutes the setting representation, is computed from the depth data. According to this representation, virtual views are generated from the center panorama texture, with the left and right panoramas used for occlusion compensation.
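The per-region planar parameter set described above can be illustrated by a standard least-squares plane fit to the depth samples of one segmented region. This is only a minimal sketch of the general idea; the paper's actual fitting procedure, parameterization, and region segmentation are not specified in the abstract, and the function and variable names here are hypothetical.

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane z = a*x + b*y + c to depth samples.

    points: (N, 3) array of (x, y, depth) samples from one segmented
    region of the texture panorama.
    Returns the planar parameter set (a, b, c).
    """
    pts = np.asarray(points, dtype=float)
    # Design matrix [x, y, 1] for the linear model z = a*x + b*y + c.
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs

# Example: samples lying exactly on the plane z = 2x - y + 3.
xs, ys = np.meshgrid(np.arange(4), np.arange(4))
zs = 2 * xs - ys + 3
a, b, c = fit_plane(np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()]))
```

Given the fitted parameters, a virtual view could be synthesized by reprojecting each region's plane into the new camera pose and sampling the center panorama texture, falling back to the left and right panoramas where the center view is occluded.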
© (2002) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Kunio Yamada, Kenji Mochizuki, Takeshi Naemura, Kiyoharu Aizawa, Takahiro Saito, "Virtual view generation of natural panorama scenes by setting representation", Proc. SPIE 4660, Stereoscopic Displays and Virtual Reality Systems IX, (23 May 2002); https://doi.org/10.1117/12.468043

