10 May 2017
Full-parallax virtual view image synthesis using image-based rendering for light-field content generation
Light-field content must provide full-parallax 3D views with dense angular resolution. However, directly capturing such dense full-parallax view images with a camera system is very difficult, as it requires specialised micro-lens arrays or a heavy camera-array rig. We therefore present an algorithm that synthesises full-parallax virtual view images using image-based rendering, suitable for light-field content generation. The proposed algorithm consists of four-directional image warping; view image blending based on nearest-view priority selection and weights derived from the inverse Euclidean distance, normalised by their sum; and hole filling. Experimental results show that dense full-parallax virtual view images can be generated from sparse full-parallax view images with few image artefacts. Finally, we confirm that the proposed full-parallax view synthesis algorithm can be used for light-field content generation without a dense camera-array system.
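The blending step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the camera-position representation, and the exact nearest-view tie-breaking are assumptions; the paper's precise weighting formula may differ. It blends several pre-warped view images using weights proportional to the inverse Euclidean distance between each source camera position and the target view position, with a nearest-view priority shortcut when the target coincides with a source view.

```python
import numpy as np

def blend_views(warped, positions, target_pos, eps=1e-6):
    """Blend pre-warped view images into one virtual view.

    warped     : list of H x W (or H x W x C) float arrays, already
                 warped toward the target view position.
    positions  : list of (x, y) source camera positions on the grid.
    target_pos : (x, y) position of the virtual view to synthesise.

    Hypothetical formulation: weights are inverse Euclidean
    distances normalised by their sum.
    """
    target = np.asarray(target_pos, dtype=float)
    dists = np.array([np.linalg.norm(np.asarray(p, dtype=float) - target)
                      for p in positions])

    # Nearest-view priority: if the target view coincides with a
    # source view, return that view directly.
    nearest = int(np.argmin(dists))
    if dists[nearest] < eps:
        return warped[nearest]

    weights = 1.0 / dists
    weights /= weights.sum()          # normalise by the sum of weights

    stack = np.stack([np.asarray(v, dtype=float) for v in warped])
    # Weighted sum over the view axis; holes left by warping would
    # still need a separate hole-filling pass afterwards.
    return np.tensordot(weights, stack, axes=1)
```

For example, blending four views at the corners of a unit cell for a target at the centre gives each view an equal weight of 0.25, while a target exactly on a source position returns that source image unchanged.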
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Youngsoo Park, Hong-chang Shin, Gwangsoon Lee, Won-sik Cheong, and Namho Hur, "Full-parallax virtual view image synthesis using image-based rendering for light-field content generation", Proc. SPIE 10219, Three-Dimensional Imaging, Visualization, and Display 2017, 102190E (10 May 2017).
