From Event: SPIE Commercial + Scientific Sensing and Imaging, 2017
Light-field content must provide a full-parallax 3D view with dense angular resolution. However, directly capturing such dense full-parallax view images with a camera system is difficult because it requires a specialised micro-lens array or a heavy camera-array system. We therefore present an algorithm that synthesises full-parallax virtual view images using image-based rendering, suited to light-field content generation. The proposed algorithm consists of four-directional image warping, view image blending using nearest-view-priority selection and a sum weighted by inverse Euclidean distance, and hole filling. Experimental results show that dense full-parallax virtual view images can be generated from sparse full-parallax view images with few image artefacts. Finally, it is confirmed that the proposed full-parallax view synthesis algorithm can be used for light-field content generation without a dense camera-array system.
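The blending step described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the NaN convention for disoccluded holes, and the treatment of every valid view (rather than the paper's exact nearest-view-priority rule) are assumptions; it shows only the inverse-Euclidean-distance weighting component, with hole filling left as a separate pass.

```python
import numpy as np

def blend_views(warped, positions, target_pos, eps=1e-6):
    """Hypothetical sketch of inverse-distance weighted view blending.

    warped:     list of HxW float arrays (warped reference views);
                disoccluded holes are marked as NaN.
    positions:  list of (x, y) camera positions on the view grid.
    target_pos: (x, y) position of the virtual view being synthesised.
    Returns the blended view; pixels with no valid source stay NaN
    (to be handled by a subsequent hole-filling step).
    """
    warped = [np.asarray(v, dtype=float) for v in warped]
    acc = np.zeros_like(warped[0])
    wsum = np.zeros_like(warped[0])
    for view, pos in zip(warped, positions):
        # Weight each view by the inverse Euclidean distance from the
        # virtual view position, so nearer views dominate the blend.
        d = np.hypot(pos[0] - target_pos[0], pos[1] - target_pos[1])
        weight = 1.0 / (d + eps)
        valid = ~np.isnan(view)          # skip hole pixels in this view
        acc[valid] += weight * view[valid]
        wsum[valid] += weight
    out = np.full_like(acc, np.nan)
    covered = wsum > 0
    out[covered] = acc[covered] / wsum[covered]
    return out
```

For a virtual view midway between two reference views, the weights are equal and the result is the per-pixel average; where one view has a hole, the other view's pixel is used at full weight.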
Youngsoo Park, Hong-chang Shin, Gwangsoon Lee, Won-sik Cheong, and Namho Hur, "Full-parallax virtual view image synthesis using image-based rendering for light-field content generation," Proc. SPIE 10219, Three-Dimensional Imaging, Visualization, and Display 2017, 102190E (Presented at SPIE Commercial + Scientific Sensing and Imaging: April 11, 2017; Published: 10 May 2017); https://doi.org/10.1117/12.2264596.