Capturing the plenoptic function in a swipe
27 September 2016
Abstract
Blur caused by camera motion is typically regarded as a problem. The approach described in this paper shows instead that the blur arising from the integration of light rays at different positions along a moving camera's trajectory can be used to extract information about the light rays present in the scene. Retrieving the light rays of a scene from different viewpoints is equivalent to retrieving its plenoptic function. In this paper, we focus on a specific case in which the blurred image of a scene containing a flat plane, textured with a sum of sine waves, is analysed to recreate the plenoptic function. The image is captured by a single-lens camera with its shutter open, moving in a straight line between two points, resulting in a swiped image. It is shown that finite rate of innovation sampling theory can be used to recover the scene geometry, and therefore the epipolar plane image, from the single swiped image. This epipolar plane image can then be used to generate unblurred images for a given camera location.
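The blur model underlying the abstract can be illustrated with a minimal sketch. Assumptions (illustrative, not taken from the paper): the texture is a single sinusoid, the camera translates at constant speed `v` for exposure time `T`, and the swiped image is the time average of the translated views seen during the exposure. Under these assumptions, integrating a sinusoid along the camera path yields another sinusoid of the same frequency, attenuated by a sinc factor and phase-shifted by half the swipe, which is one way to see why sinusoidal texture parameters survive the motion blur:

```python
import numpy as np

# Scene texture: one sine wave on a fronto-parallel plane
# (illustrative parameters, not from the paper).
omega = 2 * np.pi * 5   # spatial frequency of the texture
v, T = 0.1, 1.0         # camera speed and exposure ("swipe") duration

x = np.linspace(0.0, 1.0, 1000, endpoint=False)

# Swiped image: time-average of the translated views over the
# open-shutter interval [0, T] (midpoint rule).
N = 500
ts = (np.arange(N) + 0.5) * T / N
swiped = np.mean([np.sin(omega * (x - v * t)) for t in ts], axis=0)

# Closed form: same frequency, amplitude scaled by sinc(omega*v*T/2),
# phase shifted by half the swipe.
u = omega * v * T / 2
closed_form = np.sinc(u / np.pi) * np.sin(omega * x - u)  # np.sinc(z) = sin(pi z)/(pi z)

assert np.allclose(swiped, closed_form, atol=1e-4)
```

In this toy setting, the blur acts as a known linear filter on the texture, so frequency, amplitude, and phase remain recoverable from the blurred observation; the paper's contribution is recovering the full scene geometry from such a swiped image via finite rate of innovation sampling.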
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Michael Lawson, Mike Brookes, Pier Luigi Dragotti, "Capturing the plenoptic function in a swipe", Proc. SPIE 9971, Applications of Digital Image Processing XXXIX, 99710O (27 September 2016); https://doi.org/10.1117/12.2236981
Proceedings paper, 8 pages + conference presentation.
