Optical blur can vary significantly across the image plane, even for constant camera settings and object depth. Existing approaches represent this spatially varying blur with a dense sampling of blur kernels across the image, where each kernel is defined independently of its neighbors. This approach requires a large amount of data collection, and the kernel estimates are less robust than they would be if knowledge of the relationship between adjacent kernels could be incorporated.
A novel parameterized model is presented that relates the blur kernels at different locations across the image plane. The model is motivated by well-established optical models, including the Seidel aberration model. The proposed model is shown to unify a set of hundreds of blur kernel observations across the image plane under a single 10-parameter model, and its accuracy is demonstrated with simulations and with measurement data collected by two separate research groups.
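The abstract does not give the model's functional form. As a rough illustration of what a 10-parameter, spatially varying blur model can look like, the sketch below evaluates an anisotropic Gaussian kernel whose second moments are low-order polynomials in field position; the Gaussian shape, the polynomial basis, and the function name are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def psf_at(x, y, params, size=15):
    """Evaluate a blur kernel at image-plane position (x, y).

    The kernel is an anisotropic Gaussian whose second moments are
    low-order polynomials in the squared field height r^2 -- a
    stand-in for a Seidel-motivated spatially varying PSF model.
    """
    r2 = x**2 + y**2
    a = params  # 10 parameters: three quadratics in r^2, plus a scale
    sxx = a[0] + a[1] * r2 + a[2] * r2**2
    syy = a[3] + a[4] * r2 + a[5] * r2**2
    sxy = (a[6] + a[7] * r2 + a[8] * r2**2) * x * y
    scale = a[9]
    u = np.arange(size) - size // 2
    U, V = np.meshgrid(u, u)
    icov = np.linalg.inv(np.array([[sxx, sxy], [sxy, syy]]))
    q = icov[0, 0] * U**2 + 2 * icov[0, 1] * U * V + icov[1, 1] * V**2
    k = scale * np.exp(-0.5 * q)
    return k / k.sum()  # normalize so the kernel conserves energy
```

Evaluating `psf_at` at many positions with one shared parameter vector reproduces the key idea: a single small parameter set describes the blur everywhere in the field.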
Proc. SPIE. 9029, Visual Information Processing and Communication V
KEYWORDS: Signal to noise ratio, Point spread functions, Cameras, Image processing, Image restoration, Interference (communication), Image sensors, Digital imaging, Digital image correlation, Device simulation
Established work in the literature has demonstrated that, with accurate knowledge of the corresponding blur kernel (or point spread function, PSF), an unblurred prior image can be reliably estimated from one or more blurred observations. It has also been demonstrated, however, that an incorrect PSF specification leads to inaccurate image restoration. In this paper, we present a novel metric relating the discrepancy between a known PSF and a chosen approximate PSF to the effect that this discrepancy will have on the reconstruction of an unblurred image. Such a metric is essential to the accurate development and application of a parameterized PSF model.

Several error measures are proposed that quantify the inaccuracy of image deblurring with a particular incorrect PSF. Using a set of simulation results, it is shown that the desired metric is feasible even without specification of the unblurred prior image or the radiometric response of the camera. It is also shown that the proposed metric accurately and reliably predicts the deblurring error that results from using an approximate PSF in place of the exact PSF.
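As one generic instance of such an error measure (not the paper's specific metric), the sketch below blurs an image with an exact PSF, restores it with both the exact and an approximate PSF via Wiener deconvolution, and reports the RMSE between the two restorations; the function names and the choice of Wiener filtering are assumptions for illustration.

```python
import numpy as np

def wiener_deconv(blurred, psf, nsr=1e-3):
    """Frequency-domain Wiener deconvolution (circular boundary model)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    F = np.conj(H) * G / (np.abs(H)**2 + nsr)
    return np.real(np.fft.ifft2(F))

def deblur_error(image, psf_true, psf_approx, nsr=1e-3):
    """RMSE between restorations using the exact and an approximate PSF --
    one simple stand-in for the error measures discussed in the paper."""
    H = np.fft.fft2(psf_true, s=image.shape)
    blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * H))
    ref = wiener_deconv(blurred, psf_true, nsr)
    est = wiener_deconv(blurred, psf_approx, nsr)
    return np.sqrt(np.mean((est - ref)**2))
```

By construction the error is zero when the approximate PSF equals the exact one and grows as the two diverge, which is the qualitative behavior a deblurring-error metric must exhibit.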
Optical blur due to lens aberrations and defocus has been demonstrated in some cases to be spatially varying across the image plane. However, existing models in the literature for the point-spread function (PSF) corresponding to this blur are either parameterized and spatially invariant, or spatially varying but ad hoc and discretely defined. A parameterized model is developed and presented for a spatially varying PSF due to lens aberrations and defocus in an imaging system. The model is motivated by an established theoretical framework in physics and is shown to unify a set of hundreds of PSF observations across the image plane into a single 10-parameter model. The accuracy of the model is demonstrated with simulations and measurement data collected by two separate research groups.
Contrary to common assumptions in the literature, the blur kernel corresponding to lens-effect blur has been
demonstrated to be spatially-varying across the image plane. Existing models for the corresponding point spread
function (PSF) are either parameterized and spatially-invariant, or spatially-varying but ad-hoc and discretely-defined.
In this paper, we develop and present a novel, spatially-varying, parameterized PSF model that accounts for
Seidel aberrations and defocus in an imaging system. We also demonstrate that the parameters of this model can
easily be determined from a set of discretely-defined PSF observations, and that the model accurately describes
the spatial variation of the PSF from a test camera.
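Determining the model parameters from a set of discretely defined PSF observations can, in the simplest case, reduce to linear least squares. The sketch below fits quadratic-in-r² polynomials to measured PSF second moments at sampled field positions; this moment-based parameterization and the function name are assumptions for illustration, not the paper's procedure.

```python
import numpy as np

def fit_moment_model(positions, moments):
    """Fit polynomials in squared field height r^2 to measured PSF second
    moments at discrete field positions, via linear least squares.

    positions: (n, 2) array of field positions.
    moments:   (n, k) array of measured moment values at each position.
    Returns a (3, k) coefficient array for the basis (1, r^2, r^4).
    """
    pos = np.asarray(positions, float)
    r2 = (pos**2).sum(axis=1)
    A = np.stack([np.ones_like(r2), r2, r2**2], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(moments, float), rcond=None)
    return coeffs
```

Because the model is linear in its coefficients, hundreds of observations collapse into a handful of numbers with a single solve; a real fit would additionally weight observations by their estimation confidence.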
Given a blurred image of a known test grid and an accurate estimate of the unblurred image, it has been demonstrated
that the underlying blur kernel (or point-spread function, PSF) can be reliably estimated. Unfortunately,
the estimate of the sharp image can be sensitive to common imperfections in the setup used to obtain the blurred
image, and errors in the image estimate result in an unreliable PSF estimate.
We propose a robust ad-hoc method to estimate a sharp prior image, given a blurry, noisy image of the test
grid from Joshi¹ taken in imperfect lab and lighting conditions. The proposed algorithm is able to reliably reject
superfluous image content, can deal with spatially-varying lighting, and is insensitive to errors in alignment of
the grid with the image plane.
We demonstrate the algorithm's performance through simulation and with a set of test images. We also show
that our grid registration algorithm leads to improved PSF estimation and deblurring compared to an affine
registration using spatially invariant lighting correction.
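The spatially varying lighting correction can be illustrated with a simple polynomial illumination fit. The sketch below, with a hypothetical `flatten_lighting` helper, fits a low-order 2-D polynomial to the image and divides it out; the paper's method is more robust to outliers and superfluous content than this minimal version.

```python
import numpy as np

def flatten_lighting(img, deg=2):
    """Remove a smooth, spatially varying illumination field by fitting a
    low-order 2-D polynomial to the image and dividing it out."""
    h, w = img.shape
    y, x = np.mgrid[0:h, 0:w]
    x = x / max(w - 1, 1)
    y = y / max(h - 1, 1)
    # Polynomial basis up to total degree `deg` in the normalized coords.
    cols = [x**i * y**j for i in range(deg + 1)
                        for j in range(deg + 1 - i)]
    A = np.stack([c.ravel() for c in cols], axis=1)
    coef, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    light = (A @ coef).reshape(h, w)
    return img / np.maximum(light, 1e-6)  # guard against division by zero
```

After flattening, a simple global threshold suffices to separate grid squares, where the raw image would require a spatially varying threshold.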
Noncontact measurements of flexible and moving structures face the challenge of obtaining high-speed, accurate,
and registered data over a long range. Many noncontact measurement methods rely on the object staying
aligned with the sensor, yet sometimes the desired loading results from the motion interacting with structural
dynamics, as is the case with aeroelasticity. Triangulation of video data can capture large-scale motion but
limits the speed and accuracy of the measurement. Laser vibrometry can capture minute structural vibrations
but must be aligned to the point of interest. This paper presents a method of registering a laser vibrometer
steering system to a motion capture system. The calibration rests on determining the location of the laser
steering system within the videogrammetry capture volume for dynamic in-flight tracking and measurement.
A method is presented for using video capture of the laser to determine registered lines through the capture
volume. The resulting calibration allows the laser to track within half a degree at distances over 4 m. The
laser can then be steered open loop, with static and dynamic accuracies reported. This system can provide
real-time structural awareness, enabling active control.
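One building block of such a registration is recovering the laser steering system's location from the registered lines observed in the capture volume, which is a standard least-squares line-intersection problem. The sketch below is illustrative; the function name and the point-plus-direction line parameterization are assumptions, not the paper's formulation.

```python
import numpy as np

def laser_origin(points, dirs):
    """Estimate the laser steering system's location as the least-squares
    intersection of registered lines (point p_i, direction d_i) observed
    in the motion-capture volume.

    Minimizes the sum of squared distances from the unknown point to
    each line by solving  (sum_i M_i) x = sum_i M_i p_i,  where
    M_i = I - d_i d_i^T projects orthogonally to line i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = np.asarray(d, float)
        d = d / np.linalg.norm(d)          # unit direction
        M = np.eye(3) - np.outer(d, d)     # projector orthogonal to line
        A += M
        b += M @ np.asarray(p, float)
    return np.linalg.solve(A, b)
```

With noisy measured lines the solve returns the point minimizing total squared distance to all lines, so accuracy improves as more registered lines are collected across the capture volume.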