Motion blur estimation based on multitarget matching model
Optical Engineering, 55(10), 100502 (2016). doi:10.1117/1.OE.55.10.100502
We propose a new method to estimate motion blur parameters based on the autocorrelation function of a blurred image. The blurred image is considered as a superposition of M shifted copies of the original nonblurred image. In this case, the convolution of the blurred image with itself can be regarded as M² pairwise convolutions, which contribute to the resultant autocorrelation function and produce a distinguishable line corresponding to the motion blur angle. The proposed method achieves motion blur angle estimation accuracy comparable to state-of-the-art methods while requiring lower computational complexity than popular accurate methods based on the Radon transform. The proposed model also allows accurate estimation of the motion blur length; our length estimation results, in general, outperform those of Radon-transform-based methods.
© 2016 Society of Photo-Optical Instrumentation Engineers (SPIE)
Victor Karnaukhov, Mikhail Mozerov, "Motion blur estimation based on multitarget matching model," Optical Engineering 55(10), 100502 (19 October 2016). https://doi.org/10.1117/1.OE.55.10.100502
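To make the idea described in the abstract concrete, the following Python sketch computes the autocorrelation function (ACF) of a blurred image via the FFT and then reads a blur angle and length from it. This is a minimal illustration under stated assumptions, not the authors' multitarget matching model: the brute-force directional search and the first-minimum length heuristic are assumed stand-ins for the line detection and length estimation described in the paper.

```python
# A minimal sketch, assuming a grayscale image as a 2-D NumPy array. It is NOT
# the authors' multitarget matching model, only an illustration of reading the
# blur direction and extent from the image autocorrelation.
import numpy as np

def autocorrelation(image):
    """2-D autocorrelation via the FFT (Wiener-Khinchin theorem)."""
    img = image.astype(np.float64)
    img -= img.mean()                          # remove the DC component
    spectrum = np.fft.fft2(img)
    acf = np.fft.ifft2(np.abs(spectrum) ** 2).real
    return np.fft.fftshift(acf)                # put the zero-lag peak at the center

def estimate_blur_angle(acf, angles=np.arange(0.0, 180.0, 0.5)):
    """Brute-force directional search: take the blur angle as the direction with
    maximal summed ACF energy along a line through the center (an assumed
    stand-in for the line detection described in the paper)."""
    h, w = acf.shape
    cy, cx = h // 2, w // 2
    radius = min(cy, cx) - 1
    t = np.arange(-radius, radius + 1)
    best_angle, best_score = 0.0, -np.inf
    for ang in angles:
        rad = np.deg2rad(ang)
        ys = np.clip(np.round(cy + t * np.sin(rad)).astype(int), 0, h - 1)
        xs = np.clip(np.round(cx + t * np.cos(rad)).astype(int), 0, w - 1)
        score = acf[ys, xs].sum()              # energy along this direction
        if score > best_score:
            best_angle, best_score = ang, score
    return best_angle

def estimate_blur_length(acf, angle_deg):
    """Heuristic length estimate: the lag of the first local minimum of the ACF
    profile along the estimated blur direction (a common assumption in ACF- and
    cepstrum-based approaches, not necessarily the paper's estimator)."""
    h, w = acf.shape
    cy, cx = h // 2, w // 2
    rad = np.deg2rad(angle_deg)
    radius = min(cy, cx) - 1
    r = np.arange(radius)
    ys = np.clip(np.round(cy + r * np.sin(rad)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + r * np.cos(rad)).astype(int), 0, w - 1)
    profile = acf[ys, xs]
    rising = np.where(np.diff(profile) > 0)[0]  # profile stops decreasing here
    return int(rising[0]) if rising.size else radius
```

For a motion-blurred array img, one would call acf = autocorrelation(img), angle = estimate_blur_angle(acf), and length = estimate_blur_length(acf, angle); the FFT-based ACF keeps the cost well below an explicit Radon transform over all angles, which is the complexity advantage the abstract refers to.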

Keywords: motion estimation; motion models; Radon transform; error analysis; signal-to-noise ratio
