Paralleled Laplacian of Gaussian (LoG) edge detection algorithm by using GPU
29 August 2016
Proceedings Volume 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016); 1003309 (2016) https://doi.org/10.1117/12.2244599
Event: Eighth International Conference on Digital Image Processing (ICDIP 2016), 2016, Chengdu, China
Abstract
The Laplacian of Gaussian (LoG) filter is a conventional and effective edge detector. In the denoising phase, we apply a parallel Gaussian blur to the image so that noise in the original image is suppressed rather than amplified by the Laplacian operator. In the edge detection phase, the Laplacian operator is then applied to the blurred output of the first phase. With these steps, the result is substantially more robust than that of the pure Laplacian operator. Combined with the highly evolved Graphics Processing Unit (GPU), parallel image processing is considerably faster than its serial counterpart. In this study, the parallel LoG algorithm was implemented for images of different sizes on an NVIDIA GPU using the Compute Unified Device Architecture (CUDA). The parallel LoG algorithm proposed here reduces the time required for edge detection immensely, achieving a speedup of 3.7x over the serial implementation running on the CPU.
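The two-phase pipeline the abstract describes (Gaussian blur first, then the Laplacian operator) can be sketched as follows. This is not the authors' CUDA implementation; it is a minimal serial NumPy illustration, with assumed kernel size and sigma, whose per-pixel independence is exactly what maps onto one-thread-per-pixel GPU parallelism.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # assumed 5x5 kernel; the paper does not state its kernel size
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

# standard 4-neighbour discrete Laplacian operator
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def convolve2d(img, kernel):
    # naive 'valid' convolution; each output pixel is computed
    # independently, which is what a CUDA kernel parallelises
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def log_edges(img, sigma=1.0):
    # phase 1: Gaussian blur suppresses noise
    blurred = convolve2d(img, gaussian_kernel(5, sigma))
    # phase 2: Laplacian responds to intensity changes (edges)
    return convolve2d(blurred, LAPLACIAN)
```

On the GPU, each output pixel of both convolutions would be assigned to one CUDA thread, which is why the parallel version scales well with image size.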
© (2016) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Weibin Wu, "Paralleled Laplacian of Gaussian (LoG) edge detection algorithm by using GPU", Proc. SPIE 10033, Eighth International Conference on Digital Image Processing (ICDIP 2016), 1003309 (29 August 2016); doi: 10.1117/12.2244599; https://doi.org/10.1117/12.2244599
Proceedings paper, 5 pages