The Laplacian of Gaussian (LoG) filter is a conventional and effective edge detector. In the denoising phase, we apply a parallel Gaussian blur to the image to suppress noise in the original image and prevent it from being amplified by the Laplace operator. In the edge detection phase, the Laplace operator is then applied to the output of the first phase. Optimizing these steps yields markedly better results than applying the pure Laplacian alone. Combined with the highly parallel Graphics Processing Unit (GPU), parallel image processing is far more efficient than its serial counterpart. In this study, the parallel LoG algorithm was implemented for images of different sizes on an NVIDIA GPU using the Compute Unified Device Architecture (CUDA). The proposed parallel LoG algorithm greatly reduces the time required for edge detection, achieving a speedup of 3.7x over the serial implementation running on the CPU.
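A minimal CUDA sketch of the two-pass pipeline described above is given below. The kernel names, the 3x3 mask sizes and weights, and the 16x16 launch geometry are illustrative assumptions, not the authors' actual implementation; the point is only the structure: a parallel Gaussian blur kernel followed by a parallel Laplacian kernel, with one thread per output pixel.

```cuda
// Sketch of the two-pass LoG pipeline: Gaussian blur, then Laplacian.
// Mask sizes, weights, and block geometry are assumptions for illustration.
#include <cuda_runtime.h>

// Assumed 3x3 Gaussian smoothing mask (weights sum to 1).
__constant__ float d_gauss[9] = {1/16.f, 2/16.f, 1/16.f,
                                 2/16.f, 4/16.f, 2/16.f,
                                 1/16.f, 2/16.f, 1/16.f};
// Standard 3x3 Laplacian mask.
__constant__ float d_lap[9]   = {0.f,  1.f, 0.f,
                                 1.f, -4.f, 1.f,
                                 0.f,  1.f, 0.f};

// 3x3 convolution at pixel (x, y); coordinates are clamped at the border.
__device__ float convolve3x3(const float* in, int w, int h,
                             int x, int y, const float* mask)
{
    float acc = 0.f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            int xx = min(max(x + dx, 0), w - 1);
            int yy = min(max(y + dy, 0), h - 1);
            acc += in[yy * w + xx] * mask[(dy + 1) * 3 + (dx + 1)];
        }
    return acc;
}

// Phase 1: parallel Gaussian blur, one thread per output pixel.
__global__ void gaussianBlur(const float* in, float* out, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < w && y < h)
        out[y * w + x] = convolve3x3(in, w, h, x, y, d_gauss);
}

// Phase 2: parallel Laplacian applied to the blurred image.
__global__ void laplacian(const float* in, float* out, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < w && y < h)
        out[y * w + x] = convolve3x3(in, w, h, x, y, d_lap);
}

// Host-side launch: blur into a temporary buffer, then take its Laplacian.
void runLoG(const float* d_in, float* d_tmp, float* d_out, int w, int h)
{
    dim3 block(16, 16);  // assumed block size
    dim3 grid((w + block.x - 1) / block.x,
              (h + block.y - 1) / block.y);
    gaussianBlur<<<grid, block>>>(d_in, d_tmp, w, h);
    laplacian<<<grid, block>>>(d_tmp, d_out, w, h);
    cudaDeviceSynchronize();
}
```

Because each output pixel depends only on a small neighborhood of the input, both passes are embarrassingly parallel, which is what makes the GPU implementation so much faster than the serial CPU version.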