Paper
GPU based real-time enhancement of high resolution image
15 November 2018
Proceedings Volume 10964, Tenth International Conference on Information Optics and Photonics; 1096440 (2018) https://doi.org/10.1117/12.2506026
Event: Tenth International Conference on Information Optics and Photonics (CIOP 2018), 2018, Beijing, China
Abstract
The curvature filter and gradient transform based image enhancement algorithm can effectively suppress noise and enhance image edges. However, it is difficult to run in real time because of its heavy computational load. To address this problem, a GPU based parallel implementation is proposed in this paper. First, a numerical implementation based on central differences is designed to suit the characteristics of the algorithm. Then, a domain decomposition scheme is applied to the parallel Gaussian curvature filter to remove the dependence between neighboring pixels and guarantee convergence. Finally, the multiprocessor warp occupancy is raised to 100% by optimizing the thread grid and register usage. Experimental results demonstrate that the parallel method runs 200-300 times faster than the serial CPU method and processes 4096×4096 images in real time, indicating great potential for practical applications.
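The abstract only outlines the parallel scheme, so the following CUDA sketch illustrates the kind of domain-decomposed Gaussian curvature filter update it describes. This is not the authors' released code: it follows the commonly used Gaussian curvature filter update rule (minimal-magnitude projection onto eight local tangent-plane distances, formed from central-difference combinations of the 3×3 neighborhood) and a four-subset row/column-parity decomposition, so that no pixel in a subset lies in the neighborhood of another pixel in the same subset. All kernel names, block sizes, and launch parameters are illustrative assumptions.

```cuda
// Minimal sketch (not the authors' code): one domain-decomposition pass of a
// Gaussian curvature filter. Pixels are split into four disjoint subsets by
// row/column parity; within a subset, no pixel reads a value that another
// pixel of the same subset writes, so each subset can be updated in parallel.
#include <cuda_runtime.h>
#include <math.h>

__global__ void gcFilterSubset(float* u, int width, int height,
                               int rowParity, int colParity)
{
    // Each thread updates one pixel of the current parity subset.
    int x = (blockIdx.x * blockDim.x + threadIdx.x) * 2 + colParity;
    int y = (blockIdx.y * blockDim.y + threadIdx.y) * 2 + rowParity;
    if (x < 1 || y < 1 || x >= width - 1 || y >= height - 1) return;

    int c = y * width + x;
    float uc  = u[c];
    float un  = u[c - width],     us  = u[c + width];
    float uw  = u[c - 1],         ue  = u[c + 1];
    float unw = u[c - width - 1], une = u[c - width + 1];
    float usw = u[c + width - 1], use = u[c + width + 1];

    // Eight central-difference projection distances onto local tangent planes.
    float d[8];
    d[0] = 0.5f * (un + us) - uc;
    d[1] = 0.5f * (uw + ue) - uc;
    d[2] = 0.5f * (unw + use) - uc;
    d[3] = 0.5f * (une + usw) - uc;
    d[4] = un + uw - unw - uc;
    d[5] = un + ue - une - uc;
    d[6] = us + uw - usw - uc;
    d[7] = us + ue - use - uc;

    // The minimal-magnitude projection drives local Gaussian curvature
    // toward zero while preserving edges.
    float dm = d[0];
    for (int k = 1; k < 8; ++k)
        if (fabsf(d[k]) < fabsf(dm)) dm = d[k];

    u[c] = uc + dm;
}

// Host side: one filter iteration = four kernel launches, one per subset.
// Launches on the same stream serialize, which preserves the update order
// the decomposition needs for convergence.
void gcFilterIteration(float* d_u, int width, int height)
{
    dim3 block(16, 16);  // illustrative block shape
    dim3 grid((width / 2 + block.x - 1) / block.x,
              (height / 2 + block.y - 1) / block.y);
    for (int p = 0; p < 4; ++p)
        gcFilterSubset<<<grid, block>>>(d_u, width, height, p / 2, p % 2);
}
```

Using a strided index rather than a masked full-image launch keeps every thread in a warp active, which is in the spirit of the occupancy optimization the abstract mentions; the paper's actual thread-grid and register tuning is not reproduced here.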
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Maosen Huang, Kuanhong Cheng, Huixin Zhou, Yue Yu, Zhe Zhang, and Wei Tan "GPU based real-time enhancement of high resolution image", Proc. SPIE 10964, Tenth International Conference on Information Optics and Photonics, 1096440 (15 November 2018); https://doi.org/10.1117/12.2506026
KEYWORDS: Image enhancement, Image processing, Gaussian filters, Image filtering, Reconstruction algorithms, Image resolution, Graphics processing units