9 February 2019 Texture-preserving denoising method for the removal of random-valued impulse noise in gray-scale images
Image denoising is one of the fundamental problems in the field of image processing. We present an iterative image denoising method, named multitexton noise identification and local dissimilarity-based noise removal, that incorporates a noise identification phase followed by a noise restoration phase for gray-scale images corrupted by random-valued impulse noise (RVIN). Multiple textons of distinct orientations, arranged on the basis of radial symmetry, are proposed to preserve sharp edges in the restored images. The noise identification phase works on an adaptive threshold range by employing the local statistics of textons; moreover, the proposed method is better suited to handle the uncertainty observed in distinguishing noisy pixels from edge pixels. In the second phase, the dissimilarity of each identified corrupted pixel with its four-connected neighboring pixels is computed, weighted by the similarity among those four-connected pixels, to restore the gray-level intensities of the corrupted pixels. Standard gray-scale benchmark test images are used to assess the quantitative and visual performance of the proposed method against state-of-the-art denoising methods. Extensive experimental results show that the presented method outperforms existing techniques in terms of peak signal-to-noise ratio (PSNR), structural similarity index measurement (SSIM), miss detection, false detection, and visual quality, for both lower and higher intensities of RVIN.
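The restoration step described above can be illustrated with a minimal sketch. The abstract does not give the paper's exact dissimilarity formula, so the weighting scheme below (closeness of each clean four-connected neighbor to the neighbors' median) is a hypothetical stand-in; the function names and the boolean noise mask are likewise assumptions for illustration only.

```python
import numpy as np

def restore_pixel(img, mask, r, c):
    """Replace a flagged pixel at (r, c) with a similarity-weighted
    average of its uncorrupted four-connected neighbors.

    img  : 2-D float array (gray-scale image)
    mask : boolean array, True where RVIN was detected
    NOTE: the weighting here is illustrative, not the paper's formula.
    """
    h, w = img.shape
    # Four-connected neighbors: up, down, left, right.
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    vals = np.array([img[r + dr, c + dc]
                     for dr, dc in offsets
                     if 0 <= r + dr < h and 0 <= c + dc < w
                     and not mask[r + dr, c + dc]])
    if vals.size == 0:            # no clean neighbor: leave pixel unchanged
        return img[r, c]
    med = np.median(vals)
    # Weight each neighbor by its agreement with the other neighbors,
    # approximated as closeness to their median value.
    weights = 1.0 / (1.0 + np.abs(vals - med))
    return float(np.sum(weights * vals) / np.sum(weights))

def denoise(img, mask):
    """Restore every flagged pixel; clean pixels are left untouched."""
    out = img.astype(float).copy()
    for r, c in zip(*np.nonzero(mask)):
        out[r, c] = restore_pixel(img, mask, r, c)
    return out
```

In an iterative scheme such as the one the abstract describes, detection and restoration would alternate over several passes so that pixels restored early can support the estimation of their neighbors later.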
© 2019 Society of Photo-Optical Instrumentation Engineers (SPIE) 0091-3286/2019/$25.00
Hassan Dawood, Munsif Iqbal, Marium Azhar, Haseeb Ahmad, Hussain Dawood, Zahid Mehmood, and Jalal S. Alowibdi "Texture-preserving denoising method for the removal of random-valued impulse noise in gray-scale images," Optical Engineering 58(2), 023103 (9 February 2019).
Received: 26 October 2018; Accepted: 23 January 2019; Published: 9 February 2019
