Color image definition evaluation method based on deep learning method
10 January 2018
Abstract
In order to evaluate different blur levels of color images and improve image definition evaluation, this paper proposes a no-reference color image definition evaluation method that combines a deep learning framework with a BP neural network classification model. First, VGG16 is used as a feature extractor to obtain a 4,096-dimensional feature vector for each image; the extracted features, together with the image labels, are then used to train a BP neural network, which performs the final color image definition evaluation. The method is evaluated on images from the CSIQ database, blurred at different levels to produce 4,000 images, which are divided into three categories, each representing one blur level. Of every 400 samples, 300 high-dimensional feature vectors from the VGG16 network are used to train the BP neural network, and the remaining 100 are used for testing. The experimental results show that the method takes full advantage of the learning and representation capability of deep learning. In contrast to most existing image definition evaluation methods, which rely on manually designed and extracted features, the proposed method extracts image features automatically and achieves excellent image quality classification accuracy on the test set, with an accuracy rate of 96%. Moreover, the predicted quality levels of the original color images agree well with the perception of the human visual system.
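The pipeline the abstract describes (deep features fed to a BP neural network, i.e. a backpropagation-trained multilayer perceptron, for three-way blur classification) can be sketched as below. This is a minimal illustration, not the authors' implementation: the synthetic clustered vectors stand in for the 4,096-dimensional VGG16 features, and the network size, learning rate, and epoch count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for VGG16 fc-layer output: the paper uses 4,096-dim features
# per image; here we fabricate a smaller separable synthetic set with
# one cluster per blur level (3 classes, as in the paper).
N_PER_CLASS, DIM, N_CLASSES = 100, 64, 3
centers = rng.normal(scale=3.0, size=(N_CLASSES, DIM))
X = np.vstack([centers[c] + rng.normal(size=(N_PER_CLASS, DIM))
               for c in range(N_CLASSES)])
y = np.repeat(np.arange(N_CLASSES), N_PER_CLASS)
T = np.eye(N_CLASSES)[y]  # one-hot targets for the 3 blur levels

# Single-hidden-layer BP network: sigmoid hidden layer, softmax output.
H = 32  # hidden width is an assumption, not from the paper
W1 = rng.normal(scale=0.1, size=(DIM, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, N_CLASSES)); b2 = np.zeros(N_CLASSES)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5
for epoch in range(200):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    p = softmax(h @ W2 + b2)
    # backpropagation of the cross-entropy loss
    d2 = (p - T) / len(X)
    d1 = (d2 @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)

pred = softmax(sigmoid(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In the actual method, the feature matrix `X` would come from a pretrained VGG16 (the activations of a fully connected layer), and training/test splits would follow the paper's 300/100 division per category.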
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Di Liu and YingChun Li, "Color image definition evaluation method based on deep learning method", Proc. SPIE 10616, 2017 International Conference on Optical Instruments and Technology: Optical Systems and Modern Optoelectronic Instruments, 106160V (10 January 2018); https://doi.org/10.1117/12.2289589
Proceedings paper, 6 pages

