Open Access
1 May 2010
Novel no-reference image blur metric based on block-based discrete cosine transform statistics
Yu Han, Xiaoming Xu, Yunze Cai
Abstract
A new algorithm for no-reference blurred-image quality assessment is proposed. The metric is based on block-based discrete cosine transform statistics and a linear prediction method. We compare the proposed method with four blur metrics on three public databases, and it achieves the best results.

1. Introduction

With the popularization of digital cameras, the demand for objective image quality assessment algorithms has risen. As a way to choose the best image for final applications, objective image quality assessment algorithms play an important role in image engineering systems. Since ideal reference images usually cannot be obtained in practice, the assessment problem becomes one of no-reference (NR) image quality assessment, in which the true scene behind a distorted image is unknown.

Blur is the most common type of quality degradation in imaging systems; it is mainly caused by focus variation or by camera motion. Blur is usually modeled as a smoothing of the high-frequency Fourier coefficients in spectrum space. Several blur metrics have been proposed. In Ref. 1, the authors exploited the principle that the high-frequency coefficients of blurred images tend to zero, and proposed a quality evaluation algorithm that accumulates the coefficient distribution of images after the discrete cosine transform (DCT). Since the central diagonal of the DCT coefficient matrix efficiently characterizes global blur, the quality measure was obtained by counting coefficients with a weighting matrix that gives more importance to the diagonal. We denote this method the DCT metric (DCTM). In Ref. 2, a perceptual no-reference blur metric based on edge length was introduced. This work first proposed the concept of edge width, computed as the distance from the start to the end position of a Sobel edge; the global blur measure is the average of all edge widths. We denote this method the edge width metric (EWM). In Ref. 3, the authors utilized human visual system (HVS) features to improve metric performance. In this method, the image is first divided into 8×8 blocks, which are labeled according to their edge count. The average edge length of each block is then computed and weighted by the block's contrast, and the final blur measure is the weighted average edge length. We denote this metric the HVS edge width metric (HVSEWM). In Ref. 4, the authors proposed an algorithm based on local phase coherence. The metric exploits local phase coherence characteristics and constructs an iterative algorithm that separates each band into coherent and incoherent wavelet coefficients. The metric is obtained by averaging the standard deviations of the incoherent coefficients in each band. We denote this local phase coherence metric LPCM.

In this work, based on the blur theory and block-based DCT statistics in Refs. 5, 6, we propose a novel no-reference objective metric for blurred image assessment, and evaluate its performance against four quality evaluation metrics on three public databases.

2. Blur Metric Based on Block-Based Discrete Cosine Transform Statistics

According to Ref. 6, the distribution of DCT coefficients of natural images within blocks is well modeled by a Laplace distribution. Using 8×8 blocks, for each frequency pair (i,j) ∈ {0,…,7}×{0,…,7} with (i,j) ≠ (0,0), the coefficient distribution is modeled by

Eq. 1

fX(x) = [λ(i,j)/2] exp[-λ(i,j)|x|],
where λ(i,j) is the distribution parameter for frequency pair (i,j), and x is the coefficient value. λ is generally estimated by the maximum likelihood (ML) method on the original coefficient data. For a given frequency, the ML estimate of λ is

Eq. 2

λML = N / (∑k=1..N |xk|) = 1/E(|x|),
where N is the number of DCT blocks, xk is the DCT coefficient of block k at that frequency, and E(·) denotes the expected value.
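For a single frequency, the ML estimate in Eq. 2 reduces to the reciprocal of the mean absolute coefficient value. A minimal sketch in Python (NumPy assumed; the sample data below are synthetic Laplace draws, not DCT coefficients):

```python
import numpy as np

def lambda_ml(coeffs):
    """ML estimate of the Laplace parameter (Eq. 2):
    lambda = N / sum(|x_k|) = 1 / E(|x|)."""
    coeffs = np.asarray(coeffs, dtype=float)
    return coeffs.size / np.abs(coeffs).sum()

# Sanity check on synthetic Laplace samples with lambda = 2
# (NumPy's scale parameter b corresponds to 1/lambda).
rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=0.5, size=100_000)
print(lambda_ml(x))  # close to 2.0
```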

According to image degradation theory, a blurred image can be created by multiplying a clear image with a blur point spread function (PSF) in spectrum space. The classic blur PSFs, including motion, out-of-focus, and Gaussian PSFs, were thoroughly analyzed in Ref. 5. Their curve shapes in spectrum space are similar: they attain the maximum value at the center frequency (0,0), decrease dramatically near the center frequency, and maintain low expected values with small fluctuations as frequency increases. The blur extent is mainly determined by how steeply the blur PSF decreases near the center frequency. When a blur PSF acts on an image, the Fourier coefficients of the blurred image away from the center frequency therefore drop sharply. Since the spectrum is symmetric, the expected coefficient magnitude E(|x|) behaves like a step function, jumping from large to small as (i,j) increases. Then λ, the reciprocal of E(|x|), also behaves like a step function, jumping from small to large as (i,j) increases. The jump position and gradient of this step function determine the blur extent. This phenomenon can also be verified by viewing the λ distribution maps of one image blurred with different blur radii.
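This step-like growth of λ with frequency can be checked numerically: blurring suppresses high-frequency DCT energy, so the estimated λ(i,j) at high frequencies rises sharply for the blurred version. A small illustration (Python with NumPy/SciPy assumed; the test image is synthetic noise rather than a natural image, so only the qualitative trend is meaningful):

```python
import numpy as np
from scipy.fft import dctn
from scipy.ndimage import gaussian_filter

def block_lambdas(img, bs=8):
    """Estimate lambda(i,j) (Eq. 2) over all bs x bs DCT blocks."""
    h, w = (s - s % bs for s in img.shape)
    # Split the image into non-overlapping bs x bs blocks.
    blocks = img[:h, :w].reshape(h // bs, bs, w // bs, bs).swapaxes(1, 2)
    coeffs = dctn(blocks, axes=(-2, -1), norm='ortho')
    lam = 1.0 / np.mean(np.abs(coeffs), axis=(0, 1))  # one lambda per frequency
    lam[0, 0] = 1.0  # DC term is excluded from the Laplace model
    return lam

rng = np.random.default_rng(1)
sharp = rng.standard_normal((256, 256))
blurred = gaussian_filter(sharp, sigma=2.0)

lam_sharp = block_lambdas(sharp)
lam_blur = block_lambdas(blurred)
print(lam_blur[7, 7] > lam_sharp[7, 7])  # True: blur inflates high-frequency lambda
```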

To model this step-function behavior, we use a logistic function in 2-D polar coordinates to approximate the λ distribution in the frequency domain.

Eq. 3

λ(ρ) = p1 / {1 + exp[-p2(ρ - p3)]}.

In Eq. 3, ρ = √(i² + j²), and p1, p2, and p3 are parameters to be estimated. Image quality can then be determined from p1, p2, and p3:

Eq. 4

Q=f(p1,p2,p3),
where Q stands for image quality, and f is a function determined only by p1, p2, and p3. Since nonlinear estimation of p1, p2, and p3 imposes an overwhelming computational burden and usually produces large errors, we adopt a fast algorithm instead. Eq. 3 can be rearranged as
p2ρ - p2p3 + log(p1) ≈ λ(i,j)/p1 + log[λ(i,j)],
where the approximation log(1+x) ≈ x is used in the rearrangement. Thus we assume p1, p2, and p3 are linearly or polynomial-linearly correlated with λ(i,j) + log[λ(i,j)], i,j = 0,1,2,…. As a result, the function f(p1,p2,p3) can be approximated by a linear combination of λ(i,j) + log[λ(i,j)].

Eq. 5

Q = f(p1,p2,p3) ≈ g(0,0) + ∑(i,j)≠(0,0) g(i,j)[λ(i,j) + log λ(i,j)],
where the g(i,j) are scale coefficients. In practice, g(i,j) can be determined by the least mean square (LMS) method on images with known quality. Once all g(i,j) are known, our blurred-image quality assessment algorithm is fully specified. For a given blurred image, its quality is calculated by the following algorithm.
  • 1. Cut the image into 8×8 blocks and apply the DCT to each block.

  • 2. Collect the coefficients at each frequency (i,j) over all blocks and estimate λ(i,j) by the ML criterion in Eq. 2.

  • 3. Let λ(0,0) = 1 and Q = 0; for each pair (i,j), update Q = Q + g(i,j)[λ(i,j) + log λ(i,j)].

The final Q is the image quality. We call this method the DCT statistic prediction method (DCTSP).
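The whole pipeline (fit g(i,j) once on images of known quality, then score a new image with steps 1 to 3) can be sketched as follows. Here the LMS fit is realized as an ordinary least-squares solve, and `train_imgs`/`train_scores` are hypothetical placeholders for a database such as LIVE (Python with NumPy/SciPy assumed):

```python
import numpy as np
from scipy.fft import dctn

BS = 8

def features(img):
    """Steps 1-3: blockwise DCT, ML lambda estimates, then the
    per-frequency terms lambda(i,j) + log lambda(i,j), with lambda(0,0) = 1."""
    h, w = (s - s % BS for s in img.shape)
    blocks = img[:h, :w].reshape(h // BS, BS, w // BS, BS).swapaxes(1, 2)
    coeffs = dctn(blocks, axes=(-2, -1), norm='ortho')
    lam = 1.0 / np.mean(np.abs(coeffs), axis=(0, 1))   # Eq. 2, per frequency
    lam[0, 0] = 1.0                                    # step 3 convention
    return (lam + np.log(lam)).ravel()                 # 64 features

def fit_g(train_imgs, train_scores):
    """Determine g(i,j) by least squares (Eq. 5) on images of known quality."""
    X = np.stack([features(im) for im in train_imgs])
    g, *_ = np.linalg.lstsq(X, np.asarray(train_scores, float), rcond=None)
    return g.reshape(BS, BS)

def dctsp(img, g):
    """Quality score Q = sum over (i,j) of g(i,j) [lambda(i,j) + log lambda(i,j)]."""
    return float(features(img) @ g.ravel())
```

Because λ(0,0) is fixed to 1, λ(0,0) + log λ(0,0) = 1, so g(0,0) automatically plays the role of the intercept term in Eq. 5.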

3. Experiment and Results

To determine the coefficients g(i,j) in the proposed DCTSP, we computed them on the LIVE database from the University of Texas.7 The values of g(i,j) obtained by the least mean square (LMS) criterion are shown in Table 1.

Table 1

The value of g(i,j) calculated by the LMS method on the LIVE database.

j \ i      0        1        2        3        4        5        6        7
  0      0.034    0.658    1.000   -1.499    0.092    0.653    0.175    0.909
  1     -1.755   -0.342    0.341    0.516    0.224    0.016    0.151    0.327
  2      1.556    1.206    0.323    1.329    1.592   -0.167   -0.037   -0.635
  3     -2.145   -0.471    0.379    0.229    0.270    0.504    0.030    0.183
  4     -0.443   -0.859    0.492    1.101    0.569    0.413    0.174    0.180
  5      1.601    0.433   -0.216   -0.998    0.434    0.558    0.269    0.026
  6      0.181    0.113    0.868    0.873    1.179    0.066    0.750    0.562
  7     -0.184   -0.453   -0.051    0.901    1.868    1.208    0.078    0.740

Since DCTSP was determined by optimization on the LIVE database, to assess its performance fairly, DCTSP was also applied to other databases: the CSIQ database from Oklahoma State University8 and the TID2008 database.9 There are 145, 150, and 100 blurred images in the LIVE, CSIQ, and TID2008 databases, respectively. A five-parameter logistic function is used to map metric outputs to subjective evaluations. To evaluate the predictive performance of the metrics objectively, four indicators are computed: correlation coefficient (CC), root mean squared error (RMSE), Spearman rank-order correlation coefficient (SROCC), and outlier ratio (OR); the definitions of these indicators can be found in Ref. 10. The larger CC and SROCC are, and the smaller RMSE and OR are, the better the metric's performance.
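CC, RMSE, and SROCC are straightforward to compute from the predicted and subjective scores; a sketch follows (Python with SciPy assumed; OR additionally needs the per-image standard deviations of the subjective scores, and the five-parameter logistic regression step is omitted here):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def cc(pred, mos):
    """Pearson linear correlation coefficient between predictions and MOS."""
    return pearsonr(pred, mos)[0]

def rmse(pred, mos):
    """Root mean squared prediction error."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(mos)) ** 2)))

def srocc(pred, mos):
    """Spearman rank-order correlation coefficient."""
    return spearmanr(pred, mos)[0]

pred = [1.0, 2.0, 3.0, 4.0]   # toy metric outputs
mos = [1.1, 1.9, 3.2, 3.8]    # toy subjective scores
print(cc(pred, mos), rmse(pred, mos), srocc(pred, mos))
```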

We compare the proposed method with the metrics discussed in Sec. 1. Table 2 shows the performance of these no-reference blur measures, including DCTSP,11 on the LIVE, CSIQ, and TID2008 databases. From Table 2, DCTSP shows the best predictive performance among the blur measures, especially in correlation coefficient (CC). Although the coefficients of DCTSP were determined on the LIVE database, it generalizes well to the other databases.

Table 2

Performance comparison of different image quality assessment methods on LIVE, CSIQ, and TID2008. Note that OR cannot be calculated in TID2008, since standard deviation was not provided.

          CC       RMSE     SROCC    OR
LIVE
DCTM      0.8712   7.8474   0.8540   0.5931
EWM       0.7928   9.7401   0.7797   0.7517
HVSEWM    0.8558   8.2690   0.8625   0.6207
LPCM      0.8074   9.4288   0.8116   0.6828
DCTSP     0.9560   4.6872   0.9540   0.3172
CSIQ
DCTM      0.8791   0.1366   0.8548   0.3067
EWM       0.7679   0.1869   0.7553   0.3867
HVSEWM    0.8534   0.1494   0.8157   0.3067
LPCM      0.8511   0.1505   0.8396   0.3200
DCTSP     0.9471   0.0919   0.9175   0.2267
TID2008
DCTM      0.7399   0.7894   0.7401   -
EWM       0.7127   0.8232   0.6919   -
HVSEWM    0.7569   0.7669   0.7584   -
LPCM      0.6431   0.8986   0.6825   -
DCTSP     0.9444   0.3859   0.9418   -

4. Conclusion

In this work, we proposed a no-reference quality assessment metric, DCTSP, for blurred images. The metric is based on block-based DCT statistics and blur theory. Experimental results show that DCTSP outperforms several existing quality assessment methods.

Acknowledgments

This work was supported by Shanghai Key Fundamental Research Program (09JC1408000).

References

1. X. Marichal, W. Y. Ma, and H. Zhang, "Blur determination in the compressed domain using DCT information," Proc. Int. Conf. Image Processing (ICIP'99), 2, 386-390 (1999).

2. P. Marziliano, F. Dufaux, S. Winkler, and T. Ebrahimi, "Perceptual blur and ringing metrics: application to JPEG2000," Signal Process. Image Commun. 19(2), 163-172 (2004). https://doi.org/10.1016/j.image.2003.08.003

3. R. Ferzli and L. J. Karam, "A human visual system based no-reference objective image sharpness metric," Proc. Int. Conf. Image Processing (ICIP), 2949-2952 (2006).

4. A. Ciancio, A. L. N. T. da Costa, E. A. B. da Silva, A. Said, R. Samadani, and P. Obrador, "Objective no-reference image blur metric based on local phase coherence," Electron. Lett. 45(23), 1162-1163 (2009). https://doi.org/10.1049/el.2009.1800

5. G. Pavlovic and A. M. Tekalp, "Maximum likelihood parametric blur identification based on a continuous spatial domain model," IEEE Trans. Image Process. 1(4), 496-504 (1992). https://doi.org/10.1109/83.199919

6. E. Lam and J. Goodman, "A mathematical analysis of the DCT coefficient distributions for images," IEEE Trans. Image Process. 9(10), 1661-1666 (2000). https://doi.org/10.1109/83.869177

7. H. R. Sheikh, Z. Wang, L. Cormack, and A. C. Bovik, "LIVE image quality assessment database release 2," (2005), http://live.ece.utexas.edu/research/quality

8. Image Coding and Analysis Laboratory, Oklahoma State Univ., "Categorical Subjective Image Quality (CSIQ) database," http://vision.okstate.edu/csiq/

9. N. Ponomarenko, F. Battisti, K. Egiazarian, J. Astola, and V. Lukin, "Metrics performance comparison for color image database," 4th Int. Workshop on Video Processing and Quality Metrics for Consumer Electronics (2009).

10. A. M. Rohaly, P. J. Corriveau, J. M. Libert, A. A. Webster, and V. Baroncini, "Video Quality Experts Group: current results and future directions," Proc. SPIE 4067, 742-753 (2000). https://doi.org/10.1117/12.386632

11. Y. Han, "A novel no-reference image blur metric based on block-based DCT statistics," http://hansy.weebly.com/image-quality-assessmentnr.html
©(2010) Society of Photo-Optical Instrumentation Engineers (SPIE)
Yu Han, Xiaoming Xu, and Yunze Cai "Novel no-reference image blur metric based on block-based discrete cosine transform statistics," Optical Engineering 49(5), 050501 (1 May 2010). https://doi.org/10.1117/1.3420235