To achieve high perceptual quality in compressed images, many objective image quality metrics for evaluating and reducing compression artifacts have been developed based on the characterization of local image features. However, it is the end user who ultimately judges image quality, so validating how well these metrics predict human perception is both important and necessary. In this paper, we present a preliminary psychophysical experiment to capture human perception of local ringing artifacts in JPEG images at different severity levels. Observers annotate the compressed image where they perceive artifacts along edges, drawing directly on the screen of an interactive tablet display, and classify the severity of each artifact into one of three levels: Strong, Medium, or Light. We process the hand-marked data into a ringing visibility edge map that assigns a ringing severity mean opinion score (MOS) to every edge pixel. The perceptual information captured in this experiment enables us to study the correlation between human perception and local image features, an important step toward developing a no-reference (NR) objective metric that predicts the visibility of JPEG ringing artifacts in alignment with the assessments of human observers.
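To make the MOS edge-map step concrete, the following is a minimal sketch of averaging per-observer severity annotations into a per-pixel score. The numeric scale (Strong=3, Medium=2, Light=1, unmarked=0) and all names here are illustrative assumptions, not the paper's exact protocol.

```python
import numpy as np

# Assumed numeric scale for the three severity levels; unmarked pixels
# contribute 0 (no visible ringing) for that observer.
SEVERITY = {"Strong": 3, "Medium": 2, "Light": 1}

def ringing_mos_map(shape, annotations):
    """Build a ringing visibility edge map of mean opinion scores.

    annotations: one dict per observer, mapping (row, col) edge pixels
    to a severity label ("Strong" | "Medium" | "Light").
    """
    mos = np.zeros(shape, dtype=float)
    for obs in annotations:
        for (r, c), label in obs.items():
            mos[r, c] += SEVERITY[label]
    return mos / len(annotations)

# Two hypothetical observers marking a 4x4 patch.
obs_a = {(1, 1): "Strong", (2, 2): "Light"}
obs_b = {(1, 1): "Medium", (3, 3): "Light"}
mos = ringing_mos_map((4, 4), [obs_a, obs_b])
```

With this convention a pixel both observers marked (here (1, 1)) receives the mean of their scores, while a pixel only one observer marked is pulled toward zero, reflecting lower agreement on its visibility.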
Print defects such as banding from a digital press involve not only luminance variation, but also chrominance
variation. Because a digital press places one color separation at a time, the contrast and spatial pattern of the print
defects are color-space dependent. Characterizing the color-dependent features of the banding signal enables us to
simulate banding on natural document images more accurately, in a way that matches the characteristics of the
banding generation mechanism within the digital press. A framework is described for color-dependent banding
characterization including the following steps: printing and scanning uniform patches that sample colorant
combinations throughout the input document sRGB color space, extracting banding signals in the CMYK color
space of the target device, and modeling the banding features in a perceptually uniform color space. We obtain
a full banding-feature LUT for every color point in the input sRGB space by interpolating the features
extracted at the measured color points. The color-dependent banding simulation framework is built
on this LUT: using the information it contains, a single banding prototype signal is
modulated in a color-dependent fashion that varies spatially across the natural document image. Proper
execution of the framework of banding characterization and simulation requires careful calibration of each system
component, as well as implementation of a complete color management pipeline.
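The LUT interpolation and modulation steps above can be sketched as follows. This is a toy illustration, not the paper's implementation: the grid density, the random stand-in for measured banding amplitudes, and the single-feature (amplitude-only) LUT are all assumptions, and real banding features would come from the print-and-scan measurements described above.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical stand-in for banding amplitudes measured from
# printed/scanned uniform patches on a coarse grid of sRGB levels.
levels = np.linspace(0.0, 255.0, 5)
rng = np.random.default_rng(0)
amplitude_lut = rng.uniform(0.5, 2.0, size=(5, 5, 5))

# Trilinear interpolation fills in the full LUT for any sRGB color.
amp_interp = RegularGridInterpolator((levels, levels, levels),
                                     amplitude_lut)

def simulate_banding(image, prototype):
    """Modulate a 1-D banding prototype (one value per scanline) by the
    color-dependent amplitude looked up at every pixel of `image`
    (H x W x 3, sRGB). Returns the banding field to add to the image."""
    h, w, _ = image.shape
    amp = amp_interp(image.reshape(-1, 3)).reshape(h, w)
    # Broadcast the prototype down the process direction (rows).
    return amp * prototype[:, None]

img = rng.integers(0, 256, size=(8, 6, 3)).astype(float)
proto = np.sin(2 * np.pi * np.arange(8) / 8)  # toy periodic banding
band = simulate_banding(img, proto)
```

The key point of the design is that a single prototype waveform carries the spatial pattern of the banding, while the LUT lookup scales it pixel by pixel according to the local document color, which is what makes the simulated defect color dependent.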