The recent technique of local backlight dimming has a significant impact on the quality of images displayed on LCD screens with LED local dimming; it therefore represents a necessary step in the quality assessment chain, independently of the other processes applied to the images. This paper investigates the modeling of one of the major spatial artifacts produced by local dimming: leakage. Leakage appears in dark areas when the backlight level is too high for the LC cells to block light sufficiently, so the displayed brightness is higher than it should be.
A subjective quality experiment was run on videos displayed on an LCD TV with local backlight dimming, viewed from 0° and 15° angles. The subjective results are then compared with objective data obtained using different leakage models (constant over the whole display, or horizontally varying) and three leakage factors (no leakage, and factors measured at 0° and 15°, respectively). Results show that, for dark sequences, accounting for the leakage artifact in the display model is a definite improvement. Approximating leakage as constant over the screen appears valid when viewing from a 15° angle, while a horizontally varying model may prove useful for 0° viewing.
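As a hedged illustration of the kind of display model these comparisons rely on (the notation below is ours, not taken from the abstract), leakage is commonly folded into the displayed luminance as an additive, viewing-angle-dependent term:

```latex
l(x) \approx b(x)\,\bigl(t(x) + \varepsilon_{\theta}\bigr)
```

where $b(x)$ is the local backlight intensity reaching pixel $x$, $t(x)$ the LC transmittance, and $\varepsilon_{\theta}$ the leakage factor at viewing angle $\theta$. The three leakage factors compared here would then correspond to $\varepsilon = 0$ (no leakage) and $\varepsilon_{\theta}$ measured at $0°$ and $15°$.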
Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlights are a very popular display technology, used
for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that
exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each
backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of
the pixels within each backlight segment is calculated and, based on the segment's average luminance, an adaptive quantile value is extracted.
A classification into three classes based on the average luminance value is performed and, depending on the image
luminance class, the extracted information on the local histogram determines the corresponding backlight value. The
proposed method has been applied to two modeled screens: one with a high-resolution direct-lit backlight, and the other
with 16 edge-lit backlight segments arranged in two columns and eight rows. We have compared the proposed
algorithm against several known backlight dimming algorithms in simulations, and the results show that the proposed
algorithm provides a better trade-off between power consumption and image quality preservation than the other
state-of-the-art feature-based backlight dimming algorithms.
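The histogram-and-classification scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the three-class thresholds, the quantile values, and the function name are all our own illustrative assumptions.

```python
import numpy as np

def backlight_level(segment, dark_q=0.05, mid_q=0.02,
                    dark_thresh=0.25, bright_thresh=0.6):
    """Hedged sketch: choose a backlight level for one segment from its
    local histogram, with the clipping quantile adapted to the segment's
    average luminance. All thresholds are illustrative guesses."""
    seg = np.asarray(segment, dtype=float).ravel()  # pixel values in [0, 1]
    mean = seg.mean()
    # Classify the segment into one of three luminance classes.
    if mean < dark_thresh:       # dark segment: allow aggressive dimming
        q = dark_q
    elif mean < bright_thresh:   # mid segment: moderate clipping tolerance
        q = mid_q
    else:                        # bright segment: preserve highlights
        q = 0.0
    # The backlight must reach the (1 - q) quantile of the segment, so
    # that at most a fraction q of its pixels is clipped.
    return float(np.quantile(seg, 1.0 - q))
```

A dark segment is thus dimmed down toward its brightest non-outlier pixels, while a bright segment keeps the backlight high enough to reproduce its maximum.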
Proc. SPIE. 8436, Optics, Photonics, and Digital Technologies for Multimedia Applications II
KEYWORDS: Point spread functions, Light emitting diodes, Detection and tracking algorithms, Image segmentation, Image resolution, LCDs, LED backlight, Transmittance, High dynamic range imaging, Optimization (mathematics)
Local backlight dimming in Liquid Crystal Displays (LCDs) is a technique for reducing power consumption and
simultaneously increasing the contrast ratio to provide High Dynamic Range (HDR) image reproduction. Several backlight
dimming algorithms focus on reducing power consumption, while other algorithms aim at enhancing contrast,
with power savings as a side effect. In our earlier work, we have modeled backlight dimming as a linear programming
problem, where the target is to minimize the cost function measuring the distance between ideal and actual output. In this
paper, we propose a version of the abovementioned algorithm, speeding up execution by decreasing the number of input
variables. This is done by using a subset of the input pixels, selected among the ones experiencing leakage or clipping
distortions. The optimization problem is then solved on this subset. Sample reduction can also be beneficial in
conjunction with other approaches, such as an algorithm based on gradient descent, also presented here. All the proposals
have been compared against other known approaches on simulated edge- and direct-lit displays, and the results show that
the optimal distortion level can be reached using a subset of pixels, with significantly reduced computational load
compared to the optimal algorithm with the full image.
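The sample-reduction step can be sketched as below. This is a hedged illustration of the selection criterion only, under assumed names and an assumed leakage threshold; the abstract does not specify how distorted pixels are detected, nor the LP formulation that would then be solved on the subset.

```python
import numpy as np

def distortion_pixel_subset(target, backlight, leakage=0.005):
    """Hedged sketch: select the pixels experiencing leakage or clipping
    under a given backlight field, as candidates for the reduced
    optimization problem. Names and threshold are illustrative.

    target    -- desired luminance per pixel, in [0, 1]
    backlight -- backlight intensity reaching each pixel
    """
    target = np.asarray(target, dtype=float)
    backlight = np.asarray(backlight, dtype=float)
    # Clipping: the backlight is too low to reproduce the target even
    # at full LC transmittance.
    clipped = target > backlight
    # Leakage: very dark targets cannot be rendered darker than the
    # light leaking through a fully closed LC cell.
    leaking = target < leakage * backlight
    return clipped | leaking  # boolean mask of distorted pixels
```

Only the masked pixels would then enter the cost function, shrinking the number of variables in the optimization while, per the results above, still reaching the optimal distortion level.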