This paper focuses on the influence of ambient light on the perceived quality of videos displayed on a Liquid Crystal Display (LCD) with local backlight dimming. A subjective test assessing the quality of videos under two backlight dimming methods and three lighting conditions, i.e. no light, a low light level (5 lux) and a higher light level (60 lux), was organized to collect subjective data. Results show that participants prefer the method exploiting local dimming possibilities to the conventional full backlight, but that this preference varies with the ambient light level. The clear preference for one method under low light decreases under high ambient light, confirming that ambient light significantly attenuates the perception of the leakage defect (light leaking through dark pixels). Results also depend strongly on the content of the sequence, which can modulate the effect of the ambient light from an important influence on the quality grades to no influence at all.
Local backlight dimming is a popular technology in high quality Liquid Crystal Displays (LCDs). In those displays, the
backlight is composed of contributions from several individually adjustable backlight segments, set at different backlight
luminance levels in different parts of the screen, according to the luma of the target image displayed on LCD. Typically,
transmittance of the liquid crystal cells (pixels) located in the regions with dimmed backlight is increased in order to
preserve their relative brightness with respect to the pixels located in the regions with bright backlight. There are
different methods for brightness preservation for local backlight dimming displays, producing images with different
visual characteristics. In this study, we have implemented, analyzed and evaluated several different approaches for
brightness preservation, and conducted a subjective study based on rank ordering to compare the relevant methods on a
real-life LCD with a local backlight dimming capability. In general, our results show that locally adapted brightness
preservation methods produce a more preferred visual outcome than global methods, but a dependency on the content is also
observed. Based on the results, guidelines for selecting the perceptually preferred brightness preservation method for
local backlight dimming displays are outlined.
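As an illustrative sketch (not any of the specific methods evaluated in the study), brightness preservation can be thought of as boosting the transmittance of each pixel under a dimmed backlight segment until the transmittance limit of the LC cell clips it:

```python
def compensate_pixels(target, backlight):
    """Brightness preservation sketch for local backlight dimming.

    target    -- desired pixel luminance values, normalized to [0, 1]
    backlight -- local backlight level at each pixel, normalized to (0, 1]

    Returns the LC transmittance needed so that backlight * transmittance
    reproduces the target, clipped to 1.0 where the dimmed backlight
    cannot reach the desired brightness (clipping distortion).
    """
    return [min(t / b, 1.0) for t, b in zip(target, backlight)]

# Example: pixels at 0.2 and 0.9 under a backlight dimmed to 0.5;
# the bright pixel clips because 0.9 > 0.5.
print(compensate_pixels([0.2, 0.9], [0.5, 0.5]))  # [0.4, 1.0]
```

A global method would apply one such compensation curve to the whole frame, whereas the locally adapted methods compared in the study vary it per backlight segment.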
The recent technique of local backlight dimming has a significant impact on the quality of images displayed on an LCD screen with LED local dimming. It therefore represents a necessary step in the quality assessment chain, independent of the other processes applied to the images. This paper investigates the modeling of one of the major spatial artifacts produced by local dimming: leakage. Leakage appears in dark areas when the backlight level is too high for the LC cells to block sufficiently, so the final displayed brightness is higher than it should be.
A subjective quality experiment was run on videos displayed on an LCD TV with local backlight dimming, viewed from 0° and 15° angles. The subjective results are then compared with objective data using different leakage models: constant over the whole display or horizontally varying, and three leakage factors (no leakage, measured at 0° and measured at 15°). Results show that for dark sequences, accounting for the leakage artifact in the display model is a definite improvement. Approximating leakage as constant over the screen seems valid when viewing from a 15° angle, while a horizontally varying model might prove useful for 0° viewing.
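The leakage models compared above can be sketched as follows; the function names and the linear horizontal profile are illustrative assumptions, not the measured characteristics used in the paper:

```python
def displayed_value(backlight, transmittance, leakage):
    """Display model with leakage (illustrative sketch).

    Even at minimum transmittance a fraction `leakage` of the backlight
    passes through the LC cell, so dark pixels appear brighter than
    intended.
    """
    return backlight * (leakage + (1.0 - leakage) * transmittance)

def leakage_at(x, width, l_center, l_edge):
    """Horizontally varying leakage factor (assumed linear profile;
    an actual profile would be measured on the display)."""
    d = abs(x - width / 2) / (width / 2)  # 0 at centre, 1 at the edges
    return l_center + d * (l_edge - l_center)

# A black pixel (transmittance 0) under full backlight with 0.5 % leakage
print(displayed_value(1.0, 0.0, 0.005))  # 0.005
```

The constant model uses one leakage factor for every pixel; the horizontally varying model replaces it with `leakage_at(x, ...)` evaluated per column.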
Liquid Crystal Displays (LCDs) with Light Emitting Diode (LED) backlight are a very popular display technology, used
for instance in television sets, monitors and mobile phones. This paper presents a new backlight dimming algorithm that
exploits the characteristics of the target image, such as the local histograms and the average pixel intensity of each
backlight segment, to reduce the power consumption of the backlight and enhance image quality. The local histogram of
the pixels within each backlight segment is calculated and, based on the segment's average intensity, an adaptive quantile value is extracted.
The segments are classified into three classes based on their average luminance value and, depending on the
luminance class, the extracted quantile of the local histogram determines the corresponding backlight value. The
proposed method has been applied on two modeled screens: one with a high resolution direct-lit backlight, and the other
screen with 16 edge-lit backlight segments placed in two columns and eight rows. We have compared the proposed
algorithm against several known backlight dimming algorithms by simulations, and the results show that the proposed
algorithm provides a better trade-off between power consumption and image quality preservation than the other algorithms
representing the state of the art among feature based backlight algorithms.
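A minimal sketch of the quantile-based selection described above; the class thresholds and quantile values here are illustrative assumptions, not the parameters of the proposed algorithm:

```python
def backlight_level(segment_pixels, dark_thresh=60, bright_thresh=150):
    """Histogram/quantile-based backlight selection (illustrative sketch).

    segment_pixels -- 8-bit luma values of the pixels in one backlight
                      segment
    Returns a normalized backlight level in [0, 1].
    """
    avg = sum(segment_pixels) / len(segment_pixels)
    # Classify the segment by average luminance and pick a quantile:
    # dark segments can be dimmed aggressively, bright ones barely at all.
    if avg < dark_thresh:
        q = 0.90
    elif avg < bright_thresh:
        q = 0.95
    else:
        q = 0.99
    ordered = sorted(segment_pixels)
    level = ordered[int(q * (len(ordered) - 1))]
    return level / 255.0  # backlight duty cycle needed to show that quantile
```

Choosing a quantile below 1.0 trades a small amount of clipping on the very brightest pixels for a lower backlight level, which is where the power saving comes from.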
Proc. SPIE 8436, Optics, Photonics, and Digital Technologies for Multimedia Applications II
KEYWORDS: Point spread functions, Light emitting diodes, Detection and tracking algorithms, Image segmentation, Image resolution, LCDs, LED backlight, Transmittance, High dynamic range imaging, Optimization (mathematics)
Local backlight dimming in Liquid Crystal Displays (LCD) is a technique for reducing power consumption and
simultaneously increasing contrast ratio to provide a High Dynamic Range (HDR) image reproduction. Several backlight
dimming algorithms exist with focus on reducing power consumption, while other algorithms aim at enhancing contrast,
with power savings as a side effect. In our earlier work, we have modeled backlight dimming as a linear programming
problem, where the target is to minimize the cost function measuring the distance between ideal and actual output. In this
paper, we propose a version of the abovementioned algorithm, speeding up execution by decreasing the number of input
variables. This is done by using a subset of the input pixels, selected among the ones experiencing leakage or clipping
distortions. The optimization problem is then solved on this subset. Sample reduction can also be beneficial in
conjunction with other approaches, such as an algorithm based on gradient descent, also presented here. All the proposals
have been compared against other known approaches on simulated edge- and direct-lit displays, and the results show that
the optimal distortion level can be reached using a subset of pixels, with significantly reduced computational load
compared to the optimal algorithm with the full image.
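The sample-reduction step can be sketched as follows: under a given backlight, only pixels darker than the leakage floor or brighter than what the dimmed backlight can reproduce contribute leakage or clipping distortion, so only they need to enter the optimization. The names and the leakage constant are illustrative assumptions:

```python
def distortion_subset(pixels, backlight, leakage=0.005):
    """Select pixels likely to suffer leakage or clipping distortion
    (illustrative sample-reduction step).

    pixels    -- target values in [0, 1]
    backlight -- local backlight level in [0, 1] at each pixel
    Returns indices of pixels whose target is either below the leakage
    floor (leakage distortion) or above what the dimmed backlight can
    reproduce (clipping distortion).
    """
    subset = []
    for i, (t, b) in enumerate(zip(pixels, backlight)):
        if t < leakage * b or t > b:
            subset.append(i)
    return subset

# Under a backlight of 0.6, the black pixel leaks and the 0.9 pixel clips.
print(distortion_subset([0.0, 0.5, 0.9], [0.6, 0.6, 0.6]))  # [0, 2]
```

The linear program (or the gradient-descent variant) is then solved over this index set instead of the full image, which is what reduces the computational load.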
Traditionally, algorithm-based (objective) image and video quality assessment methods operate with the numerical
representation of the signal, and they do not take the characteristics of the actual output device into account. This is a
reasonable approach, when quality assessment is needed for evaluating the signal quality distortion related directly to
digital signal processing, such as compression. However, the physical characteristics of the display device also have a
significant impact on the overall perception. In order to facilitate image quality assessment on modern liquid crystal
displays (LCDs) using light emitting diode (LED) backlight with local dimming, we present the essential considerations
and guidelines for modeling the characteristics of displays with high dynamic range (HDR) and locally adjustable
backlight segments. The representation of the image generated by the model can be assessed using the traditional
objective metrics, and therefore the proposed approach is useful for assessing the performance of different backlight
dimming algorithms in terms of resulting quality and power consumption in a simulated environment. We have
implemented the proposed model in C++ and compared the visual results produced by the model against respective images displayed on a real display with locally controlled backlight units.
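A forward display model of this kind can be sketched in one dimension; the exponential point spread function below is an assumed shape for illustration, not a measured PSF:

```python
def render(backlight_segments, psf, transmittance):
    """Forward display model (illustrative 1-D sketch).

    backlight_segments -- list of (position, level) pairs for the segments
    psf                -- point spread function of distance in pixels
    transmittance      -- LC transmittance per pixel in [0, 1]

    The backlight at each pixel is the sum of segment contributions
    spread by the PSF; the displayed value is that backlight modulated
    by the LC transmittance.
    """
    n = len(transmittance)
    bl = [0.0] * n
    for seg_pos, seg_level in backlight_segments:
        for x in range(n):
            bl[x] += seg_level * psf(abs(x - seg_pos))
    return [bl[x] * transmittance[x] for x in range(n)]

# Two segments with a simple exponential PSF (assumed shape, not measured)
out = render([(0, 1.0), (3, 0.5)], lambda d: 0.5 ** d, [1.0, 1.0, 0.0, 1.0])
```

The resulting per-pixel image can then be fed to conventional objective metrics, which is what makes the model usable for comparing dimming algorithms in simulation.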
Traditional mechanisms for congestion control in multimedia streaming systems reduce the data transmission rate when
congestion is detected. Unfortunately, decreasing the rate of the media stream also decreases the media quality, but it is
the only way to combat congestion when it is caused by overwhelming traffic that exceeds the capacity of the network.
However, if the bottleneck is a wireless link, congestion is often derived from retransmissions caused by bit errors in the
radio link. If this is the case, it might be beneficial not to reduce the transmission rate, but allow delivery of packets
containing bit errors up to the application layer first. In this scenario, the quality of media will be impacted by bit errors
instead of lower coding rate. In this paper, we propose a system concept allowing bit errors in packets in order to relieve
congestion. We have built a simulation to compare the performance of the proposed system against traditional
congestion control. The results show that the proposed approach can improve the overall performance both by increasing
the throughput over the wireless link and improving the perceived video quality in terms of peak signal-to-noise ratio (PSNR).
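The decision logic of the proposed concept can be sketched as follows; the loss-cause classification and the back-off constants are illustrative assumptions, not parameters from the simulation:

```python
def handle_loss(loss_cause, rate, min_rate=0.25):
    """Congestion response sketch (illustrative assumptions).

    loss_cause -- "congestion" for losses from an overloaded network,
                  "bit_error" for corruption on the wireless link
    rate       -- current normalized sending rate

    If loss stems from network congestion, back off the sending rate and
    drop the packet; if it stems from wireless bit errors, keep the rate
    and deliver the damaged packet up to the application instead of
    triggering retransmission.
    """
    if loss_cause == "congestion":
        return max(rate * 0.5, min_rate), False  # halve rate, drop packet
    return rate, True                            # keep rate, deliver packet
```

The trade-off the paper measures is exactly this branch: bit errors degrade the decoded media directly, but avoid the larger quality loss of a reduced coding rate.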
This paper presents two approaches to efficient service development for Internet Telephony. In the first approach, we consider services ranging from core call signaling features and media control, as specified in ITU-T's H.323, to end-user services that support user interaction. The second approach builds on IETF's SIP protocol. We compare the two from differing architectural perspectives and in terms of economy of network and terminal development, and propose efficient architecture models for both protocols. In their design, the main criteria were component independence, lightweight operation and portability in heterogeneous end-to-end environments. In the proposed architecture, the vertical division of call signaling and streaming media control logic allows the components to be used either individually or combined, depending on the level of functionality required by an application.