Superscalar Huffman decoder hardware design
Abstract
Huffman coding is one of the most common forms of lossless data compression. Many lossy image compression standards, for example MPEG and JPEG, use Huffman coding as the back-end entropy compressor because of its relatively good compression performance and simple hardware implementation. However, decoding speed is limited by a feedback loop: the length of the current codeword must be known before the next codeword can be located in the bit stream. For applications that require high-speed decoding, such as High Definition Television at about 100 Mbyte/s, this feedback loop can be prohibitively slow. The fastest conventional 'parallel' Huffman decoders decode one complete codeword per look-up table memory cycle. This paper describes three hardware designs that break through this limit. All three depend on probabilistic modeling of the coded data stream to predict, or speculate on, the values of adjacent codewords. One design uses a single fully specified memory wide enough for two or more output tokens. The other two designs use multiple memories, each fed by a different portion of the code stream. For a simulation of JPEG Huffman token data, this superscalar approach yields average decode rates two or more times that of a conventional 'parallel' decoder. The relative performance versus hardware cost is described for each design.
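The feedback loop and the wide-memory design can be summarized with a short behavioral model. The C sketch below is illustrative only and not taken from the paper; the names LOOKUP_BITS, WideEntry, and decode are hypothetical, and the table is assumed to be fully specified so that every index resolves at least one token. Each loop iteration stands in for one memory cycle: the next table address depends on the number of bits the current entry consumed, which is exactly the serial dependence the paper's speculative designs attack.

```c
#include <stdint.h>
#include <stddef.h>

#define LOOKUP_BITS 12  /* hypothetical table index width; must cover
                           the longest codeword (and, for the wide
                           design, common pairs of codewords) */

/* One entry of a fully specified lookup table. In the 'wide' design,
   an entry can carry up to two decoded tokens plus the total number
   of code bits they consume together. */
typedef struct {
    uint8_t  num_tokens;  /* 1 or 2 tokens resolved by this entry */
    uint8_t  total_bits;  /* code bits consumed by those tokens */
    uint16_t token[2];    /* decoded symbols */
} WideEntry;

/* Behavioral model of the decode loop. Each iteration models one
   look-up table memory cycle: peek LOOKUP_BITS bits, index the table,
   emit the entry's tokens, then advance the bit pointer by total_bits.
   A conventional 'parallel' decoder emits one token per cycle; the
   wide table amortizes the same feedback loop over two tokens. */
size_t decode(const WideEntry table[1 << LOOKUP_BITS],
              const uint8_t *stream, size_t bit_len,
              uint16_t *out, size_t out_cap)
{
    size_t bitpos = 0, n = 0;
    while (bitpos + LOOKUP_BITS <= bit_len && n < out_cap) {
        /* gather LOOKUP_BITS bits starting at bitpos, MSB first */
        uint32_t window = 0;
        for (int i = 0; i < LOOKUP_BITS; i++) {
            size_t b = bitpos + i;
            window = (window << 1) | ((stream[b >> 3] >> (7 - (b & 7))) & 1);
        }
        const WideEntry *e = &table[window];
        for (int t = 0; t < e->num_tokens && n < out_cap; t++)
            out[n++] = e->token[t];
        /* feedback loop: the next table address cannot be formed
           until total_bits of this entry is known */
        bitpos += e->total_bits;
    }
    return n;  /* number of tokens decoded */
}
```

The multi-memory designs described in the paper instead start additional lookups at speculated bit offsets in parallel and keep the results whose starting positions turn out to be correct; that control logic is omitted here.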
© 1994 Society of Photo-Optical Instrumentation Engineers (SPIE).
Edward L. Schwartz, Martin P. Boliek, James D. Allen, and David W. Bednash, "Superscalar Huffman decoder hardware design," Proc. SPIE 2186, Image and Video Compression (1 May 1994); doi: 10.1117/12.173931; https://doi.org/10.1117/12.173931