Paper
Compressed data organization for high throughput parallel entropy coding
Amir Said, Abo-Talib Mahfoodh, and Sehoon Yea
22 September 2015
Abstract
The difficulty of parallelizing entropy coding increasingly limits the data throughputs achievable in media compression. In this work we analyze the fundamental limitations, using finite-state-machine models to identify how to separate tasks that can be processed independently while minimizing compression losses. This analysis confirms previous work showing that effective parallelization is feasible only if the compressed data is organized in a suitable way, quite different from conventional formats. The proposed new formats exploit the fact that optimal compression is not affected by the arrangement of the coded bits, and go further by exploiting the decreasing cost of data processing and memory. Additional advantages include the ability to use increasingly complex data-modeling techniques within this framework, and the freedom to mix different types of coding. We confirm the effectiveness of the parallelization with coding simulations running on multi-core processors, show how throughput scales with the number of cores, and analyze the additional bit-rate overhead.
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Amir Said, Abo-Talib Mahfoodh, and Sehoon Yea "Compressed data organization for high throughput parallel entropy coding", Proc. SPIE 9599, Applications of Digital Image Processing XXXVIII, 95991K (22 September 2015); https://doi.org/10.1117/12.2191624
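The core idea of the abstract, that entropy decoding parallelizes once the compressed data is packaged as independently decodable segments, can be sketched in a few lines. The following Python example is illustrative only: it assumes a hypothetical container with a small length table followed by self-contained segments, uses zlib as a stand-in entropy coder (the paper's coder and format are not reproduced here), and the names pack_segments, unpack_segments, and parallel_decode are inventions for this sketch.

    # Sketch only: zlib stands in for the entropy coder; the segment-table
    # layout is hypothetical, not the format proposed in the paper.
    import struct
    import zlib
    from concurrent.futures import ProcessPoolExecutor

    def pack_segments(raw: bytes, n_segments: int) -> bytes:
        """Split raw data into independently compressed segments, prefixed
        by a table of segment lengths so a decoder can locate each one."""
        step = -(-len(raw) // n_segments)  # ceiling division
        chunks = [raw[i:i + step] for i in range(0, len(raw), step)]
        payloads = [zlib.compress(c) for c in chunks]
        header = struct.pack(f"<I{len(payloads)}I", len(payloads),
                             *(len(p) for p in payloads))
        return header + b"".join(payloads)

    def unpack_segments(blob: bytes) -> list[bytes]:
        """Read the length table and slice out the compressed segments."""
        (count,) = struct.unpack_from("<I", blob, 0)
        lengths = struct.unpack_from(f"<{count}I", blob, 4)
        offset = 4 + 4 * count
        segments = []
        for n in lengths:
            segments.append(blob[offset:offset + n])
            offset += n
        return segments

    def parallel_decode(blob: bytes, workers: int = 4) -> bytes:
        """Decode all segments concurrently; each segment is self-contained,
        so no decoder state crosses segment boundaries."""
        segments = unpack_segments(blob)
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return b"".join(pool.map(zlib.decompress, segments))

    if __name__ == "__main__":
        data = bytes(range(256)) * 4096
        blob = pack_segments(data, n_segments=8)
        assert parallel_decode(blob) == data

The length table in the header is the kind of per-segment side information that makes such a format slightly larger than a single sequential stream; that bit-rate overhead versus throughput trade-off is what the paper analyzes.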
KEYWORDS
Data modeling, Computer programming, Binary data, Computing systems, Parallel processing, Control systems, Data processing