Finite state residual vector quantization with neural network state prediction
28 March 1995
Abstract
The major problems with finite-state vector quantization (FSVQ) are the lack of accurate prediction of the current state, the difficulty of state codebook design, and the amount of memory required to store all state codebooks. This paper presents a new FSVQ scheme called finite-state residual vector quantization (FSRVQ), in which a neural-network-based state prediction is used. Furthermore, a novel tree-structured competitive neural network is used to jointly design the next-state and state codebooks for the proposed FSRVQ. The proposed FSRVQ differs from conventional FSVQ in that the state codebooks encode residual vectors instead of the original vectors. The neural network predictor predicts the current block from the four previously encoded blocks. The index of the codevector closest to the predicted vector (in the Euclidean-distance sense) represents the current state. The residual vector, obtained by subtracting the predicted vector from the original vector, is then encoded using the current state codebook. The neural network predictor is trained using the back-propagation learning algorithm. The next-state codebook and the corresponding state codebooks are jointly designed using the tree-structured competitive neural network. This joint optimization eliminates a large number of unnecessary states, which in turn reduces the memory requirement by several orders of magnitude compared to ordinary FSVQ.
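The encoding procedure described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all function and variable names are hypothetical, and a simple fixed average over the four neighbor blocks stands in for the trained neural network predictor.

```python
# Hypothetical sketch of one FSRVQ encoding step. Names are illustrative;
# a back-propagation-trained predictor would replace the averaging stand-in.

def euclidean_sq(a, b):
    """Squared Euclidean distance between two vectors (as lists)."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest(codebook, v):
    """Index of the codevector in `codebook` closest to v."""
    return min(range(len(codebook)), key=lambda i: euclidean_sq(codebook[i], v))

def fsrvq_encode_block(block, neighbors, next_state_cb, state_cbs):
    # 1) Predict the current block from the four previously encoded blocks
    #    (stand-in for the neural network predictor: a plain average).
    n = len(block)
    predicted = [sum(nb[k] for nb in neighbors) / len(neighbors) for k in range(n)]
    # 2) The index of the codevector in the next-state codebook closest to
    #    the predicted vector is the current state.
    state = nearest(next_state_cb, predicted)
    # 3) Encode the residual (original minus prediction) with that state's
    #    codebook; the decoder adds the quantized residual back.
    residual = [block[k] - predicted[k] for k in range(n)]
    idx = nearest(state_cbs[state], residual)
    recon = [predicted[k] + state_cbs[state][idx][k] for k in range(n)]
    return state, idx, recon
```

Note that only the residual index `idx` needs to be transmitted: the decoder has the same previously decoded neighbor blocks, so it can recompute the prediction and the state on its own, which is what makes the finite-state structure side-information-free.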
© (1995) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Syed A. Rizvi, Nasser M. Nasrabadi, "Finite state residual vector quantization with neural network state prediction", Proc. SPIE 2424, Nonlinear Image Processing VI, (28 March 1995); doi: 10.1117/12.205246; https://doi.org/10.1117/12.205246
Proceedings paper, 12 pages