1 April 1997 Image compression using a self-organized neural network
In the research described in this paper, we implemented and evaluated a linear self-organized feedforward neural network for image compression. Based on the generalized Hebbian learning algorithm (GHA), the neural network extracts the principal components of the auto-correlation matrix of the input images. To do so, an image is first divided into mutually exclusive square blocks of size m × m. Each block represents a feature vector of dimension m² in the feature space. The input dimension of the neural net is therefore m² and the output dimension is m. Training with GHA over the blocks then yields a weight matrix of dimension m × m², whose rows are the eigenvectors of the auto-correlation matrix of the input image blocks. Projecting each image block onto the extracted eigenvectors yields m coefficients per block. Image compression is then accomplished by quantizing and coding the coefficients of each block. To evaluate the performance of the neural network, two experiments were conducted using standard IEEE images. First, the neural net was used to compress images at different bit rates using different block sizes. Second, to test the neural network's generalization capability, the set of principal components extracted from one image was used to compress different but statistically similar images. The evaluation, based on both visual inspection and statistical measures (NMSE and SNR) of the reconstructed images, demonstrates that the network yields satisfactory image compression performance and possesses good generalization capability.
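The pipeline the abstract describes (blocking, GHA training via Sanger's rule, projection, reconstruction, NMSE/SNR evaluation) can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the learning rate, epoch count, and initialization scale are assumptions, and quantization/coding of the coefficients is omitted.

```python
import numpy as np

def gha_train(blocks, n_components, lr=1e-3, epochs=50, seed=0):
    """Sanger's generalized Hebbian algorithm (GHA).

    blocks: (N, m*m) array of flattened image blocks.
    Returns W of shape (n_components, m*m), whose rows approximate the
    leading eigenvectors of the blocks' auto-correlation matrix.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.1, size=(n_components, blocks.shape[1]))
    for _ in range(epochs):
        for x in blocks:
            y = W @ x  # m output coefficients for this block
            # Sanger's rule: dW = lr * (y x^T - LT(y y^T) W),
            # where LT(.) keeps the lower-triangular part.
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

def to_blocks(image, m):
    """Split an image into non-overlapping m x m blocks, flattened."""
    h, w = image.shape
    return np.array([image[i:i + m, j:j + m].ravel()
                     for i in range(0, h - h % m, m)
                     for j in range(0, w - w % m, m)])

def compress(image, W, m):
    """Project each block onto the learned eigenvectors (rows of W)."""
    return to_blocks(image, m) @ W.T

def reconstruct(coeffs, W, shape, m):
    """Approximate inverse projection: x_hat = W^T y for each block."""
    h, w = shape
    img = np.zeros(shape)
    k = 0
    for i in range(0, h - h % m, m):
        for j in range(0, w - w % m, m):
            img[i:i + m, j:j + m] = (W.T @ coeffs[k]).reshape(m, m)
            k += 1
    return img

def nmse(orig, rec):
    """Normalized mean squared error between original and reconstruction."""
    return np.sum((orig - rec) ** 2) / np.sum(orig ** 2)
```

SNR in dB then follows as `10 * np.log10(1 / nmse(orig, rec))`. Storing m coefficients in place of m² pixels per block gives the compression; the bit rate is set by how coarsely those coefficients are quantized and coded, which this sketch does not model.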
© 1997 Society of Photo-Optical Instrumentation Engineers (SPIE).
Qiang Ji "Image compression using a self-organized neural network", Proc. SPIE 3030, Applications of Artificial Neural Networks in Image Processing II, (1 April 1997);

