Finite-state neural networks for dimensionality reduction and smooth signal reconstruction (4 March 1996)
Abstract
A finite-state auto-associative MLP neural network is studied in the context of dimensionality reduction and smooth signal reconstruction. We describe the structure and the training procedure of the finite-state network. A desirable property of the auto-associative MLP is that the variances of the hidden units' outputs can be arranged in descending order, enabling efficient coding of the hidden-layer outputs. We provide experimental results demonstrating that the finite-state network retains this desirable property of its memoryless counterpart. One application area of the auto-associative MLP is image compression. As with other block-based image compression techniques, this method cannot avoid the problem of annoying 'blocking effects' in the reconstructed images. We present simulation results demonstrating that the finite-state auto-associative MLP achieves effective image data compression while significantly reducing these blocking effects.
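The variance-ordering property the abstract describes can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' implementation: it uses a *linear* auto-associative network trained by plain gradient descent on synthetic data standing in for image blocks, whereas the paper's finite-state MLP is more elaborate. After training, the hidden units are simply reordered (together with the matching decoder rows) so their output variances are descending, which is what permits efficient coding of the hidden layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for image blocks: 500 samples of 8-D signals
# that are approximately rank-3 (illustrative, not the paper's data).
latent = rng.normal(size=(500, 3)) * np.array([3.0, 2.0, 1.0])
X = latent @ rng.normal(size=(3, 8)) + 0.05 * rng.normal(size=(500, 8))
X -= X.mean(axis=0)
X /= X.std(axis=0)  # normalise features for stable gradient descent

# Linear auto-associative network 8 -> 3 -> 8, trained to reproduce
# its own input, so the bottleneck performs dimensionality reduction.
W_enc = 0.1 * rng.normal(size=(8, 3))
W_dec = 0.1 * rng.normal(size=(3, 8))
lr = 0.05
for _ in range(3000):
    H = X @ W_enc                          # hidden-layer outputs
    err = H @ W_dec - X                    # reconstruction error
    W_dec -= lr * (H.T @ err) / len(X)
    W_enc -= lr * (X.T @ (err @ W_dec.T)) / len(X)

# Reorder hidden units so their output variances are descending --
# the property that makes efficient coding of the hidden layer possible.
order = np.argsort((X @ W_enc).var(axis=0))[::-1]
W_enc = W_enc[:, order]                    # permute encoder columns
W_dec = W_dec[order]                       # permute matching decoder rows
variances = (X @ W_enc).var(axis=0)
print(np.round(variances, 3))              # descending by construction
```

Note that permuting encoder columns and decoder rows together leaves the reconstruction unchanged; only the *labelling* of the hidden units changes, so the highest-variance (most informative) unit comes first for coding.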
© (1996) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Guoping Qiu and Martin R. Varley, "Finite-state neural networks for dimensionality reduction and smooth signal reconstruction", Proc. SPIE 2664, Applications of Artificial Neural Networks in Image Processing, 4 March 1996; https://doi.org/10.1117/12.234253
Proceedings paper, 11 pages.

