Wavelet data compression for neural network preprocessing
Preprocessing is beneficial before classification with neural networks: eliminating irrelevant data yields faster learning, both because the datasets are smaller and because confusion caused by irrelevant data is reduced. In this paper we demonstrate a further benefit from the smoothing that may be accomplished at the same time. A common trade-off with neural networks is between accuracy of classification on training sets and accuracy on testing sets not used for training. Classification of testing sets requires the network to interpolate. We show that the smoothing obtained by data compression, by omitting high-frequency components of the wavelet transform, can enhance interpolation, thus improving classification on testing data sets. A wavelet transform decomposes a signal obtained from a radar simulator into frequency and spatial domains using a Mexican hat wavelet. Varying cut-off frequencies are used in omitting the higher-frequency components of the wavelet transform. An inverse wavelet transform shows the least-squares degradation in the signal due to smoothing. We demonstrate that omitting high-frequency terms results in faster computation in neural network learning and provides better interpolation; that is, it increases classification performance on testing data sets. The reasons are explained. The wavelet compression results are compared with low-pass filtering.
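The paper does not publish its code, but the idea it describes (decompose a signal with Mexican hat wavelets at several scales, discard the small-scale, high-frequency terms, and reconstruct a smoothed signal) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names `mexican_hat` and `wavelet_smooth`, the redundant convolution-based transform, and the crude averaged reconstruction are all assumptions made here for clarity.

```python
import numpy as np

def mexican_hat(t, s):
    # Mexican hat (Ricker) wavelet at scale s:
    # the negated second derivative of a Gaussian.
    x = t / s
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def wavelet_smooth(signal, scales, keep):
    """Crude redundant wavelet smoothing (illustrative, not the paper's code):
    convolve the signal with Mexican hat wavelets at several scales,
    omit the finest (highest-frequency) scales below `keep`, and
    average the retained responses as an approximate reconstruction."""
    n = len(signal)
    t = np.arange(-n // 2, n // 2)
    recon = np.zeros(n)
    kept = 0
    for s in scales:
        if s < keep:          # omit high-frequency (small-scale) terms
            continue
        w = mexican_hat(t, s)
        w /= np.sum(np.abs(w))            # normalize the kernel
        recon += np.convolve(signal, w, mode="same")
        kept += 1
    return recon / max(1, kept)
```

With a noisy test signal, raising `keep` discards more high-frequency detail, trading least-squares fidelity for the smoothness that the abstract argues helps the network interpolate on unseen testing data.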
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Alastair D. McAulay and Jian Tian Li "Wavelet data compression for neural network preprocessing", Proc. SPIE 1699, Signal Processing, Sensor Fusion, and Target Recognition, (9 July 1992);
