Generalization performance of neural nets in the presence of noisy data
16 December 1992
An outstanding problem in the study of adaptive learning is overspecialization of the learning system, and its consequent inability to handle new data correctly. A means of addressing this difficulty is described here. When used in conjunction with standard processes such as backpropagation, it identifies the level of corruption of the training sample, and thus provides a 'best fit' to the entire domain of interest, rather than to the training sample alone. This is accomplished by a combination of simulated annealing, bootstrap estimation, and analysis methods derived from statistical mechanics. Its advantage is that data need not be reserved for an independent test set, and thus all available samples are used. A modified generalization error, defined through a thermalization parameter on the training set, provides a measure of the sample space consistent with the network function. A criterion for optimal match between network and sample set is obtained from the requirement that generalization error and training error be consistent. Numerical results are presented for examples which illustrate several distinct forms of data corruption. A quantity analogous to the specific heat in thermodynamic systems is found to exhibit anomalies at values of training error near the onset of overtraining.
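The abstract's key practical claim is that bootstrap estimation lets every available sample serve double duty, so no data need be reserved for an independent test set. The paper's own construction (thermalization parameter, statistical-mechanics analysis) is not reproduced here; the sketch below shows only the generic out-of-bag bootstrap idea it builds on, using a toy polynomial fit as a stand-in for a trained network. The dataset, model, and all function names are illustrative assumptions, not the author's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noisy sample: y = sin(x) plus Gaussian noise (hypothetical data,
# standing in for a corrupted training set).
n = 60
x = np.linspace(0.0, np.pi, n)
y = np.sin(x) + rng.normal(scale=0.2, size=n)

def fit_model(xs, ys, degree=5):
    """Least-squares polynomial fit, a stand-in for backpropagation training."""
    return np.polynomial.Polynomial.fit(xs, ys, degree)

def mse(model, xs, ys):
    """Mean squared error of the fitted model on the given points."""
    return float(np.mean((model(xs) - ys) ** 2))

def oob_generalization_error(x, y, n_boot=200, degree=5):
    """Out-of-bag bootstrap estimate of generalization error.

    Each resample trains on a bootstrap draw (with replacement) and is
    scored on the points left out of that draw, so every sample is used
    for both training and validation -- no fixed test set is reserved.
    """
    errors = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), size=len(x))       # bootstrap draw
        oob = np.setdiff1d(np.arange(len(x)), idx)       # left-out points
        if oob.size == 0:
            continue
        model = fit_model(x[idx], y[idx], degree)
        errors.append(mse(model, x[oob], y[oob]))
    return float(np.mean(errors))

train_err = mse(fit_model(x, y), x, y)
gen_err = oob_generalization_error(x, y)

# Overtraining shows up as the generalization estimate pulling away from
# the training error; the paper's criterion requires the two to be consistent.
print(f"training error: {train_err:.4f}, bootstrap generalization error: {gen_err:.4f}")
```

In this sketch the gap between the two error estimates plays the role of the paper's consistency criterion: a model matched to the true corruption level should show comparable training and generalization error, while an overtrained one inflates the out-of-bag estimate.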
© 1992 Society of Photo-Optical Instrumentation Engineers (SPIE).
Marjorie Klenin, "Generalization performance of neural nets in the presence of noisy data", Proc. SPIE 1766, Neural and Stochastic Methods in Image and Signal Processing, (16 December 1992); doi: 10.1117/12.130828; https://doi.org/10.1117/12.130828
