Differential theory of learning for efficient neural network pattern recognition
19 August 1993
John B. Hampshire and Bhagavatula Vijaya Kumar
Abstract
We describe a new theory of differential learning by which a broad family of pattern classifiers (including many well-known neural network paradigms) can learn stochastic concepts efficiently. We describe the relationship between a classifier's ability to generalize well to unseen test examples and the efficiency of the strategy by which it learns. We list a series of proofs that differential learning is efficient in its information and computational resource requirements, whereas traditional probabilistic learning strategies are not. The proofs are illustrated by a simple example that lends itself to closed-form analysis. We conclude with an optical character recognition task for which three different types of differentially generated classifiers generalize significantly better than their probabilistically generated counterparts.
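To make the contrast concrete, the sketch below compares a probabilistic learning strategy (least-squares regression to one-hot class-posterior targets) with a differential one (a smooth, margin-based penalty in the spirit of a classification figure of merit) on synthetic two-class data. This is a minimal illustration under stated assumptions, not the paper's derivation or experiments: the toy data, the linear classifier, the sigmoid slope, and all function names here are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class Gaussian data (a stand-in for the paper's closed-form
# example, which is not reproduced here).
n = 200
X = np.vstack([rng.normal(-1.0, 1.5, (n, 2)), rng.normal(+1.0, 1.5, (n, 2))])
y = np.repeat([0, 1], n)        # class labels
T = np.eye(2)[y]                # one-hot posterior targets

def forward(W, b, X):
    """Linear discriminants; one output per class."""
    return X @ W + b

def mse_grad(O, T):
    """Probabilistic-style objective: squared error against one-hot
    posterior targets (the classical least-squares strategy)."""
    return 2.0 * (O - T) / len(O)

def cfm_grad(O, y, slope=4.0):
    """Differential-style objective: a sigmoidal penalty on the margin
    between the correct output and its best competitor. The slope value
    is an assumed hyperparameter, not one taken from the paper."""
    n, _ = O.shape
    idx = np.arange(n)
    O_true = O[idx, y]
    masked = O.copy()
    masked[idx, y] = -np.inf
    rival = np.argmax(masked, axis=1)          # strongest incorrect class
    margin = O_true - O[idx, rival]
    z = np.clip(slope * margin, -30.0, 30.0)   # clip for numerical stability
    s = 1.0 / (1.0 + np.exp(z))                # penalty shrinks as margin grows
    coef = slope * s * (1.0 - s) / n
    g = np.zeros_like(O)
    g[idx, y] -= coef                          # push correct output up
    g[idx, rival] += coef                      # push best rival down
    return g

def train(grad_fn, target, steps=500, lr=0.5):
    """Plain gradient descent; returns the resulting training error rate."""
    W, b = np.zeros((2, 2)), np.zeros(2)
    for _ in range(steps):
        G = grad_fn(forward(W, b, X), target)
        W -= lr * (X.T @ G)
        b -= lr * G.sum(axis=0)
    return np.mean(np.argmax(forward(W, b, X), axis=1) != y)

print("MSE (probabilistic) training error:      ", train(mse_grad, T))
print("CFM-style (differential) training error:  ",
      train(lambda O, _: cfm_grad(O, y), T))
```

The design point the sketch tries to surface is the one the abstract makes: the probabilistic strategy spends capacity estimating class posteriors everywhere, while the differential strategy only adjusts the two outputs that determine the classification decision on each example.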
© 1993 Society of Photo-Optical Instrumentation Engineers (SPIE).
John B. Hampshire and Bhagavatula Vijaya Kumar, "Differential theory of learning for efficient neural network pattern recognition," Proc. SPIE 1966, Science of Artificial Neural Networks II (19 August 1993); doi: 10.1117/12.152617; https://doi.org/10.1117/12.152617
Proceedings paper, 20 pages.

