Paper | 19 August 1993

Training autoassociative recurrent neural network with preprocessed training data

Arun Maskara, Andrew Noetzel
Abstract
The Auto-Associative Recurrent Network (AARN), a modified version of the Simple Recurrent Network (SRN), can be trained to behave as a recognizer of a language generated by a regular grammar. The network trains successfully on an unbounded number of sequences of the language, generated randomly from the Finite State Automaton (FSA) of the language. But the training algorithm fails when training is restricted to a fixed finite set of examples. Here, we present a new algorithm for training the AARN from a finite set of language examples. A tree is constructed by preprocessing the training data, and the AARN is trained with sequences generated randomly from the tree. The results of the simulation experiments are discussed.
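The abstract does not give the details of the preprocessing step; the following is a minimal sketch of one plausible reading, assuming the tree is a prefix tree (trie) built over the finite training set, from which random root-to-leaf walks then supply training sequences for the network. The function names and the end-of-sequence convention are illustrative, not taken from the paper.

```python
import random

def build_tree(sequences):
    """Build a prefix tree over a finite set of training sequences.

    Each node maps a symbol to its child subtree; the key None marks
    the end of a complete training sequence (assumed convention).
    """
    root = {}
    for seq in sequences:
        node = root
        for sym in seq:
            node = node.setdefault(sym, {})
        node[None] = {}  # end-of-sequence marker
    return root

def sample_sequence(tree, rng=random):
    """Randomly walk the tree from the root until an end marker is
    reached, yielding one sequence to present to the network."""
    seq, node = [], tree
    while True:
        sym = rng.choice(list(node.keys()))
        if sym is None:
            return seq
        seq.append(sym)
        node = node[sym]

# Toy example: a few strings from a regular language over {a, b}
data = ["ab", "aab", "abb"]
tree = build_tree(data)
sample = sample_sequence(tree)  # always reproduces one training string
```

Under this reading, sampling from the tree (rather than cycling through the fixed list) gives the trainer an unbounded randomized stream of examples while never generating a string outside the training set.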
© (1993) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Arun Maskara and Andrew Noetzel "Training autoassociative recurrent neural network with preprocessed training data", Proc. SPIE 1966, Science of Artificial Neural Networks II, (19 August 1993); https://doi.org/10.1117/12.152645
CITATIONS: Cited by 1 scholarly publication.
KEYWORDS: Neural networks, Artificial neural networks, Computer programming, Detection and tracking algorithms, Computer science, Data hiding, Image processing
