Neural Network Model For Fast Learning And Retrieval
Abstract
An approach to learning in a multilayer neural network is presented. The proposed network learns by creating interconnections between the input layer and the intermediate layer. In one of the new storage prescriptions proposed, interconnections are excitatory (positive) only and the weights depend on the stored patterns. In the intermediate layer each mother cell is responsible for one stored pattern. Mutually interconnected neurons in the intermediate layer perform a winner-take-all operation, taking into account correlations between stored vectors. The performance of networks using this interconnection prescription is compared with two previously proposed schemes, one using inhibitory connections at the output and one using all-or-nothing interconnections. The network can be used as a content-addressable memory or as a symbolic substitution system that yields an arbitrarily defined output for any input. The training of a model to perform Boolean logical operations is also described. Computer simulations using the network as an autoassociative content-addressable memory show the model to be efficient. Content-addressable associative memories and neural logic modules can be combined to perform logic operations on highly corrupted data.
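The retrieval scheme sketched in the abstract can be illustrated in a few lines of Python. This is a hedged sketch, not the paper's exact prescription: the names `store` and `retrieve`, the choice of binary 0/1 patterns, and the use of pattern copies as the excitatory weights are illustrative assumptions. Each "mother cell" in the intermediate layer holds non-negative weights tied to one stored pattern; a winner-take-all step among those cells selects the best match, and the output layer emits either the winning stored pattern (autoassociative memory) or an arbitrarily assigned output (symbolic substitution).

```python
def store(patterns):
    # Each intermediate "mother cell" keeps a copy of one stored pattern
    # as its excitatory (non-negative) input weights.
    return [list(p) for p in patterns]

def retrieve(weights, probe, outputs=None):
    # Overlap of the probe with each mother cell's weight vector,
    # followed by a winner-take-all among the intermediate neurons.
    activations = [sum(w * x for w, x in zip(row, probe)) for row in weights]
    winner = max(range(len(activations)), key=activations.__getitem__)
    # Autoassociative memory returns the stored pattern itself;
    # symbolic substitution returns an arbitrarily defined output instead.
    targets = weights if outputs is None else outputs
    return winner, targets[winner]

# Autoassociative use: three stored binary patterns, one corrupted probe.
patterns = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0]]
W = store(patterns)
winner, recalled = retrieve(W, [1, 1, 0, 1])  # one flipped bit
# winner == 0; recalled == [1, 1, 0, 0]
```

Passing an `outputs` list turns the same network into a symbolic substitution system: the winner-take-all stage still selects the closest stored pattern, but the emitted output can be any vector assigned to that cell, which is how the abstract's Boolean-logic and highly-corrupted-data applications can be composed.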
Henri H. Arsenault, Bohdan Macukow, "Neural Network Model For Fast Learning And Retrieval," Optical Engineering 28(5), 285506 (1 May 1989). https://doi.org/10.1117/12.7976989
JOURNAL ARTICLE
7 PAGES