1 May 1989 Neural Network Model For Fast Learning And Retrieval
Optical Engineering, 28(5), 285506 (1989). doi:10.1117/12.7976989
Abstract
An approach to learning in a multilayer neural network is presented. The proposed network learns by creating interconnections between the input layer and the intermediate layer. In one of the new storage prescriptions proposed, interconnections are excitatory (positive) only and the weights depend on the stored patterns. In the intermediate layer each mother cell is responsible for one stored pattern. Mutually interconnected neurons in the intermediate layer perform a winner-take-all operation, taking into account correlations between stored vectors. The performance of networks using this interconnection prescription is compared with two previously proposed schemes, one using inhibitory connections at the output and one using all-or-nothing interconnections. The network can be used as a content-addressable memory or as a symbolic substitution system that yields an arbitrarily defined output for any input. The training of a model to perform Boolean logical operations is also described. Computer simulations using the network as an autoassociative content-addressable memory show the model to be efficient. Content-addressable associative memories and neural logic modules can be combined to perform logic operations on highly corrupted data.
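The retrieval mechanism described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes binary (0/1) stored patterns, uses each stored pattern directly as the excitatory input weights of one "mother cell" in the intermediate layer, and replaces the mutually interconnected winner-take-all dynamics with a direct maximum. All function and variable names are hypothetical.

```python
# Hedged sketch of a mother-cell / winner-take-all content-addressable
# memory in the spirit of the abstract. Assumes binary (0/1) patterns;
# weights are excitatory (non-negative) and depend on the stored patterns.

def store(patterns):
    """Create one intermediate-layer 'mother cell' per stored pattern;
    its input weights are a copy of that pattern."""
    return [list(p) for p in patterns]

def recall(weights, probe):
    """Autoassociative recall: each mother cell sums the excitatory
    input it receives from the probe; the winner-take-all step is
    modeled here by simply taking the cell with the largest activation,
    and that cell's stored pattern is emitted as the output."""
    activations = [sum(w_i * x_i for w_i, x_i in zip(w, probe))
                   for w in weights]
    winner = max(range(len(activations)), key=activations.__getitem__)
    return weights[winner]
```

A corrupted probe then retrieves the nearest stored pattern, e.g. `recall(store([[1,0,1,0],[0,1,0,1]]), [1,0,0,0])` returns `[1,0,1,0]`. For a heteroassociative (symbolic substitution) variant, the winner would index an arbitrary output table instead of the stored pattern itself.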
Henri H. Arsenault, Bohdan Macukow, "Neural Network Model For Fast Learning And Retrieval," Optical Engineering 28(5), 285506 (1 May 1989). http://dx.doi.org/10.1117/12.7976989
JOURNAL ARTICLE
7 PAGES
KEYWORDS
Neural networks
Statistical modeling
Content addressable memory
Logic
Computer simulations
Neurons
Symbolic substitution