Abstract
Traditional neural-network training methods such as backpropagation begin with a set of decision boundaries and optimize the network by gradually moving those boundaries. This approach requires a large number of iterations, and the network can easily become stuck in a local minimum. The algorithm presented here instead rapidly creates boundaries when they are needed and destroys them when they become obsolete; optimization is achieved through a 'survival of the fittest' competition among boundaries. Because the individual boundaries are not themselves optimized, the algorithm requires no iterative training and trains the network very quickly. It is well suited to high-dimensional analog inputs and analog outputs.
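The abstract does not give the paper's exact update rules, but the create-on-demand and prune-when-obsolete idea can be sketched roughly. Below is a minimal, hypothetical Python illustration: a prototype "boundary" is created whenever the current memory cannot reproduce a training pair within tolerance, reinforced when it is reused, and pruned when it fails to survive the competition. All names and thresholds here are illustrative assumptions, not the paper's algorithm.

```python
import math

class BoundaryMemory:
    """Illustrative sketch of a create-on-demand associative memory.

    Boundaries (stored prototypes) are created when needed, reinforced
    when reused, and destroyed when unused -- a rough stand-in for the
    'survival of the fittest' scheme described in the abstract.
    """

    def __init__(self, tol=0.1):
        self.tol = tol    # output error that triggers a new boundary (assumed)
        self.keys = []    # stored analog input vectors
        self.vals = []    # associated analog outputs
        self.hits = []    # reinforcement counts, used for pruning

    def recall(self, x):
        """Return the output of the nearest stored boundary and its index."""
        if not self.keys:
            return None, None
        dists = [math.dist(x, k) for k in self.keys]
        i = dists.index(min(dists))
        return self.vals[i], i

    def train(self, x, y):
        """Single-pass training: create a boundary only when recall fails."""
        out, i = self.recall(x)
        if out is None or abs(out - y) > self.tol:
            # Existing boundaries cannot reproduce this pair: create one.
            self.keys.append(list(x))
            self.vals.append(float(y))
            self.hits.append(1)
        else:
            # Pair already handled: reinforce the winning boundary.
            self.hits[i] += 1

    def prune(self, min_hits=2):
        """Destroy boundaries that were never reinforced (obsolete)."""
        keep = [j for j, h in enumerate(self.hits) if h >= min_hits]
        self.keys = [self.keys[j] for j in keep]
        self.vals = [self.vals[j] for j in keep]
        self.hits = [self.hits[j] for j in keep]
```

Note that training is a single pass over the data: no boundary is ever moved, so there is no iterative optimization loop and no local-minimum trap, consistent with the fast-training claim in the abstract.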
© (1995) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jason M. Kinser, "Fast analog associative memory", Proc. SPIE 2568, Neural, Morphological, and Stochastic Methods in Image and Signal Processing (11 August 1995); doi: 10.1117/12.216362; https://doi.org/10.1117/12.216362
Proceedings paper, 4 pages.

