Homogeneous And Layered Alternating Projection Neural Networks
8 February 1989
Proceedings Volume 0960, Real-Time Signal Processing for Industrial Applications; (1989) https://doi.org/10.1117/12.947804
Event: SPIE International Symposium on Optical Engineering and Industrial Sensing for Advanced Manufacturing Technologies, 1988, Dearborn, MI, United States
Abstract
We consider a class of neural networks whose performance can be analyzed and geometrically visualized in a signal-space environment. Alternating projection neural networks (APNNs) operate by alternately projecting between two or more constraint sets. Criteria for desired and unique convergence are easily established. The network can be taught from a training set by viewing each library vector only once. The network can be configured either as a content-addressable memory (homogeneous form) or as a classifier (layered form). The number of patterns that can be stored in the network is on the order of the number of input and hidden neurons. If the output neurons can take on only one of two states, the trained layered APNN can easily be configured to converge in one iteration. More generally, convergence is at an exponential rate. Convergence can be improved by the use of sigmoid-type nonlinearities, network relaxation, and/or increasing the number of neurons in the hidden layer. The manner in which the network generalizes can be directly evaluated.
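The alternating-projection operation at the core of an APNN can be illustrated with a minimal sketch of projection onto convex sets (POCS). The constraint sets below (a hyperplane and the nonnegative orthant) are illustrative assumptions, not the network's actual signal-space sets; the geometric (exponential-rate) convergence to a point in the intersection mirrors the convergence behavior described in the abstract.

```python
import numpy as np

def project_hyperplane(x, a, b):
    """Orthogonal projection of x onto the hyperplane {y : a . y = b}."""
    return x - ((a @ x - b) / (a @ a)) * a

def project_orthant(x):
    """Orthogonal projection onto the nonnegative orthant {y : y >= 0}."""
    return np.maximum(x, 0.0)

def alternating_projections(x0, a, b, iters=100):
    """Alternately project between the two sets. For convex sets with
    nonempty intersection, the iterates converge to a point in the
    intersection at a geometric (exponential) rate."""
    x = x0
    for _ in range(iters):
        x = project_orthant(project_hyperplane(x, a, b))
    return x

# Hypothetical example data: find a nonnegative vector on a hyperplane.
a = np.array([1.0, 2.0, -1.0])
b = 1.0
x = alternating_projections(np.array([-3.0, 5.0, 2.0]), a, b)
```

After enough iterations the result satisfies both constraints to numerical precision; in the network setting, one projection enforces the data (clamped-neuron) constraint and the other the signal-subspace constraint.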
© (1989) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
R. J. Marks, S. Oh, L. E. Atlas, J. A. Ritcey, "Homogeneous And Layered Alternating Projection Neural Networks", Proc. SPIE 0960, Real-Time Signal Processing for Industrial Applications, (8 February 1989); doi: 10.1117/12.947804; https://doi.org/10.1117/12.947804