Paper
Removing and adding network connections with recursive-error-minimization equations
1 August 1990
Wayne E. Simon, Jeffrey R. Carter
Abstract
One of the key features of Recursive Error Minimization (REM) equations is the efficient computation of the second derivative of mean square error with respect to each connection. The approximate integration of this derivative provides an estimate of the effect of removing or adding connections. A network with a minimum number of connections can then be found for a specific learning task. This has two important consequences. First, the explanation of network decisions is much simpler with a minimal net. Second, the computational load is a function of the number of connections. Results are presented for learning the English alphabet and for a simpler task, learning the first seven letters of the alphabet.
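The pruning idea described in the abstract can be sketched with a second-derivative saliency criterion: if an estimate of the second derivative of mean square error with respect to each connection is available, connections whose removal is predicted to raise the error least can be deleted first. The saliency formula below (s_i = ½ h_ii w_i², the second-order Taylor term for zeroing weight w_i) and the function name are illustrative assumptions; the paper's REM equations obtain their estimate by approximate integration of the second derivative, which this sketch does not reproduce.

```python
import numpy as np

def prune_by_saliency(weights, hess_diag, n_remove):
    """Return a keep-mask over connections, dropping the n_remove
    connections with the smallest estimated error increase.

    Assumed saliency: s_i = 0.5 * h_ii * w_i**2, the second-order
    Taylor estimate of the rise in mean square error from setting
    connection i to zero (illustrative, not the REM estimate).
    """
    saliency = 0.5 * hess_diag * weights**2
    keep = np.ones(weights.size, dtype=bool)
    keep[np.argsort(saliency)[:n_remove]] = False
    return keep

# Toy example: four connections with per-connection second derivatives.
w = np.array([0.1, -2.0, 0.5, 0.01])
h = np.array([1.0, 1.0, 4.0, 100.0])
mask = prune_by_saliency(w, h, n_remove=2)
# Connections 0 and 3 have the lowest saliency and are removed.
```

A minimal network for a task would then be found by alternating such pruning with retraining until the error begins to rise.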
© (1990) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Wayne E. Simon and Jeffrey R. Carter "Removing and adding network connections with recursive-error-minimization equations", Proc. SPIE 1294, Applications of Artificial Neural Networks, (1 August 1990); https://doi.org/10.1117/12.21210
CITATIONS
Cited by 5 scholarly publications.
KEYWORDS
Artificial neural networks
Error analysis
Neural networks
Chlorine
Astronomical engineering
Fourier transforms
Switching