Symmetric table addition methods for neural network approximations
20 November 2001
Nihal Koc-Sahan, Jason Schlessman, Michael J. Schulte
Abstract
Symmetric table addition methods (STAMs) approximate functions by performing parallel table lookups, followed by multioperand addition. STAMs require significantly less memory than direct table lookups and are faster than piecewise linear approximations. This paper investigates the application of STAMs to the sigmoid function and its derivative, which are commonly used in artificial neural networks. Compared to direct table lookups, STAMs require between 23 and 41 times less memory for sigmoid and between 24 and 46 times less memory for sigmoid's derivative, when the input operand size is 16 bits and the output precision is 12 bits.
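As a rough illustration of the lookup-and-add structure the abstract describes, the sketch below builds a bipartite table approximation of the sigmoid in Python. The field widths, input range, and table construction here are illustrative assumptions, not the paper's 16-bit/12-bit configuration, and the correction table is stored unfolded rather than folded via the symmetry that gives STAMs their additional memory savings.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dsigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Hypothetical sizes for illustration: a 12-bit input split into three
# 4-bit fields x0 | x1 | x2, mapped onto the interval [0, 8).
N0, N1, N2 = 4, 4, 4
N = N0 + N1 + N2
ULP = 8.0 / (1 << N)               # value of one input lsb
D1 = ((1 << N1) - 1) / 2.0         # midpoint of the x1 field
D2 = ((1 << N2) - 1) / 2.0         # midpoint of the x2 field

# Table a0, indexed by (x0, x1): sigmoid with x2 frozen at its midpoint.
a0 = [[sigmoid(((x0 << N1 | x1) << N2) * ULP + D2 * ULP)
       for x1 in range(1 << N1)] for x0 in range(1 << N0)]

# Table a1, indexed by (x0, x2): a slope correction with x1 frozen at its
# midpoint.  A full STAM would halve this table by exploiting its
# symmetry about the x2 midpoint; this sketch stores it whole.
a1 = [[dsigmoid((x0 << (N1 + N2)) * ULP + (D1 * (1 << N2) + D2) * ULP)
       * (x2 - D2) * ULP
       for x2 in range(1 << N2)] for x0 in range(1 << N0)]

def stam_sigmoid(i):
    """Approximate sigmoid(i * ULP) with two parallel lookups and one add."""
    x0 = i >> (N1 + N2)
    x1 = (i >> N2) & ((1 << N1) - 1)
    x2 = i & ((1 << N2) - 1)
    return a0[x0][x1] + a1[x0][x2]

# Two 256-entry tables (512 values) stand in for a 4096-entry direct table.
max_err = max(abs(stam_sigmoid(i) - sigmoid(i * ULP)) for i in range(1 << N))
```

In hardware, the two lookups proceed in parallel and the final add is a single (multioperand, when more tables are used) addition, which is where the speed advantage over piecewise linear schemes with an explicit multiply comes from.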
© (2001) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Nihal Koc-Sahan, Jason Schlessman, and Michael J. Schulte "Symmetric table addition methods for neural network approximations", Proc. SPIE 4474, Advanced Signal Processing Algorithms, Architectures, and Implementations XI, (20 November 2001); https://doi.org/10.1117/12.448641
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS
Neural networks
Artificial neural networks
Binary data
Error analysis
Information technology
Algorithm development
Computer arithmetic