Efficient activation functions for the back-propagation neural network (1 March 1992)
The back-propagation algorithm is the most widely used algorithm in artificial neural network research. The standard activation (transfer) function is the logistic function s(x) = 1/(1 + exp(-x)). The derivative of this function is used in propagating the error signals for updating the weights of the network. The maximum value of this derivative is only 0.25, which yields slow convergence. A new family of activation functions is proposed, whose derivatives belong to the sech^n(x) family for n = 1, 2, .... The maximum value of the derivatives varies from 0.637 to 1.875 for n = 1 to 6, so a member of the activation-function family can be selected to suit the problem. Results of using this family of activation functions show orders-of-magnitude savings in computation. A discrete version of these functions is also proposed for efficient implementation. For the parity-8 problem with 16 hidden units, the new activation function f3 requires 300 epochs for learning, compared with 500,000 epochs for the standard activation function.
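The abstract does not reproduce the functions themselves, but activations whose derivatives are proportional to sech^n(x) can be reconstructed by integrating sech^n and rescaling so each function saturates at ±1. A minimal sketch, assuming that normalization (the names f1, f2, f3 and the scaling constants are this reconstruction's, not necessarily the paper's): the resulting derivative maxima at x = 0 are 2/π ≈ 0.637 for n = 1 and 4/π ≈ 1.273 for n = 3, consistent with the 0.637-1.875 range quoted above.

```python
import math

def sech(x):
    return 1.0 / math.cosh(x)

def gd(x):
    # Gudermannian function; d/dx gd(x) = sech(x)
    return 2.0 * math.atan(math.tanh(x / 2.0))

# Hypothetical reconstructions (not taken verbatim from the paper):
# antiderivatives of sech^n(x), scaled to saturate at +/-1.

def f1(x):
    # derivative: (2/pi) * sech(x); maximum 2/pi ~= 0.637 at x = 0
    return (2.0 / math.pi) * gd(x)

def f2(x):
    # derivative: sech^2(x); maximum 1.0 at x = 0
    return math.tanh(x)

def f3(x):
    # antiderivative of sech^3 is (sech*tanh + gd)/2; scaled by 4/pi
    # derivative: (4/pi) * sech^3(x); maximum 4/pi ~= 1.273 at x = 0
    return (2.0 / math.pi) * (sech(x) * math.tanh(x) + gd(x))

def logistic(x):
    # standard activation; derivative s(x)(1 - s(x)) peaks at only 0.25
    return 1.0 / (1.0 + math.exp(-x))
```

Because the derivative peak near the origin is several times larger than the logistic function's 0.25, the weight updates in back-propagation are correspondingly larger, which is the mechanism behind the faster convergence the abstract reports.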
© (1992) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Surender K. Kenue "Efficient activation functions for the back-propagation neural network", Proc. SPIE 1608, Intelligent Robots and Computer Vision X: Neural, Biological, and 3-D Methods, (1 March 1992);