Fault tolerance of neural networks with noisy training sets (1 July 1992)
It is well established that training backpropagation networks on noisy training sets improves the generalization capabilities of the network. Since noise in the input set is somewhat analogous to faults in the network, networks trained on noisy inputs should exhibit fault tolerance superior to that of similar networks trained on noise-free inputs. This paper presents the results of a study of the effect of noisy training sets on fault tolerance. Backpropagation was used to train three sets of networks on 7 × 7 numeral patterns: one control set trained on noiseless inputs, and two sets trained on different noisy cases. Several network examples were trained for each of the three cases (no noise, 10% noise, and 20% noise). The noise was injected into each training image uniformly at random, taking the form of toggled (0 to 1 and 1 to 0) pixel values in the binary input images. After learning was complete, the networks were tested for their tolerance to stuck-at-1 and stuck-at-0 element faults, as well as faults in the weight connections. The networks trained on noisy inputs exhibited substantially better fault tolerance than the network trained on noiseless inputs.
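The abstract describes two mechanisms: uniform random pixel toggling in the binary training images, and stuck-at / weight-connection fault injection at test time. The sketch below illustrates both under stated assumptions; the paper does not publish its code, so the function names, array representations, and the choice of NumPy are this sketch's own, not the author's implementation.

```python
import numpy as np

def add_toggle_noise(image, noise_fraction, rng=None):
    """Toggle (0 -> 1, 1 -> 0) a uniformly random subset of pixels in a
    binary image, e.g. noise_fraction=0.10 or 0.20 as in the paper's
    10% and 20% noise cases."""
    rng = rng if rng is not None else np.random.default_rng()
    flat = image.ravel().copy()
    n_toggle = int(round(noise_fraction * flat.size))
    idx = rng.choice(flat.size, size=n_toggle, replace=False)
    flat[idx] = 1 - flat[idx]          # flip the selected pixels
    return flat.reshape(image.shape)

def inject_stuck_at(activations, unit, value):
    """Model a stuck-at-0 (value=0.0) or stuck-at-1 (value=1.0) fault on
    one processing element by clamping its output."""
    faulty = activations.copy()
    faulty[unit] = value
    return faulty

def inject_weight_fault(weights, i, j, value=0.0):
    """Model a faulted weight connection by overwriting a single weight."""
    faulty = weights.copy()
    faulty[i, j] = value
    return faulty

# Example: a 7 x 7 binary numeral pattern with 10% of its 49 pixels toggled.
clean = np.zeros((7, 7), dtype=int)
noisy = add_toggle_noise(clean, 0.10)
```

Fault tolerance would then be measured by comparing classification accuracy of the trained network before and after applying `inject_stuck_at` or `inject_weight_fault` to each element or connection in turn.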
© 1992 Society of Photo-Optical Instrumentation Engineers (SPIE).
Jay I. Minnix "Fault tolerance of neural networks with noisy training sets", Proc. SPIE 1710, Science of Artificial Neural Networks, (1 July 1992);

