The approximation capabilities of two different four-layered neural networks are studied. First, a network trained with the backpropagation algorithm is analyzed; its error surface, convergence properties, and network design are considered. An alternative to the backpropagation approach is then presented: a network constructed with a one-pass algorithm. We show that the proposed network can correctly classify N different patterns with 4?N?3 hidden units. We also show that an arbitrarily small approximation error can be obtained for this network by adjusting the appropriate parameters.