We compare computer simulation results of optical correlator performance using synthetic discriminant-function filters encoded in binary phase versus ternary phase and amplitude for distortion-tolerant pattern recognition. We examine two different ternary filter formulations designed to enhance discrimination and SNR. The simulated scenario involves very similar in-class and out-of-class images, which makes discrimination between the two sets difficult. The ultimate performance criterion of interest is the probability of correct identification in the presence of image noise, which we evaluate as a function of the range of distortion tolerance offered by the filters. We find that the ternary filters offer improved system performance and a greater achievable distortion range than the binary filters, and in particular that the ternary filters with a region of support designed to enhance SNR perform best for the image sets studied here. Knowledge of the out-of-class images allows the filter designer to create filters that maximize the probability of correct identification. We present numerical examples of this performance figure for two sets of training images.
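To make the two encodings concrete, the sketch below contrasts a binary phase-only filter (the sign of the real part of the filter spectrum) with a ternary phase-and-amplitude filter, which additionally zeros frequencies outside a region of support. This is an illustrative construction, not the paper's exact filter design: the filter spectrum `H` here is simply the conjugate spectrum of a toy reference image, and the magnitude-threshold rule `rho` for the region of support is an assumption for demonstration.

```python
import numpy as np

def bpof(H):
    """Binary phase-only filter: quantize each frequency sample of the
    filter spectrum H to +1 or -1 by the sign of its real part."""
    return np.where(H.real >= 0, 1.0, -1.0)

def ternary_paf(H, rho=0.2):
    """Ternary phase-and-amplitude filter: binary phase (+1/-1) inside a
    region of support, zero outside it. Here the support keeps only
    frequencies whose magnitude exceeds rho * max|H| (an illustrative
    threshold rule; designs aimed at SNR would choose the support to
    optimize that metric)."""
    support = np.abs(H) >= rho * np.abs(H).max()
    return np.where(support, np.where(H.real >= 0, 1.0, -1.0), 0.0)

# Toy reference image and a matched-filter-style spectrum H = conj(FFT(image))
rng = np.random.default_rng(0)
img = rng.random((8, 8))
H = np.conj(np.fft.fft2(img))

B = bpof(H)          # values drawn from {-1, +1}
T = ternary_paf(H)   # values drawn from {-1, 0, +1}
```

The zero state gives the ternary filter its extra design freedom: frequencies that contribute mainly noise, or that are shared with out-of-class images, can be suppressed outright rather than being forced to pass with unit amplitude.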