Distortion-invariant pattern recognition in optical correlators is
explored using a linear vector space formalism. Special attention is given
to filter design and to methods for reducing the clutter problem. Three
constraints are shown to play an important role in the filter's
performance: (1) maximization of the energy projected from the training set
onto the filter, (2) the use of bipolar filters, and (3) the use of
higher-dimensional filters. An example algorithm for constructing a
distortion-invariant filter based on these constraints is also given.
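As a rough illustration of constraints (1) and (2), not the paper's own algorithm: if each training image is vectorized, the unit-norm filter that maximizes the average energy the training set projects onto it is the dominant eigenvector of the training correlation matrix, and quantizing its sign yields a bipolar (+1/-1) filter. The training data here is synthetic and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "training set": N vectorized images (one per row), standing in for
# distorted views of a target.
N, d = 8, 64
X = rng.normal(size=(N, d))

# Constraint (1): choose a filter h maximizing the average projected
# energy (1/N) * sum_i |x_i . h|^2 subject to ||h|| = 1. The optimum is
# the dominant eigenvector of the correlation matrix R = X^T X / N.
R = (X.T @ X) / N
eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
h = eigvecs[:, -1]                     # eigenvector of largest eigenvalue

# Constraint (2): quantize to a bipolar (+1/-1) filter, the form realizable
# on a binary-phase spatial light modulator.
h_bipolar = np.sign(h)
h_bipolar[h_bipolar == 0] = 1.0

def avg_projected_energy(f):
    """Mean squared projection of the training vectors onto unit-norm f."""
    f = f / np.linalg.norm(f)
    return np.mean((X @ f) ** 2)
```

For the optimal `h`, the average projected energy equals the largest eigenvalue of `R`, and any other unit-norm filter (including the bipolar quantization) projects no more energy than that bound.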