The general approach to pattern recognition includes a feature-extraction step (forming linear or nonlinear combinations of the input data, such as iconic pixel gray-scale values); the resultant features are then fed to a classifier. Standard feature-extraction methods include the Fisher discriminant, Fukunaga-Koontz (FK) features, and Karhunen-Loeve (KL) or principal component analysis (PCA) features. The Fisher linear discriminant and standard discriminant analysis yield fewer discriminant feature vectors than the number of classes (for two classes, the Fisher discriminant provides only one feature vector). In addition, the discriminant vectors are not guaranteed to be orthogonal when the number of classes is more than two. Orthonormal discriminant vector (ODV) techniques have been suggested that use the Gram-Schmidt orthogonalization procedure to iteratively yield orthonormal vectors, each of which maximizes the Fisher criterion at its iteration. In Section 11.2, we advance new optimal discriminant function (ODF) features that are preferable because they do not assume Gaussian data distributions and can handle the case in which one class has multiple clusters in feature space; we find these new features to be more robust. We have shown that the ODF reduces to the Fisher linear discriminant when the classes are Gaussian distributed.
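To make the two-class baseline concrete, the following is a minimal sketch (not the chapter's ODF method) of the standard Fisher linear discriminant, which for two classes yields the single direction w proportional to S_w^{-1}(m_1 - m_2); the synthetic Gaussian data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes (illustrative data, not from the text).
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
X2 = rng.normal(loc=[3.0, 1.0], scale=1.0, size=(100, 2))

# Class means and the within-class scatter matrix S_w.
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Fisher linear discriminant: w is proportional to S_w^{-1}(m1 - m2).
w = np.linalg.solve(Sw, m1 - m2)
w /= np.linalg.norm(w)

# Projecting onto w gives the single discriminant feature;
# the two classes separate along this one direction.
p1, p2 = X1 @ w, X2 @ w
```

Note that this construction produces exactly one feature vector for two classes, which is the limitation the ODF features of Section 11.2 are designed to overcome.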