Chapter 11: Nonlinear Features for Improved Pattern Recognition
The general approach to pattern recognition includes a feature-extraction step, in which linear or nonlinear combinations of the input data (such as iconic pixel gray-scale values) are formed. The resultant features are then fed to a classifier. Standard feature-extraction methods include the Fisher discriminant, Fukunaga-Koontz (FK) features, and Karhunen-Loeve (KL) or principal component analysis (PCA) features. The Fisher linear discriminant and standard discriminant analysis yield fewer discriminant feature vectors than the number of classes (for two classes, the Fisher discriminant provides only one feature vector). In addition, when more than two classes are present, the discriminant vectors are not guaranteed to be orthogonal. Orthonormal discriminant vector (ODV) techniques have been suggested that use the Gram-Schmidt orthogonalization procedure to iteratively produce orthonormal vectors, each of which maximizes the Fisher criterion. In Section 11.2, we advance new optimal discriminant function (ODF) features that are preferable because they do not assume Gaussian data distributions and can handle the case in which one class has multiple clusters in feature space; these new features are found to be more robust. We have shown that the ODF reduces to the Fisher linear discriminant when the classes are Gaussian distributed.
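To make the baseline concrete, the following is a minimal NumPy sketch of the two-class Fisher linear discriminant referenced above: it computes the within-class scatter matrix, solves for the single discriminant direction, and classifies by thresholding the 1-D projection. The synthetic class means, sample sizes, and threshold rule are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Illustrative synthetic two-class data (assumed parameters, not from the text)
rng = np.random.default_rng(0)
class0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
class1 = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(100, 2))

mu0, mu1 = class0.mean(axis=0), class1.mean(axis=0)

# Within-class scatter matrix S_w: sum of the two classes' scatter contributions
Sw = (class0 - mu0).T @ (class0 - mu0) + (class1 - mu1).T @ (class1 - mu1)

# Fisher direction w ∝ S_w^{-1} (mu1 - mu0); for two classes this is the
# single discriminant vector the text mentions
w = np.linalg.solve(Sw, mu1 - mu0)
w /= np.linalg.norm(w)

# Project onto w and classify with a midpoint threshold on the 1-D feature
proj0, proj1 = class0 @ w, class1 @ w
threshold = 0.5 * (proj0.mean() + proj1.mean())
accuracy = (np.sum(proj0 < threshold) + np.sum(proj1 >= threshold)) / 200.0
```

Note that a single projection vector suffices here only because there are two classes; for C classes, standard discriminant analysis yields at most C - 1 such vectors, which motivates the ODV and ODF alternatives discussed in the chapter.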