Information-theoretic feature extraction and selection for robust classification
19 March 2009
Abstract
Classification performance in recognition tasks can be improved by selecting highly discriminative features from a low-dimensional linear representation of the data. High-dimensional multivariate data can be represented in lower dimensions by unsupervised feature extraction techniques, which attempt to remove redundancy in the data and/or resolve multivariate prediction problems. However, these extracted low-dimensional features may not ensure good class discrimination; supervised feature selection methods motivated by information-theoretic approaches can therefore improve recognition performance with fewer features. The proposed hybrid feature selection methods efficiently select features with higher class discrimination than feature-class mutual information (MI), the Fisher criterion, or unsupervised selection by variance, resulting in much improved recognition performance. The feature-class MI criterion and the hybrid feature selection methods are computationally scalable and are optimal selectors for statistically independent features.
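The feature-class MI criterion mentioned in the abstract can be sketched roughly as follows: estimate the mutual information between each extracted feature and the class labels, then keep the highest-scoring features. This is a minimal illustration of that general idea, not the paper's actual method; the histogram-based MI estimator, the function names, and the toy data are all assumptions for this sketch.

```python
import numpy as np

def feature_class_mi(x, y, bins=8):
    # Estimate I(X; C) between a continuous feature x and discrete class
    # labels y by discretizing x into histogram bins (a crude plug-in
    # estimator, used here only for illustration).
    edges = np.histogram_bin_edges(x, bins=bins)
    xd = np.digitize(x, edges[1:-1])
    mi = 0.0
    for b in np.unique(xd):
        px = np.mean(xd == b)
        for c in np.unique(y):
            py = np.mean(y == c)
            pxy = np.mean((xd == b) & (y == c))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def select_by_mi(X, y, k):
    # Rank features by feature-class MI and keep the indices of the top k.
    scores = np.array([feature_class_mi(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Toy data: feature 0 carries class information, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=500)
X = np.column_stack([y + 0.3 * rng.standard_normal(500),
                     rng.standard_normal(500)])
selected = select_by_mi(X, y, k=1)
print(selected)  # the informative feature 0 should rank first
```

Under the independence assumption noted in the abstract, ranking features by individual feature-class MI like this is optimal, since the joint information decomposes over statistically independent features.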
© (2009) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Chandra Shekhar Dhir, Soo Young Lee, "Information-theoretic feature extraction and selection for robust classification", Proc. SPIE 7343, Independent Component Analyses, Wavelets, Neural Networks, Biosystems, and Nanoengineering VII, 73430H (19 March 2009); https://doi.org/10.1117/12.822569
Proceedings, 12 pages

