Comparative study of feature selection with ensemble learning using SOM variants
17 March 2017
Proceedings Volume 10341, Ninth International Conference on Machine Vision (ICMV 2016); 103410Z (2017) https://doi.org/10.1117/12.2268538
Event: Ninth International Conference on Machine Vision, 2016, Nice, France
Abstract
Ensemble learning has succeeded in improving stability and clustering accuracy, but its runtime prevents it from scaling up to real-world applications. This study addresses the problem of selecting, for every cluster, a subset of the most pertinent features from a dataset. The proposed method is another extension of the Random Forests approach, using self-organizing map (SOM) variants on unlabeled data to estimate out-of-bag feature importance from a set of partitions. Each partition is created using a different bootstrap sample and a random subset of the features. We then show that the internal estimates used to measure variable importance in Random Forests are also applicable to feature selection in unsupervised learning. The approach targets dimensionality reduction, visualization, and cluster characterization at the same time. We provide empirical results on nineteen benchmark data sets indicating that RFS can lead to significant improvements in clustering accuracy over several state-of-the-art unsupervised methods, while using a very limited subset of features. The approach shows promise for very broad domains.
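The paradigm the abstract describes (an ensemble of clusterings, each built on a bootstrap sample and a random feature subset, with Random-Forest-style out-of-bag importance estimates) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: a tiny k-means stands in for the SOM variants, quantization error replaces the paper's quality measure, and all function names and parameters are ours.

```python
# Hedged sketch: unsupervised, Random-Forest-style feature importance from an
# ensemble of clusterings. Assumptions: k-means stands in for a SOM variant,
# and quantization error is the quality measure; names are illustrative.
import random

def dist2(a, b, feats):
    """Squared Euclidean distance restricted to the selected features."""
    return sum((a[f] - b[f]) ** 2 for f in feats)

def kmeans(points, feats, k, iters=15, rng=None):
    """Minimal k-means over the chosen feature subset (SOM stand-in)."""
    rng = rng or random.Random(0)
    centers = [list(p) for p in rng.sample(points, k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist2(p, centers[i], feats))].append(p)
        for i, g in enumerate(groups):
            if g:  # keep old center if its group emptied out
                centers[i] = [sum(p[f] for p in g) / len(g)
                              for f in range(len(centers[i]))]
    return centers

def qerror(points, centers, feats):
    """Quantization error: summed squared distance to the nearest center."""
    return sum(min(dist2(p, c, feats) for c in centers) for p in points)

def oob_importance(data, n_members=30, k=2, m=2, seed=1):
    """Accumulate, per feature, the OOB error increase caused by permuting it."""
    rng = random.Random(seed)
    d = len(data[0])
    imp = [0.0] * d
    for _ in range(n_members):
        drawn = [rng.randrange(len(data)) for _ in data]   # bootstrap sample
        boot = [data[i] for i in drawn]
        oob = [p for j, p in enumerate(data) if j not in set(drawn)]
        if len(oob) < 2:
            continue
        feats = rng.sample(range(d), m)                    # random feature subset
        centers = kmeans(boot, feats, k, rng=rng)
        base = qerror(oob, centers, feats)
        for f in feats:
            vals = [p[f] for p in oob]
            rng.shuffle(vals)                              # permute one feature
            permuted = [p[:f] + [v] + p[f + 1:] for p, v in zip(oob, vals)]
            imp[f] += qerror(permuted, centers, feats) - base
    return imp
```

On synthetic data where two features separate the clusters and a third is uniform noise, permuting an informative feature on the out-of-bag points inflates the quantization error far more than permuting the noise feature, so the accumulated importance ranks the informative features first; this is the same internal-estimate idea the abstract carries over from supervised Random Forests.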
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Ameni Filali, Chiraz Jlassi, Najet Arous, "Comparative study of feature selection with ensemble learning using SOM variants", Proc. SPIE 10341, Ninth International Conference on Machine Vision (ICMV 2016), 103410Z (17 March 2017); doi: 10.1117/12.2268538; https://doi.org/10.1117/12.2268538
PROCEEDINGS
5 PAGES

