Paper · 21 March 2001
Cross-validation in fuzzy ARTMAP neural networks for large sample classification problems
Abstract
In this paper we examine the issue of overtraining in Fuzzy ARTMAP. Overtraining in Fuzzy ARTMAP manifests itself in two different ways: (a) it degrades the generalization performance of Fuzzy ARTMAP as training progresses, and (b) it creates unnecessarily large Fuzzy ARTMAP neural network architectures. In this work we demonstrate that overtraining happens in Fuzzy ARTMAP, and we propose a well-established remedy: cross-validation. In our experiments we compare the performance of Fuzzy ARTMAP trained (i) until the completion of training, (ii) for one epoch, and (iii) until its performance on a validation set is maximized. The experiments were performed on artificial and real databases. The conclusion drawn from these experiments is that cross-validation is a useful procedure in Fuzzy ARTMAP, because it produces smaller Fuzzy ARTMAP architectures with improved generalization performance. The trade-off is that cross-validation introduces additional computational complexity into the training phase of Fuzzy ARTMAP.
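The validation-based stopping rule described above (option iii) can be sketched roughly as follows. This is a minimal illustration under assumed names, not the authors' implementation: it presumes an incremental classifier exposing hypothetical train_epoch and score methods, and simply keeps the network snapshot with the best validation accuracy.

```python
# Minimal sketch of validation-based stopping (option iii above).
# "model" is assumed to be an incremental classifier with hypothetical
# train_epoch(X, y) and score(X, y) methods; this is NOT the authors' Fuzzy ARTMAP code.
import copy

def train_with_validation_stopping(model, X_train, y_train, X_val, y_val, max_epochs=100):
    """Train one epoch at a time and keep the snapshot with the best validation accuracy."""
    best_model, best_acc = copy.deepcopy(model), float("-inf")
    for _ in range(max_epochs):
        model.train_epoch(X_train, y_train)    # one full presentation of the training set
        acc = model.score(X_val, y_val)        # accuracy on the held-out validation set
        if acc > best_acc:                     # validation performance improved:
            best_acc = acc                     # record it and
            best_model = copy.deepcopy(model)  # snapshot the smaller, better-generalizing network
    return best_model, best_acc
```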
© (2001) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Michael Georgiopoulos, Anna Koufakou, Georgios C. Anagnostopoulos, and Takis Kasparis "Cross-validation in fuzzy ARTMAP neural networks for large sample classification problems", Proc. SPIE 4390, Applications and Science of Computational Intelligence IV, (21 March 2001); https://doi.org/10.1117/12.421155
PROCEEDINGS
11 PAGES