Paper
Some new competitive learning schemes
James C. Bezdek, Nikhil R. Pal, Richard J. Hathaway, Nicolaos B. Karayiannis
6 April 1995
Abstract
First, we identify an algorithmic defect of the generalized learning vector quantization (GLVQ) scheme that causes it to behave erratically for a certain scaling of the input data. We demonstrate the problem using the IRIS data. Then, we show that GLVQ can behave incorrectly because its learning rates are reciprocally dependent on the sum of squares of distances from an input vector to the node weight vectors. Finally, we propose a new family of models -- the GLVQ-F family -- that remedies the problem. We derive algorithms for competitive learning using the GLVQ-F model, and prove that they are invariant to all positive scalings of the data. The learning rule for GLVQ-F updates all nodes using a learning rate function which is inversely proportional to their distance from the input data point. We illustrate the failure of GLVQ and success of GLVQ-F with the ubiquitous IRIS data.
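The abstract describes the GLVQ-F update rule only qualitatively: every node moves toward the input, weighted by a factor inversely related to its distance from the input, and the resulting algorithm is invariant to positive scalings of the data. The sketch below illustrates one such update using fuzzy-c-means-style memberships as the distance-dependent weights; this particular membership formula, the fuzzifier `m`, and the step size `lr` are assumptions for illustration, not the paper's exact learning rates.

```python
import numpy as np

def glvq_f_style_update(weights, x, lr=0.1, m=2.0, eps=1e-12):
    """One competitive-learning step in the spirit of GLVQ-F (illustrative sketch).

    Every node weight vector moves toward the input x, scaled by a
    membership u_i that is inversely related to the node's distance from x.
    The membership form below is borrowed from fuzzy c-means and is an
    assumption, not the published GLVQ-F learning rate.
    """
    d2 = np.sum((weights - x) ** 2, axis=1) + eps        # squared distances to x
    # u_i = 1 / sum_j (d_i^2 / d_j^2)^(1/(m-1)); memberships sum to 1
    ratio = (d2[:, None] / d2[None, :]) ** (1.0 / (m - 1.0))
    u = 1.0 / ratio.sum(axis=1)
    # Move all nodes toward x, closer nodes (larger u_i) moving more.
    return weights + lr * u[:, None] * (x - weights)
```

Because the memberships depend only on ratios of squared distances, rescaling both the data and the weights by any positive constant leaves `u` unchanged, which is the scale-invariance property the abstract claims for GLVQ-F (and which GLVQ lacks).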
© (1995) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
James C. Bezdek, Nikhil R. Pal, Richard J. Hathaway, and Nicolaos B. Karayiannis "Some new competitive learning schemes", Proc. SPIE 2492, Applications and Science of Artificial Neural Networks, (6 April 1995); https://doi.org/10.1117/12.205158
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS
Prototyping, IRIS Consortium, Quantization, Radon, Data modeling, Fuzzy logic, Lithium