1 March 1998 Comparing the computational complexity of the PNN, the PDM, and the MMNN (M2N2)
Proceedings Volume 3240, 26th AIPR Workshop: Exploiting New Image Sources and Sensors; (1998) https://doi.org/10.1117/12.300049
Event: 26th AIPR Workshop: Exploiting New Image Sources and Sensors, 1997, Washington, DC, United States
Abstract
In classification, the goal is to assign an input vector to one of a discrete set of output classes. Classifier design has a long history, and classifiers have been put to a large number of uses. In this paper we continue the task, begun in earlier work, of categorizing classifiers by their computational complexity. In particular, we derive analytical formulas for the number of arithmetic operations in the probabilistic neural network (PNN), its polynomial expansion, also known as the polynomial discriminant method (PDM), and the mixture model neural network (M2N2). In addition, we test the classification accuracy of the PDM against the PNN and the M2N2 and find that all three are close in accuracy. Based on this research we can now choose among these methods on the basis of computational complexity, memory requirements, and training-set size, which is a great advantage in an operational environment. We also discuss the extension of such methods to hyperspectral data and find that only the M2N2 is suitable for application to such data.
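The paper's operation-count formulas are not reproduced in this abstract, but the reason the PNN's complexity matters can be illustrated with a minimal sketch of a standard Parzen-window PNN (Specht-style; this formulation is an assumption, not taken from the paper): each query requires a distance computation and an exponential for every stored training vector, so the per-query cost grows with both training-set size and input dimensionality, which is why such methods become expensive on hyperspectral data.

```python
import numpy as np

def pnn_classify(x, train_by_class, sigma=1.0):
    """Assumed Parzen-window PNN: score each class by a sum of Gaussian
    kernels centred on its training vectors, then pick the argmax.

    Per query this costs O(N * d) multiplications plus N exponentials,
    where N is the total number of stored training vectors and d the
    input dimension -- the kind of count the paper derives analytically.
    """
    scores = []
    for samples in train_by_class:  # one array of shape (n_i, d) per class
        d2 = np.sum((samples - x) ** 2, axis=1)          # squared distances
        k = np.exp(-d2 / (2.0 * sigma ** 2))             # Gaussian kernels
        scores.append(k.sum() / len(samples))            # class likelihood
    return int(np.argmax(scores))
```

For example, with two well-separated two-dimensional classes, a query near the first class's training vectors is assigned label 0; every stored vector is visited on every call, unlike the fixed-size models (PDM, M2N2) the paper compares it against.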
© (1998) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Samir R. Chettri, Yoshimichi Murakami, Isamu Nagano, Jerry Garegnani, "Comparing the computational complexity of the PNN, the PDM, and the MMNN (M2N2)", Proc. SPIE 3240, 26th AIPR Workshop: Exploiting New Image Sources and Sensors, (1 March 1998); doi: 10.1117/12.300049; https://doi.org/10.1117/12.300049
PROCEEDINGS
7 PAGES