Use of localized gating in mixture of experts networks
25 March 1998
Viswanath Ramamurti, Joydeep Ghosh
The mixture-of-experts (MOE) model is a popular architecture for function approximation. In the standard architecture, the experts are combined via a softmax gating function, so the domain of application of each expert is not well localized. This paper summarizes several recent results showing the advantages of using localized gating instead. These include a natural framework for model selection and adaptation by growing and shrinking the number of experts, modeling of non-stationary environments, improved generalization performance, and confidence intervals on network outputs. These results substantially increase the scope and power of MOE networks. Several simulation results are presented to support the theoretical arguments.
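To illustrate the contrast the abstract draws, the following is a minimal sketch of gate-weighted mixing, assuming normalized Gaussian kernels as the localized gate (a common choice for localized gating, not necessarily the authors' exact formulation); the expert functions and kernel width `sigma` are hypothetical placeholders.

```python
import numpy as np

def softmax_gating(x, W):
    """Standard MoE gate: softmax over linear scores of the input.
    Each gate varies smoothly everywhere, so experts are not localized."""
    scores = W @ x
    e = np.exp(scores - scores.max())  # subtract max for numerical stability
    return e / e.sum()

def localized_gating(x, centers, sigma=1.0):
    """Localized gate: normalized Gaussian kernels around expert centers,
    so each expert dominates only near its own region of input space."""
    d2 = ((centers - x) ** 2).sum(axis=1)
    k = np.exp(-d2 / (2.0 * sigma ** 2))
    return k / k.sum()

def moe_output(x, gates, expert_fns):
    """Mixture output: gate-weighted sum of the expert predictions."""
    return sum(g * f(x) for g, f in zip(gates, expert_fns))

# Two hypothetical one-dimensional linear experts with nearby centers.
experts = [lambda x: 2.0 * x[0], lambda x: -x[0] + 3.0]
centers = np.array([[-1.0], [1.0]])

x = np.array([1.0])
g = localized_gating(x, centers, sigma=0.5)  # concentrates on expert 2
y = moe_output(x, g, experts)
```

With a small `sigma`, an input near one expert's center receives nearly all of that expert's gate mass, which is what makes growing or shrinking the number of experts a local operation.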
© 1998 Society of Photo-Optical Instrumentation Engineers (SPIE).
Viswanath Ramamurti and Joydeep Ghosh, "Use of localized gating in mixture of experts networks", Proc. SPIE 3390, Applications and Science of Computational Intelligence, 25 March 1998; https://doi.org/10.1117/12.304812
