Mixtures of Experts, Using Gaussian Mixture Models for the Gate

This code implements a mixture of experts using a Gaussian mixture model for the gate.
689 Downloads
Updated 11 Nov 2014


This code implements a mixture of experts using a Gaussian mixture model for the gate. The main advantage of this approach is that the gate can be trained with the standard expectation-maximization (EM) algorithm, i.e. a single-loop EM, because the Gaussian mixture gate has closed-form update equations. Other methods use the softmax function for the gate, which has no analytically closed-form solution and therefore requires generalized expectation maximization (GEM), or double-loop EM. The problems with GEM are that it requires extra computation and that the step size must be chosen carefully to guarantee convergence of the inner loop. I used k-means clustering for initialization, but found only a small improvement from it. If you have any questions or recommendations, contact me.
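The MATLAB source is in the download; purely as an illustration of the single-loop EM idea above, here is a minimal Python/NumPy sketch with linear experts and an isotropic Gaussian-mixture gate. All names, the isotropic covariance, and the one-step k-means-style initialization are my assumptions, not the submission's code. Note that both the gate and the experts get closed-form M-step updates, which is exactly what the softmax gate lacks:

```python
import numpy as np

def gaussian_pdf(X, mu, var):
    # Isotropic Gaussian density for one gate component.
    d = X.shape[1]
    diff = X - mu
    return np.exp(-0.5 * np.sum(diff**2, axis=1) / var) / ((2 * np.pi * var) ** (d / 2))

def moe_gmm_gate_em(X, y, K=2, iters=100, seed=0):
    """Mixture of K linear experts with a Gaussian-mixture gate over inputs.
    Single-loop EM: gate (pi, mu, var) and experts (W, s2) both have
    closed-form M-steps, so no inner GEM loop or step size is needed."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])          # bias column for the experts
    # Init: hard-assign each point to the nearest of K random centers
    # (a crude stand-in for the k-means initialization mentioned above).
    centers = X[rng.choice(n, K, replace=False)]
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    R = np.eye(K)[np.argmin(d2, axis=1)]          # responsibilities, n x K
    pi = np.full(K, 1.0 / K)
    mu = centers.copy()
    var = np.full(K, X.var() + 1e-6)
    W = np.zeros((K, d + 1))
    s2 = np.ones(K)
    for _ in range(iters):
        # M-step for the gate: ordinary GMM updates, closed form.
        Nk = R.sum(axis=0) + 1e-12
        pi = Nk / n
        mu = (R.T @ X) / Nk[:, None]
        for k in range(K):
            var[k] = max((R[:, k] * np.sum((X - mu[k]) ** 2, axis=1)).sum()
                         / (Nk[k] * d), 1e-6)
            # M-step for expert k: responsibility-weighted least squares.
            sw = np.sqrt(R[:, k])[:, None]
            W[k] = np.linalg.lstsq(sw * Xb, sw.ravel() * y, rcond=None)[0]
            resid = y - Xb @ W[k]
            s2[k] = max((R[:, k] * resid**2).sum() / Nk[k], 1e-6)
        # E-step: gate prior times expert likelihood, then normalize.
        L = np.zeros((n, K))
        for k in range(K):
            gate_k = pi[k] * gaussian_pdf(X, mu[k], var[k])
            lik_k = (np.exp(-0.5 * (y - Xb @ W[k]) ** 2 / s2[k])
                     / np.sqrt(2 * np.pi * s2[k]))
            L[:, k] = gate_k * lik_k
        R = L / (L.sum(axis=1, keepdims=True) + 1e-300)
    return pi, mu, var, W, R
```

On piecewise-linear data with two regimes, the gate Gaussians settle on the two input regions and each expert fits one linear piece; prediction is the gate-weighted average of the expert outputs.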

Cite As

Joseph Santarcangelo (2024). Mixtures of Experts, Using Gaussian Mixture Models for the Gate (https://www.mathworks.com/matlabcentral/fileexchange/48367-mixtures-of-experts-using-gaussian-mixture-models-for-the-gate), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2008a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Statistics and Machine Learning Toolbox

Version Published Release Notes
1.2.0.0

Didn't upload last time.

1.1.0.0

Fixed an error in the first version and improved the documentation.

1.0.0.0