This function obtains a maximum likelihood estimate of a Gaussian mixture model (GMM) via the expectation-maximization (EM) algorithm.
It works on data sets of arbitrary dimension. Several techniques are applied to avoid the floating-point underflow problems that often occur when computing probabilities of high-dimensional data. The code is also carefully tuned for efficiency through vectorization and matrix factorization.
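As an illustration of the kind of technique meant here (a sketch, not necessarily the exact code in this submission), the component densities can be evaluated in the log domain using a Cholesky factorization of the covariance, so that tiny probabilities are never formed in the linear domain:

% Sketch: vectorized log density of d-by-n data X under N(mu, Sigma),
% using a Cholesky factor so the density is never exponentiated prematurely.
function y = logGaussPdf(X, mu, Sigma)
    d = size(X,1);
    U = chol(Sigma);                          % Sigma = U'*U, U upper triangular
    Q = U'\bsxfun(@minus, X, mu);             % whitened, centered data (d-by-n)
    q = sum(Q.^2, 1);                         % squared Mahalanobis distances
    c = d*log(2*pi) + 2*sum(log(diag(U)));    % log normalization constant
    y = -0.5*(c + q);                         % 1-by-n vector of log densities
end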
This is a widely used algorithm; its details can be found in many textbooks and online tutorials. Search for "EM Gaussian mixture" or read the Wikipedia page:
http://en.wikipedia.org/wiki/Expectation-maximization_algorithm
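For orientation, one EM iteration for a GMM looks roughly like the following (a sketch under the same assumptions as the helper above; the submission's emgm may differ in detail). X is d-by-n data, w the 1-by-k mixing weights, mu the d-by-k means, Sigma the d-by-d-by-k covariances:

[d, n] = size(X);  k = numel(w);  logRho = zeros(n, k);
% E-step: responsibilities via log-sum-exp so small probabilities do not underflow
for j = 1:k
    logRho(:,j) = log(w(j)) + logGaussPdf(X, mu(:,j), Sigma(:,:,j))';
end
T = max(logRho, [], 2);                              % per-point maximum
logR = bsxfun(@minus, logRho, T + log(sum(exp(bsxfun(@minus, logRho, T)), 2)));
R = exp(logR);                                       % n-by-k responsibilities
% M-step: re-estimate weights, means and covariances from the responsibilities
nk = sum(R, 1);
w  = nk/n;
mu = bsxfun(@rdivide, X*R, nk);
for j = 1:k
    Xc = bsxfun(@minus, X, mu(:,j));
    Sigma(:,:,j) = (bsxfun(@times, Xc, R(:,j)')*Xc')/nk(j) + 1e-6*eye(d);  % small ridge for stability
end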
This function is robust and efficient, and the code is organized so that it is easy to read.
example:
load data;
label = emgm(x,3);
spread(x,label);
Besides using EM to fit a GMM, I highly recommend trying another submission of mine, Variational Bayesian Inference for Gaussian Mixture Model
(http://www.mathworks.com/matlabcentral/fileexchange/35362-variational-bayesian-inference-for-gaussian-mixture-model), which performs Bayesian inference on a GMM. It has the advantage that the number of mixture components is identified automatically by the algorithm.
For all questions regarding using the code for image segmentation: you have to organize the image into a matrix where each column is the feature vector of one pixel. For example, if RGB values are used, a 10x10 image gives a 3x100 data matrix where each column is the RGB vector of one pixel.
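For instance, an RGB image stored in a variable img of size h-by-w-by-3 (hypothetical variable names) can be rearranged into such a matrix with something like:

% Sketch: turn an h-by-w-by-3 RGB image into a 3-by-(h*w) data matrix,
% one column per pixel, as expected by emgm.
[h, w, c] = size(img);
x = reshape(double(img), h*w, c)';     % c-by-(h*w), each column is one pixel
label = emgm(x, 3);                    % cluster pixels into 3 components
seg = reshape(label, h, w);            % map labels back to the image layout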