This class can be used to infer the sequence of hidden states that most likely generated N consecutive observations in a Hidden Markov Model.
The first method is "maxLikelihood", where the number of hidden states is limited to two, because exhaustive path enumeration grows exponentially with the number of observations and quickly becomes intractable for larger state spaces. The second method is "viterbi", which implements the Viterbi algorithm to reduce the search space; in this case, the number of hidden states can be more than two.
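The class itself is written in MATLAB. Purely as an illustration of the underlying idea (a sketch, not the author's implementation), the brute-force search behind a method like maxLikelihood can be written in Python: a hypothetical max_likelihood function scores every one of the K^N possible state paths (K states, N observations), which is exactly why the exhaustive method is kept to two states.

```python
import itertools


def max_likelihood(obs, A, B, prior):
    """Brute-force HMM decoding: score every possible hidden-state path.

    obs   : list of observation symbols (0-based column indices into B)
    A     : A[i][j] = P(state j at t+1 | state i at t)
    B     : B[i][k] = P(observing symbol k | state i)
    prior : prior[i] = P(initial state is i)
    There are len(prior)**len(obs) candidate paths, so this is only
    practical for very small problems.
    """
    n_states = len(prior)
    best_path, best_p = None, -1.0
    for path in itertools.product(range(n_states), repeat=len(obs)):
        # Joint probability of this path and the observations.
        p = prior[path[0]] * B[path[0]][obs[0]]
        for t in range(1, len(obs)):
            p *= A[path[t - 1]][path[t]] * B[path[t]][obs[t]]
        if p > best_p:
            best_path, best_p = path, p
    return best_path, best_p
```

With the Example 1 parameters below, this enumeration evaluates 2^3 = 8 paths and keeps the one with the largest joint probability (states are 0-based here, while MATLAB paths are 1-based).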
To learn more about Bayes' theorem, Hidden Markov Models, and the Viterbi algorithm, the author recommends the following link:
EXAMPLE 1 (check the above video at time 20:20)
transitionProbability = [0.6 0.4; 0.2 0.8];
emissionProbability = [0.6 0.4; 0.2 0.8];
prior = [1/3 2/3].';
observation = [1 0 1];
[selectedPath, probability4EachPath] = Bayes.maxLikelihood(observation, transitionProbability, emissionProbability, prior);
EXAMPLE 2 (check the above video at time 22:10)
observation = [1 1 0 0 0 1];
[selectedPath, probabilityPaths] = Bayes.viterbi(observation, transitionProbability, emissionProbability, prior);
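Again as an illustration only (a Python sketch under the same parameter conventions as above, not the author's MATLAB code), the Viterbi recursion used in the second example keeps, for each state at each time step, only the best path ending there, reducing the work from K^N paths to O(N*K^2) operations:

```python
def viterbi(obs, A, B, prior):
    """Most likely hidden-state path via the Viterbi algorithm.

    obs   : list of observation symbols (0-based column indices into B)
    A     : A[i][j] = P(state j at t+1 | state i at t)
    B     : B[i][k] = P(observing symbol k | state i)
    prior : prior[i] = P(initial state is i)
    Returns the best path (0-based states; MATLAB would use 1-based)
    and its joint probability with the observations.
    """
    n = len(prior)
    T = len(obs)
    delta = [[0.0] * n for _ in range(T)]  # best prob of any path ending in state j at t
    psi = [[0] * n for _ in range(T)]      # backpointer to the previous state

    for i in range(n):
        delta[0][i] = prior[i] * B[i][obs[0]]
    for t in range(1, T):
        for j in range(n):
            scores = [delta[t - 1][i] * A[i][j] for i in range(n)]
            psi[t][j] = max(range(n), key=lambda i: scores[i])
            delta[t][j] = scores[psi[t][j]] * B[j][obs[t]]

    # Backtrack from the best final state.
    path = [0] * T
    path[-1] = max(range(n), key=lambda j: delta[-1][j])
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1][path[t + 1]]
    return path, max(delta[-1])
```

Because every path is pruned except the best one into each state, the returned probability matches what exhaustive enumeration would find for the winning path, at a fraction of the cost.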
Developed By Iman Moazzen, PhD
YouTube Educator at www.sphackswithiman.com
Affiliate Assistant Professor, Concordia University, Canada
Senior Applied Researcher at PAI Health, Vancouver
Please contact me at email@example.com
If I were to get a tattoo, it would be Bayes' theorem for sure! I am truly in awe every single time I see it applied!
Iman Moazzen (2019). Bayes Estimator (best-must-have tattoo!) (https://www.mathworks.com/matlabcentral/fileexchange/70226-bayes-estimator-best-must-have-tattoo), MATLAB Central File Exchange. Retrieved .