
Simulating a Markov chain

Asked by John on 4 Jan 2013

Hello,

Would anybody be able to help me simulate a discrete-time Markov chain in MATLAB?

I have a transition probability matrix with 100 states (100x100) and I'd like to simulate 1000 steps with the initial state as 1.

I'd appreciate any help as I've been trying to do this myself all week with no success and have not been able to source suitable code online.

Kind Regards

John

0 Comments


3 Answers

Answer by Sean de Wolski on 4 Jan 2013
Edited by Sean de Wolski on 4 Jan 2013

If you also have the emissions matrix, you can use hmmgenerate().
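A minimal sketch of that route, assuming the Statistics Toolbox is installed; the TRANS and EMIS matrices below are made-up placeholders rather than values from the question:

TRANS = [0.9 0.1; 0.05 0.95];        % 2-state transition probabilities (illustrative)
EMIS  = [0.7 0.2 0.1; 0.1 0.3 0.6];  % emission probabilities for 3 symbols (illustrative)
[seq, states] = hmmgenerate(1000, TRANS, EMIS);  % emitted sequence and underlying state path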


Pseudo-ish code, from my understanding (disclosure: not a Markov model expert by any means):

Use a for-loop that runs n times, where n is the chain length you want.

% trans is the transition probability matrix (each row sums to 1)
transC = [zeros(size(trans,1),1), cumsum(trans,2)]; % cumulative sum of rows; used to pick the next state
n = 10;                % number of steps to simulate
states = zeros(1,n);   % storage for the state sequence
states(1) = 1;         % start at state 1 (or whatever)
for ii = 2:n
   % Generate a random number and find where it falls in the cumulative
   % sum of the current state's transition-matrix row
   [~,states(ii)] = histc(rand,transC(states(ii-1),:));
end
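For the setup in the question (a 100x100 transition matrix, 1000 steps, starting in state 1), the same loop applies once trans is defined; the random matrix below is only a stand-in for the real transition matrix:

n = 1000;                                        % number of steps from the question
trans = rand(100);                               % stand-in for the actual 100x100 matrix
trans = bsxfun(@rdivide, trans, sum(trans,2));   % normalize so each row sums to 1
% ...then run the loop above; states(1) = 1 already starts the chain in state 1.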

15 Comments

DEVANAND on 2 May 2013

The above code was for a first-order Markov chain, right? How can I extend this to an Nth-order Markov chain?

Shashank Prasanna on 2 May 2013

Please create a new question; that way it gets more visibility and makes it easier to track questions and answers.

Image Analyst on 1 Jun 2013

John, is one of the answers acceptable? If so, mark it "Accepted".

Answer by mona faraji on 1 Jun 2013
Edited by mona faraji on 1 Jun 2013

Check the following for a 2x2 transition matrix and a chain of 1000 steps beginning at state 1:

    transition_probabilities = [0.1 0.9; 0.8 0.2];
    starting_value = 1;
    chain_length = 1000;
    chain = zeros(1,chain_length);
    chain(1) = starting_value;
    for i=2:chain_length
        this_step_distribution = transition_probabilities(chain(i-1),:);
        cumulative_distribution = cumsum(this_step_distribution);
        r = rand();
        chain(i) = find(cumulative_distribution>r,1);
    end
    %  provides chain = 1 2 1 2 1 2 1 2 1 1 2 1 2 1 2 ...

0 Comments

Answer by Paul Fackler on 21 Aug 2013

You can simulate a Markov chain using the function ddpsimul in my CompEcon toolbox, available at www4.ncsu.edu/~pfackler/compecon.

0 Comments

