Error in the code for Markov chain

dpr on 4 Mar 2012
I am trying to simulate a Markov chain with the following code, but I keep getting this error:
In an assignment A(I) = B, the number of elements in B and I must be the same.
Error in ==> Markov_Chain at 61
Chain(1) = cur_state;
How can I solve it? Thanks.
function [Chain] = Markov_Chain( P, initial_state, n )
P = [ 0.85 0.15; 0.05 0.95 ];
initial_state = [0.4 0.6];
n = 20;
sz = size(P);
cur_state = round(initial_state);
%
% Verify that the input parameters are valid
%
if (sz(1) ~= sz(2))
    error('Markov_Chain: Probability matrix is not square');
end
num_states = sz(1);
if (cur_state < 1) | (cur_state > num_states)
    error('Markov_Chain: Initial state not defined in P')
end
for i = 1:num_states
    if (sum(P(i,:)) ~= 1)
        error('Markov_Chain: Transition matrix is not valid')
    end
end
%
% Create the Markov Chain
%
Chain(1) = cur_state;
for i = 1:n
    cur_state = Rand_Vect(P(cur_state,:), 1);
    Chain(i) = cur_state;
end

Accepted Answer

Razvan on 4 Mar 2012
I guess you want a matrix Chain that ends up holding some probabilities in each row. You are trying to put a vector cur_state into the single element Chain(1), which is why you get the error. You should change that line to Chain(1, 1:2) = cur_state; and do the same two lines below: Chain(i, 1:2) = cur_state;
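For anyone hitting the same error, here is a minimal, self-contained sketch of one way to write the simulation so that it runs. It is an illustration, not the original poster's code: the state is kept as a scalar index, the initial state is drawn from the given initial distribution, and the undefined Rand_Vect helper is replaced with inline sampling via cumsum and rand.

% Minimal sketch (illustrative only): Markov chain simulation with a
% scalar state index; Rand_Vect is replaced by cumsum/rand sampling.
function Chain = Markov_Chain_Sketch()
P = [0.85 0.15; 0.05 0.95];   % transition matrix, each row sums to 1
initial_dist = [0.4 0.6];     % distribution over the starting state
n = 20;                       % number of steps to simulate

Chain = zeros(1, n);
% Draw the initial state index from the initial distribution
cur_state = find(rand <= cumsum(initial_dist), 1);
Chain(1) = cur_state;

for i = 2:n
    % Sample the next state from the current state's row of P
    cur_state = find(rand <= cumsum(P(cur_state, :)), 1);
    Chain(i) = cur_state;
end
end

Calling Markov_Chain_Sketch() returns a 1-by-20 vector of state indices (values 1 or 2), rather than a vector of probabilities.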

