Could anyone help me solve the following issue?

jaah navi on 7 May 2019
Commented: Walter Roberson on 8 May 2019
My code randomly generates two different centroid counts, 3 and 5, for 6 users.
As a result, the command window shows:
centroids = 3
centroids = 5
Now I need to begin my iterations, so I used a for loop in my code:
for iteration = 1:iterations    % iterations = 3
    distances = zeros(users, centroids)    % users = 6
    ...
end
But the code only ever uses centroids = 5 inside the for loop; it never uses centroids = 3.
What I want is for the loop to run first with centroids = 3 and then with centroids = 5 once the iterations begin.
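One way to get that behaviour is to save both randomly generated counts in a vector and loop over it, so each count is used in turn (a minimal sketch; the variable names follow the thread, and the distance computation is left as a placeholder):

```matlab
users = 6;
iterations = 3;
centroid_counts = [3 5];    % both randomly generated values, saved in a vector

for centroids = centroid_counts      % first runs with 3, then with 5
    for iteration = 1:iterations
        distances = zeros(users, centroids);   % one row per user, one column per centroid
        % ... fill in the distance computation here ...
    end
end
```

Looping over the vector directly avoids hard-coding an if/else on the iteration counter, and it still works if the random generator later produces different counts.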

KSSV on 7 May 2019
for iteration = 1:iterations    % iterations = 3
    if iteration == 1
        centroids = 3;
    else
        centroids = 5;
    end
    distances = zeros(users, centroids)    % users = 6
    ...
end

Walter Roberson on 8 May 2019
I spent a while on the code, trying to figure out what the heck you could possibly mean.
The best I was able to come up with is that you have a set of "users" with associated locations, and that you have a number of entities you call "particles", and that for each of those particle entities, you have a "swarm", with the number of members in the swarm to be chosen at random based upon the total number of users you have. (Random number of members in a swarm could be justified for some swarm dynamics, but why the number of members of the swarm should depend upon the number of users is a complete mystery.) Anyhow, each member of the swarm has a current position.
For each iteration, you go through each particle, and for each swarm member for the particle, you compute the distance from each swarm member to each of the users and record that.
But the distance from the swarm members to the users does not affect anything; it just gets computed, so if you had more than one iteration it would simply calculate the same thing again. In particular, the swarm members never move, and no "cost" is calculated, so there is no "goodness" measure available for a particular swarm position or swarm, and therefore no way to evaluate whether one particle is "better" than another.
Anyhow, I did come up with code to implement the above... but it does seem a strange thing to do.
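The structure described above might look roughly like this (a hypothetical reconstruction for illustration, not the actual code from the thread; swarm_pos and user_pos, the 2-D positions, and all sizes are assumptions):

```matlab
users = 6;
particles = 2;
iterations = 3;
user_pos = rand(users, 2);             % assumed 2-D user locations

for iteration = 1:iterations
    for p = 1:particles
        members = randi(users);        % swarm size chosen at random, as described
        swarm_pos = rand(members, 2);  % assumed random positions for swarm members
        distances = zeros(members, users);
        for m = 1:members
            for u = 1:users
                distances(m, u) = norm(swarm_pos(m, :) - user_pos(u, :));
            end
        end
        % distances is computed but never used afterwards --
        % exactly the issue pointed out above
    end
end
```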
If your hypothesis is that a varying number of "servers" should be evaluated to optimally fill the needs of the users, with each server having a position and a formula available to calculate how good the service is, then instead of allocating a random number of swarm members it would make more sense to systematically try 1 through the maximum number of servers. In that case, instead of "particles" you would talk about "replications".
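That systematic alternative might be sketched as follows (hypothetical; the nearest-server cost is only a placeholder for whatever service-quality formula applies):

```matlab
users = 6;
max_servers = 5;
user_pos = rand(users, 2);                 % assumed 2-D user locations

best_cost = inf;
best_count = 0;
for num_servers = 1:max_servers            % each server count is one "replication"
    server_pos = rand(num_servers, 2);     % placeholder server placement
    d = zeros(users, num_servers);
    for u = 1:users
        for s = 1:num_servers
            d(u, s) = norm(user_pos(u, :) - server_pos(s, :));
        end
    end
    cost = sum(min(d, [], 2));             % placeholder: each user uses the nearest server
    if cost < best_cost
        best_cost = cost;
        best_count = num_servers;
    end
end
```

Because every server count from 1 to max_servers is evaluated, the results for different counts can be compared directly, which the random-swarm-size version does not allow.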
jaah navi on 8 May 2019
But each swarm_pos is different with respect to the particles.
Since I am using two particles, two different swarm_pos are generated, followed by computing the distance.
But in my code the distance is being computed with respect to the second particle and not the first.
I am still working on it.
If possible, please help me with this.
Walter Roberson on 8 May 2019
Your existing code is too broken for us to make a meaningful correction. There are too many places where we have no idea why you would even consider doing things the way you are doing them.
That is why I spent a bunch of time trying to figure out what the intended purpose of the code was -- not what the problem was that had to be solved, but rather trying to figure out what it was that you thought you had to implement along the way to solving whatever it is you are trying to solve. I came up with a self-consistent framework for what you might have been trying to do, and I implemented it.
If my description of the goal of your code (not the current operation of your code but what the aim of the code is) is not correct, then I am not able to assist you, as your code as posted has no consistent logic in the form it is in, and cannot be repaired except by re-thinking what the purpose of the code is and going back to start over.