Code covered by the BSD License  



4.6 | 29 ratings | 312 Downloads (last 30 days) | File Size: 40.6 KB | File ID: #25986

Another Particle Swarm Toolbox

by

 

01 Dec 2009 (Updated )

Implementation of a PSO algorithm with the same syntax as the Genetic Algorithm Toolbox.


File Information
Description

Introduction
Particle swarm optimization (PSO) is a derivative-free global optimum solver. It is inspired by the surprisingly organized behaviour of large groups of simple animals, such as flocks of birds, schools of fish, or swarms of locusts. The individual creatures, or "particles", in this algorithm are primitive, knowing only four simple things: 1 & 2) their own current location in the search space and fitness value, 3) their previous personal best location, and 4) the overall best location found by all the particles in the "swarm". There are no gradients or Hessians to calculate. Each particle continually adjusts its speed and trajectory in the search space based on this information, moving closer towards the global optimum with each iteration. As seen in nature, this computational swarm displays a remarkable level of coherence and coordination despite the simplicity of its individual particles.
Ease of Use
If you are already using the Genetic Algorithm (GA) included with MATLAB's Global Optimization Toolbox, then this PSO toolbox will save you a great deal of time. It can be called from the MATLAB command line using the same syntax as the GA, with some additional options specific to PSO. This will allow a high degree of code re-usability between the PSO toolbox and the GA toolbox. Certain GA-specific parameters such as cross-over and mutation functions will obviously not be applicable to the PSO algorithm. However, many of the commonly used options for the Genetic Algorithm Toolbox may be used interchangeably with PSO since they are both iterative population-based solvers. See >> help pso (from the ./psopt directory) for more details.
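For example, a minimal call might look like the sketch below. The sphere fitness function, bounds, and option values are illustrative assumptions rather than part of the toolbox, and the positional argument order is assumed to follow GA's:

fitnessfcn = @(x) sum(x.^2) ;     % simple sphere function, minimum at the origin
nvars = 2 ;                       % number of design variables
LB = [-5 -5] ; UB = [5 5] ;       % bound constraints as 1 x nvars row vectors
options = psooptimset('PopulationSize', 40, 'Generations', 200) ;
[x, fval, exitflag] = pso(fitnessfcn, nvars, [], [], [], [], LB, UB, [], options) ;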

Features
  * NEW: support for distributed computing using MATLAB's parallel computing toolbox.
  * Full support for bounded, linear, and nonlinear constraints.
  * Modular and customizable.
  * Binary optimization. See PSOBINARY function for details.
  * Vectorized fitness functions.
  * Solver parameters controlled using 'options' structure similar to existing MATLAB optimization solvers.
  * User-defined custom plots may be written using same template as GA plotting functions.
  * Another optimization solver may be called as a "hybrid function" to refine PSO results.

A demo function is included, with a small library of test functions. To run the demo, from the psopt directory, call >> psodemo with no inputs or outputs.

New features and bug fixes will continue to be released until this is made redundant by the release of an official MATLAB PSO toolbox. Bug reports and feature requests are welcome.

Special thanks to the following people for contributing code and bug fixes:
  * Ben Xin Kang of the University of Hong Kong
  * Christian Hansen of the University of Hannover
  * J. Oliver of Brigham Young University
  * Michael Johnston of the IRIS toolbox
  * Ziqiang (Kevin) Chen

Bibliography
  * J Kennedy, RC Eberhart, YH Shi. Swarm Intelligence. Academic Press, 2001.
  * Particle Swarm Optimization. http://en.wikipedia.org/wiki/Particle_swarm_optimization
  * RE Perez, K Behdinan. Particle swarm approach for structural design optimization. Computers and Structures 85 (2007) 1579–1588.
  * SM Mikki, AA Kishk. Particle Swarm Optimization: A Physics-Based Approach. Morgan & Claypool, 2008.

Addendum A
Nonlinear inequality constraints in the form c(x) ≤ 0 and nonlinear equality constraints of the form ceq(x) = 0 have now been fully implemented. The 'penalize' constraint boundary enforcement method is now default. It has been redesigned and tested extensively, and should work with all types of constraints.

See the following document for the proper syntax for defining nonlinear constraint functions: http://www.mathworks.com/help/optim/ug/writing-constraints.html#brhkghv-16.
To see a demonstration of nonlinear inequality constraints using a quadrifolium overlaid on Rosenbrock's function, run PSODEMO and choose 'nonlinearconstrdemo' as the test function.
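For illustration, here is a sketch of a nonlinear constraint function in that standard form; the function name and the unit-circle constraint are made-up examples, not part of the toolbox:

function [c, ceq] = myconstraints(x)
% keep the design point inside the unit circle (illustrative only)
c   = x(1)^2 + x(2)^2 - 1 ;   % nonlinear inequality, enforced as c(x) <= 0
ceq = [] ;                    % no nonlinear equality constraints

Saved as myconstraints.m, it would be passed in the same position as GA's nonlcon argument, e.g. x = pso(@myfitness, 2, [], [], [], [], LB, UB, @myconstraints).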

Addendum B
See the following guide in the GA toolbox documentation to get started on using the parallel computing toolbox.
http://www.mathworks.com/help/gads/genetic-algorithm-options.html#f17234
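As a rough sketch (assuming the toolbox accepts the same 'always'/'never' strings for UseParallel as the GA options of that era; the fitness function, nvars, and bounds below are placeholders):

matlabpool open                                   % parpool on newer MATLAB releases
options = psooptimset('UseParallel', 'always') ;
[x, fval] = pso(@myfitness, nvars, [], [], [], [], LB, UB, [], options) ;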

Addendum C
If you are a beginner hoping to learn to use this toolbox for work or school, here are some essential readings:
  * MATLAB's Optimization Toolbox: http://www.mathworks.com/help/optim/index.html
  * MATLAB's Global Optimization Toolbox: http://www.mathworks.com/help/gads/index.html
  * MATLAB's Genetic Algorithm: http://www.mathworks.com/help/gads/genetic-algorithm.html

Acknowledgements

This file inspired Co Blade: Software For Analysis And Design Of Composite Blades.

Required Products Optimization Toolbox
MATLAB release MATLAB 8.1 (R2013a)
Other requirements Familiarity with the Genetic Algorithm and Direct Search Toolbox would help in using this package. Tested on Windows x86; will probably work for other platforms.
Comments and Ratings (146)
22 Apr 2014 Sam

Hi parinya,

Can you email me a copy of your nonlinear constraints function through the Contact Author link? I will have a look at it.

Sam

17 Apr 2014 parinya

Dear Sam,

Thanks for the powerful PSO toolbox. I get an error after running the code with a nonlinear constraint. The error message is:
"Problem is infeasible due to nonlinear constraints"

I have checked that my nonlinear constraints are satisfied by the initial population that I supplied. Where could I look to fix this problem?

Thanks

Parinya

03 Apr 2014 Sam

Thanks for pointing that out, Aman.

b should really be a column vector [2;1] so that it will fit the equation

[1 0 ; 0 1]*[x1; x2] ≤ [2; 1]

however it looks like GA is robust enough to check for and correct that error.

I will add a small piece of input-checking code in the next release so that PSO will yield the same behavior as GA.
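In the meantime, a sketch of the call with b passed as a column vector (using the example under discussion):

pso(@(x)(x(1)^2+x(2)^2+x(1)), 2, [1 0; 0 1], [2; 1])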

02 Apr 2014 Erik

Sam, no problem :)

02 Apr 2014 Aman Parkash

Sam, since the PSO toolbox uses the same syntax as the GA toolbox, I have found a small bug (which does not occur with GA using the same syntax). For example, if I compare and run both GA and PSO on a two-variable objective function: pso(@(x)(x(1)^2+x(2)^2+x(1)),2,[1 0;0 1],[2 1]) throws "horzcat" and "psocheckinitialpopulation" errors,
...BUT ga(@(x)(x(1)^2+x(2)^2+x(1)),2,[1 0;0 1],[2 1]) returns a result.

01 Apr 2014 Sam

Aman, I'm glad your problem is working properly now. Sorry for the inconvenience! Erik, you are very welcome; is it OK if I add your name to the list of acknowledgements for this toolbox?

01 Apr 2014 Aman Parkash

Erik, Sam, thanks for finding and fixing this bug. Now all my results are coming out within range.

01 Apr 2014 Aman Parkash

I mean that despite setting the bounds within the positive range, some results were still coming out of range, i.e. negative values as well.

31 Mar 2014 Erik

Sam, thanks for fixing the bug so quickly!

31 Mar 2014 Sam

Erik, I have discovered a typo in one of the helper functions for PSO which is causing the bug that you describe. I have submitted an update which should appear over the next few days. This should also improve performance for anyone who is using lower and upper bound constraints for their optimization problems.

30 Mar 2014 Sam

kerolos, there is an option to use binary inputs (call PSOBINARY instead of PSO) however I haven't implemented any integer inputs yet.

Erik, that is very curious behavior and I am getting the same results as you. Even stranger is getting a negative result if I set LB to less than 1. I will have to do some debugging to get to the bottom of this.

Cristina, sorry I took so long to get back to you. I haven't actually tried to program Pareto fronts before, so I will not be of much help to your problem. I will look into developing a multiobjective version of this toolbox as time allows, however I cannot make any guarantees as medical school has other duties for me to attend to at this time. Let me know if you find a way to make it work!

Matthias, that behavior is a result of the PSO algorithm not preserving the best point of every generation (unlike the genetic algorithm, which does). The algorithm proposed in Kennedy et al's book that I referred to above does not include such elite-preserving behavior, therefore I left it out of my code to maintain fidelity to their code. Ideally the swarm would stabilize in a region close to the global maximum anyway, so most of the time this is not a problem. If you are getting wildly different values between the historical best point found and the final best point, then the swarm has likely been terminated before it has found a stable equilibrium and I would not rely on those results. If the difference is within a very small margin of error, then you may choose which result to accept.

29 Mar 2014 Kirti Wanjale  
28 Mar 2014 kerolos

Thanks, this works great.

Can I use any options to make integer inputs for the four dimensions?

Thanks

27 Mar 2014 Sam

kerolos, try setting LB to [0, 0, 0, 0] and UB to [1, 1, 1, 1] and avoid using the linear constraints altogether.
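Something like the following sketch (the fitness function name is a placeholder for your own):

LB = [0 0 0 0] ;
UB = [1 1 1 1] ;
x = pso(@yourfitness, 4, [], [], [], [], LB, UB) ;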

27 Mar 2014 kerolos

Thanks for this great toolbox.

I want some help using this toolbox for the optimization of a 4-dimensional problem.

I want an example of Aeq and beq

to satisfy a constraint that all 4 dimension values will be between 0 and 1.

I tried this
% A=[ 1 ,-1 ;1 ,-1 ;1 ,-1 ;1 ,-1 ] , b=[ 1 ,0 ;1 ,0 ;1 ,0 ;1 ,0 ]
% A=[ 1 , 1 ,1,1 -1,-1,-1,-1] ,b= [ 1 , 1 ,1,1 0,0,0,0]

but no luck.

Error using horzcat
Dimensions of matrices being concatenated are not consistent.

Error in psocheckinitialpopulation (line 36)
if (~isempty([Aineq,bineq]) &&
any(Aineq*state.Population(i,:)' - bineq >
options.TolCon)) ...

Error in pso (line 338)
[state,options] = psocheckinitialpopulation(state,...

I read the help

% x = pso(fitnessfcn,nvars,Aineq,bineq)
% Linear constraints, such that Aineq*x <= bineq. Aineq is a matrix of size
% nconstraints x nvars, while b is a column vector of length nvars.
%

but could not understand it very well

thanks .

27 Mar 2014 kerolos  
13 Mar 2014 Erik

Hi Sam. Thanks for this great function. I believe I have found a small bug. If I try to find the lowest value of a 1/x function between 1 and 10:
pso(@(x)1/x,1,[],[],[],[],1,10)
it returns x = 1. When I set the bounds to 1.01 and 10:
pso(@(x)1/x,1,[],[],[],[],1.01,10)
the lowest value is found at x = 10.

20 Feb 2014 Cristina

Hi Sam, thank you for the toolbox! I have tried to modify the code to get a multiobjective optimizer, by means of an adaptive weighted-sum approach for the fitness. How can I write the code so that it writes out the Pareto points? For each generation k I wrote

for k=2:itr

if k>=2
% trypareto=(state.obj1(k)<state.obj1(1:(k-1)))&(state.obj2(k)<state.obj2(1:(k-1)));

%%%%%%%%%%%%%%%%%%%%%%%%
% if trypareto
paretobiet1(k)=state.obj1(k);
paretobiet2(k)=state.obj2(k);
paretofronte=[paretobiet1',paretobiet2'];

end
end

But it does not work well, as it writes out equal points. Does anyone have any suggestions? Thank you all.
Cristina

10 Feb 2014 longyan

hi, Sam. I am currently using PSO as a tool to minimize my fitness function. I am using binary coding for the fitness function, which I have already run with GA previously, and it works fine. However, when I use your binary PSO on the same function, the best value in each generation looks like random values, with no pattern of gradually going down or any other sign of minimizing. I am wondering why this is happening. The binary coding is fine for my case, because it works well in GA. Would you be so kind as to answer this question? I will post the details if you need them to track down my trouble. Hope to get your reply soon.

31 Jan 2014 Matthias

Hello,

I think I found a crucial bug (version 20130702): in my program I record the test parameters whenever they fit better than in the iterations before. In my test, PSO stopped after fulfilling a break condition and gave me a final parameter set which fitted worse than an intermediate result I had recorded earlier.

28 Jan 2014 sakthi priya

Hi Sam,
thank you so much for the code. I want to know whether we can give a circuit netlist as input.

21 Jan 2014 Sam

Aman Parkash, what have you defined for Aineq and bineq? Aineq must have the same number of rows as bineq.

agus mujianto, I did not design my code to run in a Simulink environment, but I have heard from other users who have tried it with success. Your question pertains to aspects of the Simulink model that you are working with, which is not part of my toolbox. Unfortunately I will not be able to provide you with technical support for something that I did not create. I hope you find your answer soon!

21 Jan 2014 Aman Parkash

Hi sam
I am trying to run a single-objective problem with linear inequality constraints. If I use ga, the result comes out well within the defined ranges, but not with pso, which shows the following errors:

Error using horzcat
CAT arguments dimensions are not consistent.

Error in psocheckinitialpopulation (line 36)
if (~isempty([Aineq,bineq]) && any(Aineq*state.Population(i,:)' - bineq >
options.TolCon)) ...

Error in pso (line 338)
[state,options] = psocheckinitialpopulation(state,...

And I set the options:

CognitiveAttraction: 0.5000
ConstrBoundary: 'soft'
AccelerationFcn: @psoiterate
DemoMode: 'off'
Display: 'final'
FitnessLimit: -Inf
Generations: 200
HybridFcn: []
InitialPopulation: []
InitialVelocities: []
KnownMin: []
OutputFcns: {}
PlotFcns: {}
PlotInterval: 1
PopInitRange: '[0;2]'
PopulationSize: 40
PopulationType: 'doubleVector'
SocialAttraction: 1.2500
StallGenLimit: '100'
StallTimeLimit: Inf
TimeLimit: Inf
TolCon: 1.0000e-06
TolFun: 1.0000e-06
UseParallel: 'never'
Vectorized: 'off'
VelocityLimit: []

So please tell me what I should do to satisfy these constraints and get a result. Thanks.

21 Jan 2014 agus mujianto

hi dear Sam:
I have a problem with particle swarm optimization. I have a Simulink file and I want to optimize some parts of it with PSO.

The script:
% store names
dv_names={'cs_min_pwr','cs_max_pwr','cs_charge_pwr','cs_min_off_time'};
resp_names={'combined_mpgge'};
con_names={'delta_soc','delta_trace','vinf.accel_test.results.time(1)','vinf.accel_test.results.time(2)','vinf.accel_test.results.time(3)','vinf.grade_test.results.grade'};

% define the problem
FUN='obj_fun_control';
NONLCON='con_fun_control';
X0=[5000 45000 5000 65]; % constraint violation (accel)
LB=[0,25000,0,10];
UB=[25000,50000,25000,1000];
A=[];
B=[];
Aeq=[];
Beq=[];

Where should I place dv_names and resp_names? They contain a lot of variables, but I just want to optimize 4 variables.
Thank you.

20 Jan 2014 dab483

Hi, does anyone know where I can get a PSO toolbox that supports multi-objective optimization?

18 Jan 2014 Aman Parkash

Hi dear Sam,
When optimizing the objective function I am stuck on how to write the syntax for this kind of ordered range of limits in a MATLAB script, e.g.
0<theta1<theta2<theta3<theta4<theta5<(pi/2)
Please help me.

06 Jan 2014 Bing  
20 Dec 2013 Sam

Sanjaya, I am glad that you were able to find the answer to your question. After thinking about your problem I have also thought of some improvements to the code that I will implement in the near future.

Natanael, the error looks like it is coming from the objective function which you supplied.

I have recently received some emails from more community members asking for help using this toolbox. As I have mentioned before, I cannot guarantee that I will be able to respond to your particular questions in a timely or sufficient manner. I have provided demonstration functions and comprehensive help documentation with this toolbox, as well as links to further information which is posted in the file description above. Please refer to those sources of information to find the answers to your questions.

If your question is time-sensitive and related to academic homework, please post it to the community newsgroup so that more people will be able to see and respond to it.

17 Dec 2013 sanjaya

Dear Sam,
First of all, I apologize for my strong words to you. I am very sorry. Yes, your code is very useful to me. Before you wrote, I had already set ConstrBoundary to 'soft' instead of 'penalize' and it worked. Thank you very much.

17 Dec 2013 Natanael Acencio Rijo

When I try to optimize my function it gives me this error. What could it be?

Swarming...Error using cost (line 3)
Not enough input arguments.

Error in @(p)cost

Error in pso (line 429)
state.Score(i) = fitnessfcn(state.Population(i,:)) ;

17 Dec 2013 Sam

Sanjaya, my apologies, my answer to your second question from last week should have been to try setting the 'ConstrBoundary' option to 'soft' not 'penalize'. This will allow the optimizer to skip over evaluating any particle which finds its way out of the feasible design space. For your problem, that would mean that Simulink would not have to be called for particles which venture outside of your lower and upper bound constraints. I hope this new information does not come too late for you.

02 Dec 2013 Sam

Sanjaya, regarding your second question, try setting the 'ConstrBoundary' option to 'penalize'. Regarding your first question, my code should be able to handle as many variables as you are willing to throw at it, but unfortunately it is not able to handle mixed-integer problems. The Genetic Algorithm included in MATLAB's Global Optimization Toolbox should be able to help you with that.

If you have a time-sensitive question (e.g. academic assignment due), please direct any questions to your professor, teaching assistant, or a classmate. Like most contributors to the File Exchange community, I am not a Mathworks employee and have other responsibilities that I must attend to. I cannot guarantee that I will have the chance to check this page on a regular basis to answer your questions.

Some resources that can help you with any problems that arise while using the toolbox include:
- Genetic Algorithm Toolbox documentation (http://www.mathworks.com/help/gads/genetic-algorithm.html)
- The books, academic paper, and Wikipedia article listed in the "bibliography" section of the file description

02 Dec 2013 sanjaya

Hello Sam,
It is very sad that you have not replied to my query yet. I am now facing another problem with your code, which I will explain.
I am getting the fitness function from a Simulink file. The Simulink file will run only if the variables lie within the lower and upper bounds, but with your code the variables sometimes exceed the lower or upper bound, so the Simulink file is not able to execute and it shows an error. So please help me as soon as possible. Thanks.

24 Nov 2013 sanjaya

Hello Sam,
I have to optimize 16 parameters, of which the first nine are in the range of zero to one and the remaining seven are in the range of one to four (integers only). So please help me to use your code. Is your code useful for mixed-integer constraints? And is it useful for 16 variables?
Anticipating a quick reply. Thank you.

22 Nov 2013 Sam

Nara and Natiolol,

I had thought about implementing an integer solving method, but never had the time to do so. After giving it some thought, I came up with one idea to get around this: you could convert your integers into binary form and use the psobinary solver.

Mouloud Kachouane, to find the maximum of a function simply add a (-) sign in front of your problem to convert it into a minimum-finding problem.
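In other words, something like this sketch (f and nvars are placeholders for your own problem):

xMax = pso(@(x) -f(x), nvars) ;   % minimizing -f(x) is equivalent to maximizing f(x)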

21 Nov 2013 Nara

Can we use it for integers, or can we force it to choose numbers from a predefined set?

20 Nov 2013 Natialol

Hello Sam,

Quick question. Does this work for integer constraints as well?

14 Nov 2013 Mouloud Kachouane

Hi,
Thank you so much, I like your toolbox.
I want to ask you: what do I have to do to search for the maxima of a 2D function using PSO (your toolbox)?
Thank you a lot!

29 Oct 2013 Sam

Hi Zachary,

I'm glad that you found this toolbox useful in your work! As for citation, the following should do:

Chen, Samuel (2009-13). Another Particle Swarm Toolbox (http://www.mathworks.com/matlabcentral/fileexchange/25986), MATLAB Central File Exchange. Retrieved (whenever you downloaded the version you have).

I got the format from this link: http://blogs.mathworks.com/community/2010/12/13/citing-file-exchange-submissions/

This goes for anyone else too: if you are comfortable, I would love to see some examples of how you've used the toolbox. If you have any publications that this toolbox has played a part in, please feel free to send me a link or DOI number via the Contact Author page.

01 Oct 2013 Zachary

An excellent optimizer I have used on several projects now.
I am currently writing a paper where I use your optimizer; how should I cite it?

25 Aug 2013 Sam

Mohammed, this toolbox uses the same command line syntax as the genetic algorithm toolbox. So you can refer to the document for the Genetic Algorithm toolbox here: http://www.mathworks.com/help/gads/ga.html

23 Aug 2013 Mohammed

Hi Sam, I have tried several times to set the LB, UB, and nvars to match my multi-dimensional problem but it doesn't work. Would you please, if you have the time, clarify that in steps, and what is the maximum number of decision variables that can be solved efficiently? I am developing a stochastic reservoir optimization code and have already written the function, but it does not yet fit with your pso code.

15 Aug 2013 nima  
03 Jul 2013 Sam

Hi Omari, unfortunately my code does not support multi-objective optimization. The GAMULTIOBJ function that comes with the Global Optimization Toolbox would be your best bet.

02 Jul 2013 Omari

hi Sam

I am just starting in the optimization field, but I would like to know whether it is possible to use your code for multiobjective optimization? Thanks a lot.

21 Jun 2013 Sam

Hi Zhaoyi, I think you may have an older version of the toolbox. For one of the releases, I accidentally included a duplicate copy of the psoiterate.m file. If you look in the psopt folder, you should find a copy of psoiterate.m in the psopt folder, and another copy in the /private folder. Simply download the latest version, or delete the older file in the private folder on the version that you have, and the problem should resolve.

21 Jun 2013 Zhaoyi

Hi Sam,
Thanks a lot for your great job. I'm trying to learn to use it. But I got a problem...maybe very silly...

when I type in "pso", it displays :
Swarming...??? Error using ==> psopt\private\psoiterate
Too many input arguments.

Error in ==> pso at 515
state = options.AccelerationFcn(options,state,flag) ;

Every time I try to use it to solve my own problem, it also shows this message...
Hope you can help me solve this problem...
Could it be due to the MATLAB version? Mine is 2010a...

Thank you a lot!

15 Jun 2013 Sam

Version 20130615 should be online shortly. Parallel computing capability has been implemented, as per many community requests.

22 May 2013 Ben

I have used this for quite a while. Come back to rate it from my experience.

17 May 2013 Sam

Mohammed, make sure that the PopInitRange, LB and UB variables that you set are the correct size. PopInitRange (which is set using psooptimset) should be a 2 x nvars matrix (that's two rows, and nvars columns). LB and UB should both be 1 x nvars, i.e. row vectors.
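A rough sketch for a 24-variable problem (the fitness function is a placeholder, and the zero/one bounds are just for illustration):

nvars = 24 ;
LB = zeros(1, nvars) ;                              % 1 x nvars row vector
UB = ones(1, nvars) ;                               % 1 x nvars row vector
options = psooptimset('PopInitRange', [LB; UB]) ;   % 2 x nvars: two rows, nvars columns
[x, fval] = pso(@myfitness, nvars, [], [], [], [], LB, UB, [], options) ;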

17 May 2013 Mohammed

Hi Sam, this is the error msg. I have tried several times to modify the dimensions but it doesn't work:

Index exceeds matrix dimensions.

Error in psocheckpopulationinitrange (line 9)
index(~lowerInf) = LB(~lowerInf) ~= lowerRange(~lowerInf) ;

Error in pso (line 214)
options.PopInitRange = ...

16 May 2013 Sam

Hi Mohammed, what does the error message say?

16 May 2013 Mohammed

Hi Sam, thanks for the nice job. I just want to ask how I can modify pso.m to handle an optimization problem with 24 unknowns.
I have tested your code on different test functions with different PSO parameter adjustments and it works fine; however, when I try a high-dimensional problem I always fail with matrix dimension and population range problems.
Could you please help me to cope with this issue?

15 May 2013 Sam

Hi Kevin and others,

The latest version (20130515) should be coming online shortly, and re-implements some previous bug fixes which were lost because when I came back to this project, I started working from an older version of the code. I've also made a small change to ensure that only feasible solutions are selected as global optima when the penalty-based constraint enforcement method is used.

Next I plan to work on implementing parallel computing capabilities, as suggested by many users previously.

04 May 2013 kevin

hi Sam, thank you for updating the submission. Do you have a plan to convert it into a "mex" version?

03 May 2013 Sam

This is a relatively major update and it has been a long time since I have programmed in MATLAB. Although I ran through the pre-release checklist of tests that I had previously developed and everything seemed to work OK, I will leave the stable 2010 version of the toolbox available on Google Code's Project Hosting service for anyone who has problems with the new release.

Link here: http://code.google.com/p/psomatlab/downloads/list

03 May 2013 Sam

After a three year hiatus, I've had a bit of time to make some updates that I've always wanted to make to this toolbox. I've started by completing the implementation of an alternate constraint enforcement method that was in the works and almost complete when I started medical school back in 2010. This method should work better for nonlinear constraints, and can be activated by setting options.ConstrBoundary to 'penalize' when calling the PSOOPTIMSET function. This update (version 20130502) should appear on the File Exchange in the next few days.

Over the past three years, I have also received many helpful suggestions from community members such as kevin, Michael Johnston and some others who have emailed me in private. Over the coming weeks I will try to set aside some time to implement their excellent suggestions.

13 Nov 2012 kevin

It seems that changing all the "if strcmpi(flag,'init')" checks in pso.m and the plotFcns .m functions into "if strcmpi(flag,'init') || ( state.Generation==options.PlotInterval )" can address the issue; if options.PlotInterval is larger than 1 there will be a problem, with an error message requiring haxes to be created first.

23 Oct 2012 kevin

Message to Mathworks: will you please work with Sam together to add particle swarm optimization and differential evolution toolbox in the future release? I look forward to this

22 Oct 2012 kevin

Another suggestion: would you please add C0, C1, and C2 adaptive strategy options, since these are critical for the PSO algorithm's performance? In psooptions.PlotFcns, there should also be an additional option for displaying the trend of C0, C1 and C2 over the PSO generations. Thank you very much for the great job!

21 Oct 2012 kevin

hi Sam, one suggestion on options.HybridFcn:

It is very common to use PSO hybridized with @ga (not just, as in ga's options.HybridFcn, with @fmincon), since PSO performs well in global search while GA does better in convergence.

The current handling of HybridFcn in pso.m is similar to MATLAB's ga, so it is suitable for @fmincon, which requires an initial value, but may not work for @ga, which requires the number of variables.

I have now found the reason why @ga does not work when I set the PSO options.HybridFcn to @ga:
At the bottom of your pso.m there is code like the following:

% Check for hybrid function, run if necessary
% -------------------------------------------------------------------------
if ~isempty(options.HybridFcn) && exitflag ~= -1
    [xOpt,fval] = psorunhybridfcn(fitnessfcn,xOpt,Aineq,bineq,...
        Aeq,beq,LB,UB,nonlcon,options) ;
end
% -------------------------------------------------------------------------
The xOpt should be the initial value for @fmincon, while @ga requires the number of variables.
So in order to use @ga correctly, this xOpt may have to be changed into length(xOpt) or max(size(xOpt)).

Is there any general strategy to make sure both @fmincon and @ga can work correctly?
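A minimal sketch of the substitution I mean (purely hypothetical; psorunhybridfcn's internals may need corresponding changes):

if isequal(options.HybridFcn, @ga)
    hybridArg = numel(xOpt) ;   % @ga expects the number of variables
else
    hybridArg = xOpt ;          % @fmincon and similar expect a start point
end
[xOpt,fval] = psorunhybridfcn(fitnessfcn,hybridArg,Aineq,bineq,...
    Aeq,beq,LB,UB,nonlcon,options) ;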

13 Oct 2012 kevin

Brother Sam, seeing that your surname is Chen, I am testing whether you read Chinese. I have found a few small issues when running PSO: 1) during the computation a best value is displayed, but what is shown does not appear to be the best over the entire run, only the minimum of each comparison, so the displayed value sometimes suddenly gets larger; 2) the default genetic algorithm can reuse previous results, e.g. by setting the initial population in Options to the final population of the last run so the computation continues from the previous result, but your toolbox does not seem to have this feature. If you can't read this you can use Google Translate, or I will write in English next time. OK?

11 Oct 2012 kevin

The best PSO toolbox I've ever seen

02 Oct 2012 Marwan

Thanks for your efforts,
your code worked well for me in MATLAB, but I am having problems using it in SIMULINK, as code generation supports neither function handles nor struct data types :)

08 Jun 2012 Sam

Hi everyone, sorry for the lack of updates, I've been quite busy at medical school over the past months and haven't had time to work on this project. Rilin, the error that you encounter could mean that the set of constraints that you defined are not compatible with each other. A simple example: if I defined a set of constraints x < 5 and x > 8, and both had to be satisfied simultaneously, can you see where I might run into problems?

Troy Lim, the best thing to do would be to look at the product documentation for the MATLAB optimization and global optimization toolboxes. I tried to make my toolbox so that it is fully compatible with the way that they define fitness functions. It is a little tricky to master at first, but with some patience and persistence in going through the examples provided in the MATLAB documentation, you should be able to understand it!

Joe Ajay, I had been meaning to program discrete optimization functionality into this toolbox, but never had the time to do it! I can't promise you anything, but I may have a little more time to make some updates this summer.

10 Mar 2012 Drew Compston  
09 Dec 2011 Rilin

Hi Sam. Thanks for your good work.
I have a problem when I use the nonlinear constraint: 'Problem is infeasible due to nonlinear constraints' occurred, but I don't know why. Can you help me? Or anybody who knows could give me a hand. Thanks.

18 Nov 2011 Joe Ajay

Dear Sam, thanks for this toolbox; it was really helpful in my project, and I've done 4 optimization problems with it. I'd like to work on discrete optimization using PSO. Do you have any update of this toolbox for discrete optimization? If not, what is the other option for doing discrete optimization with PSO?

28 Oct 2011 troy lim

hi Sam,

You have done a very good job on the pso toolbox.

I am a newbie in MATLAB dealing with a project in college.

I need to optimize the membership function parameters (63 parameters in total) of an FLC in Simulink using PSO, with the observed data from a simulation of the developed system as the fitness function. I found that the most difficult part for me is formulating the fitness function; or is it possible to find the best solution without a fitness function?
I hope you can help me out or give me some ideas, as this is the last part of my project towards the end... thanks

18 Oct 2011 Rilin

Or, can it just be set in the nonlinear constraint function file?

18 Oct 2011 Rilin

Hi Sam. I have a problem using PSO to find C=[c1,c2,c3,c4]. The constraint is A*W=1, where A=[a b], W=[c1*exp(jc3),c2*exp(jc4)]. How can I set Aeq? Thanks for your answer.

17 Oct 2011 Sam

Due to several requests, I will be looking at fixing some issues with the 'penalize' constraint method over the coming weeks, so that this algorithm will work properly for problems with nonlinear constraints. Stay tuned for updates.

17 Oct 2011 fenfen Xiong

Or, if someone has code at hand that can deal with optimization with nonlinear constraints, would you please send me a copy?
Many thanks!

my email: xiongfenfen@gmail.com

14 Sep 2011 besbesmany besbesmany

My variables are:
LB is a vector of zeros
UB is a vector of ones

Aeq is a matrix containing zeros and ones
Beq is a vector of ones
My initial value is a matrix of numbers from 0 to 1

'penalize' is the only method that gives me output, but it is not within the range of LB, UB and Aeq, Beq.
All the other constraint methods return the same initial value.
Is there any way to get a correct result from pso? Is there an updated toolbox you will release soon?

I get a correct result from the ga toolbox, but I want to check other algorithms.
Is there any MATLAB algorithm other than ga or fmincon that solves the same problem?

Any direction will be appreciated.

Thanks so much Sam

13 Sep 2011 Sam

Sorry that it's not working for you! The 'penalize' constraint method is unfinished, and it doesn't quite work yet: I took that feature out of the documentation and put in a warning message in the last revision of this toolbox when I realized that it still had problems.

If the 'soft' constraint method is giving you the same initial value with no change, then it could be because of several reasons:

1. None of your initial particle positions are feasible (so they are all set to infinity, using the 'soft' constraint enforcement method)
2. Your objective function is throwing an error with the given input vector (could be related to reason #1 above)
3. The design space is flat in the region of the initial particle positions

Are you using discrete or binary variables? Right now my PSO toolbox doesn't support design vectors that have a combination of real and discrete components (but the GA toolbox that comes with MATLAB should be able to handle those with no problem).

13 Sep 2011 besbesmany besbesmany

Dear Sam
Thanks a lot for your effort on this toolbox.
I have a problem with constrained pso: lb=0 and UB=1, but the result of pso is not restricted to these bounds.

http://www.mathworks.com/matlabcentral/fileexchange/25986-another-particle-swarm-toolbox/content/psopt/psoboundsabsorb.m

Also, Aeq and beq do not restrict the result.

I tried 'penalize' and 'soft'; 'penalize' goes out of the range of lb, ub, Aeq, beq,

while 'soft' gives me the same initial value with no change.

Can you help me with that?

04 Jan 2011 Sam

Haydar, I think the RANDI function was introduced in MATLAB r2008b as discussed here: http://www.mathworks.com/matlabcentral/newsreader/view_thread/274544

If you're not able to get a more recent version of MATLAB, I can see if I can release a small update over the weekend that eliminates PSOBINARY's dependency on RANDI. Thanks for the feedback!

03 Jan 2011 Haydar Dag

When using psobinary, the system crashes for not being able to find the function 'randi'.

30 Nov 2010 Nguyen

Dear Sam!
Thank you very much for your help and your efficient toolbox. It works very well for my problems now.

07 Nov 2010 Sam

Hi Nguyen, I don't think that this toolbox can be used with optimtool -- that would require modifications to the optimtool code, which is beyond my capabilities at the moment. If you learn how to use the optimization toolbox functions from the command line, then you should be well-equipped to use this PSO toolbox. Hope your studies go well!

07 Nov 2010 Nguyen

Dear Sam!
I'm a student and I do not know PSO well. Can you tell me how I can use pso from optimtool?
Thanks!

24 Oct 2010 Sam

Thanks Oliver, I'll see if I can implement what you suggested for a parallel processing option. I'll have no way of testing it, so I might add a little note saying that the feature is in beta. Glad you found the toolbox helpful!

22 Oct 2010 Erdal Bizkevelci  
22 Oct 2010 Oliver

Sam,

Thanks for your submission, it works wonderfully. And thank you for sticking to familiar syntax for those of us who have been using the optimization toolbox, this really helps with the learning curve.

I saw some comments about taking advantage of parallel processing, but it didn't look like anyone has done anything about it thus far. When I profiled pso using my objective function the big place that things got bogged down was obviously in the many objective function evaluations. As such I changed the code a little bit to evaluate all of the objective function calls in parallel. I changed the following lines from this:

for i = setdiff(1:n,find(state.OutOfBounds))
    state.Score(i) = fitnessfcn(state.Population(i,:)) ;
end % for i

to this:

tempstatepop = state.Population;
itinerary = setdiff(1:n,find(state.OutOfBounds));
temp = zeros(length(itinerary),1);
parfor i = 1:length(itinerary)
    temp(i) = fitnessfcn(tempstatepop(itinerary(i),:)) ;
end % for i
state.Score(itinerary) = temp;

This is probably a crude way of doing it, but even so I saw a speed up of just under 3 times. Perhaps something similar will be helpful in a future build. Thanks again.

-Oliver

30 Sep 2010 Mike

Dear Sam
I am writing a simple PSO function in MATLAB, and I would like to ask:
if a particle goes out of the boundary, should I reset its velocity to zero?

21 Sep 2010 Mark Shore

Sam, I will be trying out your toolbox in the near future as time permits. As far as requirements for installed MATLAB toolboxes, as far as I'm concerned, the fewer the better.

As a single commercial-licence user, I have to justify each additional toolbox I purchase (to myself, but still...). The wavelet and signal processing toolboxes were a non-issue. Got the optimization toolbox expressly for John D'Errico's SLM tools, and still sitting on the fence for parallel processing, image processing, mapping toolbox and curve fitting toolboxes, among others. $1000 here, $1000 there, plus maintenance, eventually adds up and can cut into one's desire to test FEX submissions...

20 Sep 2010 Sam

Hey Ben, there's a brief note about that in the help provided for the PSOOPTIMSET m-file (I know, the documentation is scattered all over the place -- one of the things I was hoping to do was to create a more comprehensive help file).

Glad you found the demo helpful, Mark. I was thinking about including the Global Optimization toolbox in the list of requirements, since it would really help to be familiar with GA. But as you found, it's not strictly necessary to run the PSO code.

20 Sep 2010 Mark Shore

...or I could have just downloaded APST, ran the included demo, and easily answered my own question. Yes - as listed - the requirement is the Optimization Toolbox.

20 Sep 2010 Mark Shore

A quick question - Another Particle Swarm Toolbox requires MATLAB's Optimization Toolbox, NOT the Genetic Algorithm (now Global Optimization) Toolbox, correct?

18 Sep 2010 Ben

Hi Sam,

Could you briefly explain how to set the options related to the different constraints? Frankly, there is not much explanation in the .m file.

Thank you,
Ben

18 Sep 2010 Sam

Looks like it will take until Monday for the aforementioned bug fix to be posted.

17 Sep 2010 Sam

Ben, thanks for pointing that out. I was actually experimenting with a new default constraint enforcement method ('penalize'), but it doesn't seem to handle boundary constraints very well. You can fix that problem by setting options.ConstrBoundary to 'soft' or 'absorb'. I'll release a quick patch (should be up by tomorrow) to set the default back to 'soft' so that this doesn't happen to other people.

WANG, are you passing any input arguments to PSOBINARY? PSO will run a default demonstration case without any inputs, but I haven't provided a similar function for PSOBINARY yet. Anyway, the first argument to PSOBINARY should be a pointer to a fitness function that's written by you (it should not actually be 'fitnessfcn', that's just a placeholder used in the documentation for the aforementioned function pointer). It should be able to accept a 1xnvars vector of 0s and 1s, and return a fitness value. Make sure that the m-file for the fitness function is in your path when you call PSOBINARY. Hope that helps.

16 Sep 2010 WANG

Thanks for your recommendation. I found that my release is lower than required, so I updated it. Then I ran pso.m successfully, but when I run psobinary.m it says:
??? Input argument "fitnessfcn" is undefined.

Error in ==> psobinary at 37
[xOpt,fval,exitflag,output,population,scores] = ...
Would you tell me what the problem is and how to solve it?
Thank you. Best wishes!

16 Sep 2010 Ben

Hi Sam,

There is another problem: sometimes the result goes far out of the box constraints (lower & upper boundaries). Have you encountered this before?

Thx

15 Sep 2010 Sam

WANG, the syntax for this PSO toolbox is described in the comments at the top of the file named pso.m. You can type >> help pso from the command line, with the current directory set to where the pso.m file is, to read it, or you could just open the file. As I mentioned before, it should be the same as the syntax for the Genetic Algorithm included with the Global Optimization toolbox, so you can also refer to them: http://www.mathworks.com/help/toolbox/gads/ga.html.

If by BPSO you mean binary PSO, make sure that you're not trying to impose any constraints. Type >> help psobinary to learn the syntax for binary pso, which is slightly different.

t g and satish, I'd like to be able to help you, but school has been very busy for me, and your questions are bigger than I can properly address at this time. Again, I recommend reading the documentation provided with my PSO toolbox, as well as MATLAB's Global Optimization Toolbox, which I recommended to WANG (it's probably better-written and better presented than mine). You could try some of their simple examples to get an idea of how to use the toolboxes. I hope your research goes well!

15 Sep 2010 satish jain

hi sam
I am using basic pso for my function minimization problem. It is working, but gbest is going beyond the range. Could you please let me know how and where I can change my code?
Thanks
sk jain
satishjain.jain@gmail.com

14 Sep 2010 WANG

hi,sam
First of all, thanks for your hard work and selfless dedication. I am a master's student in electrical engineering and want to use BPSO to optimize a function, but I can't run the PSO toolbox successfully. When I run pso.m, I get:
??? Function name must be a string.

Error in ==> psooptimset at 180
idx = find(cellfun(@(varargin)strcmpi(varargin,requiredfields{i,1}),...

Error in ==> pso at 171
options = psooptimset(options) ;
Could you give me more details on using the syntax?
thanks a lot!
best wishes!

11 Sep 2010 t g

hi Sam,
I am doing my university project on manufacturing cell design using a PSO tool. The problem is defined as a part/machine incidence matrix which maps parts to machines, and the clustering should be formed block-diagonally in order to make the cells. The objective is to minimize the exceptional element count (EE). The PSO particle string should contain the cell numbers, and the indices of the string are the machine numbers.
Since I am new to this field, I am facing problems implementing the logic and code. Can anyone help in this regard? MATLAB is the interface of the program. The problem is shown at this link: http://www.mypicx.com/uploadimg/53506343_09112010_1.jpg

I would appreciate any help.

04 Sep 2010 Sam

Reposting this on the public thread, in case others have the same issue:

Hey Ben,

The psocreationuniform function in the /private folder will generate an initial population using a uniform random distribution based on options.PopInitRange. If you've got linear or nonlinear (in)equality constraints, this initial population will then get passed on to psocheckinitialpopulation (again in the private folder), and it will ensure that all of the initial points are feasible, moving the ones which aren't. Anyway, if my following answer is unclear, let me know what kinds of constraints are in your problem and I'll see if I can get back to you. I'll be moving over the next couple of days, so it might be a while before I see your next reply:

One problem I did encounter was with the fact that the PopInitRange option is set from 0 to 1 in all dimensions, by default (i.e. repmat([0;1],1,nvars)). This is obviously not representative of all design spaces. This might be a problem if you haven't set any boundary constraints LB and UB. This behavior was in the original genetic algorithm code from MATLAB upon which I based my code, so I left it as is, in case other people have already written GA code assuming this behavior and want to try it with PSO.

So basically, if you haven't tried this already, you can manually set the PopInitRange option to a reasonable range that fully encompasses your design space. The CognitiveAttraction and SocialAttraction options might also be adjusted, but first see the note about them provided in the psooptimset help. More drastically, you could try editing the psocreationuniform function.

If none of those work, I'd be inclined say that you've run into an inherent limitation of the PSO, i.e. that if you initialize it in a small enough domain of the design space, and the global optimum lies well outside of it, then the swarm is not guaranteed to find its way out of it.

Sam

27 Aug 2010 Ben

Hi Sam,

I am using your code to solve a sub-problem in my algorithm. However, when doing intensive testing, I've found some stability or repeatability problems with your code. In many cases (more than 10%) out of a large number of tests, I got different results using the same configuration (same dataset, same objective function, same pso options).

From my observation, this is caused by the initial particles. The swarm was trapped in some unreasonable local minima.

Could you tell me how you generate the initial particles?

Thanks

23 Aug 2010 Sam

Ben, thanks for the suggestion. I'll have a look at it next week when things get a little less busy for me. kaz uki, I'm not familiar with discrete PSO myself so I can't be of much help to you. Try searching for papers about it on Google Scholar or Compendex. Hope your final year project goes well.

21 Aug 2010 kaz uki

I'm a newbie to PSO... Can somebody help me minimize an assembly sequence time (product) using discrete PSO (DPSO) and implement it in MATLAB? I need this for my degree final-year project. I don't know where to start, so could someone please guide me step by step... email me, please.

21 Aug 2010 kaz uki

You can email me at mtaufiq23@gmail.com... I'll try my best.

20 Aug 2010 Ben

Hi Sam,

A minor suggestion for the plotting part. It would be more convenient, at least for me. Hope it is helpful.

if ~isempty(options.PlotFcns)
    %%%% close(findobj('Tag', 'Swarm Plots', 'Type', 'figure'));
    hFig = findobj('Tag', 'PSO_Plots', 'Type', 'figure');
    if isempty(hFig)
        state.hfigure = figure(...
            'NumberTitle', 'off', ...
            'Name', 'Particle Swarm Optimization', ...
            'NextPlot', 'replacechildren', ...
            'Tag', 'PSO_Plots' );
    else
        state.hfigure = hFig;
        set(0, 'CurrentFigure', state.hfigure);
    end;
end % if ~isempty

19 Aug 2010 Sam

Good eye on psoiterate, Samuel. However the inertia weight does scale by default, from 0.9 at the beginning of the optimization to 0.4 as it reaches the maximum number of iterations. I've updated the toolbox so that the velocity update function can be changed to your own custom function by setting the 'AccelerationFcn' option to the appropriate function pointer. I haven't documented the syntax for this yet, so it might be best to use the default psoiterate function as a template for developing your own velocity update function. It's currently located in the /private directory, but it will be moved to the base directory in future releases.

Also you can now set a time limit (in seconds) for the solver using the 'TimeLimit' option. Default is infinity.

13 Aug 2010 Ben  
08 Aug 2010 Samuel

Implementation details: this PSO version uses a static inertia weight. You can easily change the velocity update function in the file "psoiterate.m" if you want to implement some dynamic variation, or even use a different PSO technique such as a constriction factor instead of inertia weighting.

06 Aug 2010 George Evers

Mike,

To maximize a function, simply minimize its additive inverse. In other words, maximizing f(x) is mathematically equivalent to minimizing -f(x).

One easy way to do this would be simply to add
"f = -f;" as the last line of your test function.

05 Aug 2010 Mike

Hi Sam,
Your toolbox is perfect.
But now I want to find the maximum of a function using PSO.
Would you mind helping me?

28 Jul 2010 Samuel

Hello Mr. Sam,

Much thanks for this excellent software. I just have a question about the specifics of your implementation. Are you using an inertia weight in the update velocity, and if so does that weight decrease? It is recommended by some of the original PSO guys in "Defining a Standard for Particle Swarm Optimization" - Daniel Bratton and James Kennedy. It can cause a good space search in beginning and fine tuning toward the end.

Also, how do you deal with particles going over the bound? Are you preventing them from going out entirely, or letting them go out without evaluating the cost function (which will make the particle eventually pull back into the allowable search space)? The reason I ask is that preventing the particles from going out entirely can cause some bias toward the center of the search space.

Thanks again! I have had great success using your implementation.

22 Jul 2010 Sam

Hey everyone, I've been extremely busy with my thesis so I won't be able to provide any technical support for this file in the foreseeable future. I'm glad that so many people have found it useful. If you have any questions, please refer to previous comments, the file description, as well as getting familiar with how to use the Genetic Algorithm included with MATLAB's Global Optimization Toolbox (http://www.mathworks.com/access/helpdesk/help/toolbox/gads/bqe0w5v.html#bqe0w6h-2).

12 Jul 2010 loo cheng

hi, Sam
First of all, thanks for your hard work and selfless dedication.
I have encountered some problems when using psodemo; some errors appear, and I tried to figure them out but failed. The error indication is below:
"??? Error using ==> strcmp
Inputs must be the same size or either one can be a scalar.

Error in ==> isfield at 12
tf = any(strcmp(fieldnames(s),f));

Error in ==> psodemo at 38
if any(isfield(options,{'options','Aineq','Aeq','LB'}))"

10 Jul 2010 chong

hi, Sam
I am new to this. I can't run the PSO toolbox successfully. Could you give me more details on using the syntax?
Thanks a lot!

02 Jun 2010 Amaraporn

Sam, many thanks for your kind reply. When I used the defaults with a test function of 12 inputs, the 12 outputs never reached the known global minimum. I then tuned all the possible parameters in your toolbox and found that the best parameter set comprised
CognitiveAttraction = 1.5
SocialAttraction = 1.5
Generations = 300
PopulationSize = 50

which made 11 out of the 12 outputs meet the theoretical minimum. So far I have not found any PSO parameter set that makes the output converge to the same point on every run, which is the general expectation for a global minimum. Making the bounds stricter may help, so I am trying that at the moment.

25 May 2010 Sam

Amaraporn, the swarm is only stable when the sum of the CognitiveAttraction and SocialAttraction parameters is less than 4; if they are 2 and 2 as you've got them, then 2+2 = 4 and the swarm will not converge. Try reducing one or both of them such that their sum is less than 4. I'll add a note (and paper reference) regarding this to the documentation.

Also, try using fewer generations; that will make the inertia reduction parameter scale better. 10,000 is a very large number of generations. Try setting "Generations" to a few hundred, at most. Same with "StallGenLimit" -- what happens when you use the default values?
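For example, here is a sketch of an option set whose attraction coefficients sum to well under 4 (the two attraction values shown are the toolbox defaults, and the generation count is just illustrative):

options = psooptimset(...
    'CognitiveAttraction', 0.5, ...    % 0.5 + 1.25 < 4, so the swarm can converge
    'SocialAttraction',    1.25, ...
    'Generations',         300) ;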

23 May 2010 Amaraporn

Dear Sam

Just one quick question about the PSO algorithm parameters used in your toolbox. I have tried to tune
problem.options.Generations, e.g. 10000
problem.options.CognitiveAttraction, e.g. 2
problem.options.SocialAttraction, e.g. 2
problem.options.StallGenLimit, e.g. 8000
problem.options.PopulationSize, e.g. 40
and the initial inertia, e.g. 1

but the global minimum could not be found for my objective function of 12 parameters. I am looking for other important parameters and wondering whether the following appear somewhere in the algorithm: "inertia reduction parameter", "bound and velocity fraction", "velocity reduction parameter".

07 May 2010 Sam

Glad it helped. 5000 frames sounds like a lot of data, so it might take a long time depending on your computer and the complexity of the calculations. See this document for tips on improving the performance of MATLAB code: http://www.mathworks.com/access/helpdesk/help/techdoc/matlab_env/f9-17018.html

06 May 2010 Amaraporn

Sam

Many thanks for all of your feedback :) Now I can use your toolbox with my actual function, though it took almost a whole day for a clean termination. I'm not a computing person, so I'm a bit puzzled about the time it takes. Does that sound reasonable for an objective function that fits a 12-parameter dynamic model to the data of a time history (5000 frames) of experimental dynamic motion? Anyway, this is a good toolbox for everyone, including students from outside the field, I confirm :)

04 May 2010 Sam

Amaraporn, don't use PSODEMO to run your actual optimization. I included that function to provide an easy way to visualize how the swarm behaves, but it wasn't intended to be used to run actual optimizations.

Instead, you should call PSO directly using the syntax explained when you type >> help pso. It was designed to be very similar to the Genetic Algorithm (now called "Global Optimization") Toolbox, so it would help if you become familiar with its documentation (the link is provided in one of my previous posts). If you read the help provided with PSO, you'll see that there is no "default" dimensionality. The number of dimensions of the problem (nvars) must be provided by the user, i.e. >> pso(@fitnessfcn, nvars, ...).
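A rough sketch for a 12-variable problem called directly (the objective name and the -500/500 bounds are placeholders for your own function and limits):

nvars = 12 ;
LB = -500*ones(1, nvars) ;   % 1 x nvars row vector of lower bounds
UB =  500*ones(1, nvars) ;   % 1 x nvars row vector of upper bounds
[x, fval] = pso(@myObjective, nvars, [], [], [], [], LB, UB) ;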

04 May 2010 Amaraporn

Dear Sam

You are right about the persistent variable; I have corrected it and tried again with the default options. I found the toolbox terminates at some local minima, as follows.
[420.97 420.97 420.97 -302.53 420.97 420.97 420.97 -500 420.97 420.97 420.97 -302.52] (the theoretical global minimum is 420.97).

So I presume the toolbox can work well with 12 inputs, given that suitable PSO variables are defined.
Now I have moved to using it with my real objective function, which has a single objective subject to lb and ub for 12 inputs (this objective function already worked with fmincon alone). The error when used with your pso toolbox is as follows.

%%%%%%%%%%%%%%%%%%%%%%%%%
Swarming...??? Attempted to access Swarm(3); index out of bounds because numel(Swarm)=2.

Error in ==> vMarkSqr_spineCT_pso_mod at 37
vThoracicTransl=[Swarm(1);Swarm(2);Swarm(3)];

Error in ==> overlaysurface at 13
ZZ(i,j) = fitnessfcn([XX(i,j) YY(i,j)]) ;

Error in ==> psoplotswarmsurf at 30
overlaysurface(state.fitnessfcn,options) ;

Error in ==> pso at 334
state = options.PlotFcns{i}(options,state,flag) ;

Error in ==> psodemo at 61
pso(problem);
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
"Swarm" is my optimized parameterers.
The words "Error in ==> vMarkSqr_spineCT_pso_mod at 37
vThoracicTransl=[Swarm(1);Swarm(2);Swarm(3)];" is all about my objective codes and line 37 is very first line of my main calculation.
I think this is a problem between the default swarm dimension of the toolbox and the input dimension of my obj function. My function starts with simple head line as follows. Anything should be changed? It is not convenient to sent the whole function to you but I can explain how it works as follows.

%%%%%%%%%%%%%%%%%%
function LeastSqr = vMarkSqr_spineCT_pso_mod(Swarm)
%%%%%%%%%%%%%%%%%%

This objective function is a calculation based on multiple transformation matrices, similar to the one from the biomechanical application which you may know as "ankle joint parameters solving using parallel global optimization with particle swarm". In the main objective function, three other functions (created by myself) are called, and all deal with transformation matrices, with each matrix calculated from 3-5 swarm inputs.

Could you please help comment about this?

03 May 2010 Sam

Amaraporn, can I ask why you've got sumX as a persistent variable in the test function? That means that its value is retained between calls to the function, so that the fitness value of every particle will be dependent on the fitness of all previously evaluated particles. This is probably not what you want PSO to do. What happens if you instead change that line to sumX = 0?

How does the swarm perform with the default options?

02 May 2010 Amaraporn

Hi Sam,

Many thanks for your feedback :) Now I have gone back to starting from a test function whose answer is known. I found your toolbox easy to use with 2-11 inputs, and a suitable population size and number of generations are needed to help find the global minimum. Now I have a problem with 12 inputs, populationsize=50 and generations from 2000 to 8000. Could you comment on how to define suitable PSO variables to achieve the global minimum?
-----------------------------------
Error:
The value of the fitness function did not improve in the last 50 generations and maximum constraint violation is less than 1e-006, after 61 generations.
-------------------------------------------------------

The code for the test function is below, subject to lb = -500 and ub = 500 (as you know, with the same dimension as the number of inputs).

%%%%%%%%%%%%%%%%%%%%%%%
function f = easyTest(x)
[xSize, Dim] = size(x);
persistent sumX
for j = 1:Dim
    fx = (-x(j)) * sin(sqrt(abs(x(j))));
    sumX = sumX + fx;
end
f = sumX;
%%%%%%%%%%%%%%%%%%%%%%%%

02 May 2010 Sam

Hi Amaraporn, I'll investigate what's going on at line 258. Can I ask what is the exact code you use to set the options structure and then call pso?

I know it's a bit complicated to set all the options for this PSO code -- I wrote it to be very similar to the Genetic Algorithm that's already provided with a MATLAB toolbox, so that people who already know how to use that toolbox can easily transfer their code to this PSO algorithm. If anything's unclear, the online documentation for the Genetic Algorithm Toolbox may help: http://www.mathworks.com/access/helpdesk/help/toolbox/gads/f6010dfi3.html
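As a rough sketch of what I'd expect the setup to look like for your 12-input test problem (the field names here are assumed to follow the GA convention, e.g. 'PopulationSize' and 'Generations', and the argument order is assumed to mirror GA's -- check >> help pso for what this toolbox actually accepts):

%%%%%%%%%%%%%%%%%%%%%%%
options.PopulationSize = 50;     % assumed GA-style field name
options.Generations    = 2000;   % assumed GA-style field name
nvars = 12;
lb = -500*ones(1, nvars);
ub =  500*ones(1, nvars);
[x, fval] = pso(@easyTest, nvars, [], [], [], [], lb, ub, [], options);
%%%%%%%%%%%%%%%%%%%%%%%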

01 May 2010 Amaraporn

Hi Sam,
I'm trying to use your toolbox to solve a single objective function subject to lb and ub with 12 inputs, but I find it pretty complicated to define suitable options to get the program to run. I mostly get the error below and have been trying, now desperately, to sort it out.

Swarming...??? Subscript indices must either be real positive integers or logicals.

Error in ==> pso at 258
state.Score(setdiff(1:n,find(state.OutOfBounds))) = ...

Your comment would be very helpful

22 Apr 2010 Sam

Uduakobong, if you've never used PSO before then it's best to take some time to read through the three references listed under "bibliography" in this file's description. It's an interesting problem to mix continuous and discrete variables, but this toolbox isn't capable of that. As there has been some interest, I will implement the ability to solve problems with binary variables soon.

21 Apr 2010 Uduakobong

Hello Sam, I am trying to write a PSO program to solve a multiobjective, nonlinearly constrained problem. The problem has 3 variables, with 2 of the variables being discrete and one continuous. The thing is, I have never used PSO before, so I find it difficult to understand what to do. Please could you give me some pointers? Thank you.

20 Apr 2010 Sam

Sure, you may also be able to use the test functions provided with the two other files I listed under the "Acknowledgments" section.

19 Apr 2010 Albert Lee

Hi Sam

I need to follow up on the question you answered for me. I am doing research on hybridizing PSO with other algorithms, so I need to write the programs myself and test them on the test functions. One of my programs is particle swarm ant colony optimization; another is evolutionary particle swarm optimization. So I hope that some guidelines can be provided to help solve the problem. Thank you.

19 Apr 2010 Sam

Albert, Ant Colony algorithms are quite different from Particle Swarms, although there have been papers published proposing a hybrid of the two algorithms. My particle swarm code by itself does not do ant colony optimization, so I'm not sure what your question is -- are you trying to write an ant colony algorithm, or are you trying to learn how to use somebody else's ant colony toolbox?

The method for defining a fitness function for this PSO toolbox is the same as for other MATLAB optimizers such as GA, FMINCON, or FMINUNC. Any fitness function you write which works with those optimizers should also work for PSO. See this document http://www.mathworks.com/access/helpdesk/help/toolbox/optim/ug/brhkghv-3.html for instructions -- the only thing is, you don't need to provide the gradient/Jacobian or the Hessian to PSO. Note that PSO can only minimize fitness values; if you have a problem where you're trying to maximize f(x), just set g(x) = -f(x) and minimize g(x) with PSO.
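A minimal sketch of the maximization trick, using a made-up two-variable f(x) purely for illustration:

%%%%%%%%%%%%%%%%%%%%%%%
f = @(x) x(1)*exp(-x(1)^2 - x(2)^2);   % hypothetical function to MAXIMIZE
g = @(x) -f(x);                         % negate it so PSO can minimize
[x, gmin] = pso(g, 2);                  % 2 variables, default options
fmax = -gmin;                           % maximum value of the original f
%%%%%%%%%%%%%%%%%%%%%%%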

19 Apr 2010 Albert Lee

Can I ask how to write the MATLAB code for Particle Swarm Ant Colony Optimization, and whether the test functions are the same as in the PSO toolbox? Thank you.

17 Apr 2010 Sam

I've considered adding parallel processing features, but I don't have the toolbox myself so I'd have no way of testing it.

An alternative could be to use a vectorized fitness function, setting options.Vectorized to 'on', and do all the parallel computing tasks in the fitness function, independently of the PSO code.
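Something along these lines is what I have in mind (a sketch only, assuming the vectorized convention follows the GA's -- one particle per row in, one fitness value per particle out; the per-particle objective here is a made-up placeholder, and parfor needs the Parallel Computing Toolbox):

%%%%%%%%%%%%%%%%%%%%%%%
function f = vectorizedFitness(X)
% With options.Vectorized set to 'on', the whole swarm arrives as a matrix.
n = size(X, 1);               % number of particles in the swarm
f = zeros(n, 1);
parfor i = 1:n                % parallel loop lives inside the fitness function
    f(i) = sum(X(i,:).^2);    % placeholder per-particle objective
end
%%%%%%%%%%%%%%%%%%%%%%%

The parallelism then stays entirely inside the fitness function, independent of the PSO code itself.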

If you're interested in collaborating to add Parallel Computing capability, I've got an SVN repository with this project going on Google Code Project Hosting, just search for "psomatlab".

I could easily set up PSOOPTIMSET to create the appropriate option, and somebody with the parallel computing toolbox could set up PSO itself to handle it. Let me know if you're interested.

17 Apr 2010 Michael Johnston

Thanks, Sam. Have you considered adding options for parallel processing (for those who have the Parallel Computing Toolbox)?

14 Apr 2010 Sam

Mike -- I've uploaded a new version which should appear tomorrow. You can set options.Display to 'off' to kill all command line output (except for a few warnings and error messages that you'd probably want to know about anyway).
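For example (assuming a partially filled options structure is merged with the defaults, as the GA does, and that the argument order mirrors GA's):

%%%%%%%%%%%%%%%%%%%%%%%
options.Display = 'off';   % suppress the iterative command-line output
% myFitnessFcn, nvars, lb and ub stand in for your own problem here
[x, fval] = pso(@myFitnessFcn, nvars, [], [], [], [], lb, ub, [], options);
%%%%%%%%%%%%%%%%%%%%%%%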

14 Apr 2010 Sam

Mike -- that's a good idea, I can implement the "quiet" mode fairly quickly.

20 Mar 2010 Michael Johnston

Sam -- Thanks very much for providing your code. You've clearly put a lot of work into it. The performance and stability on my end has been flawless thus far.

The only feature request I can think of is adding a field to the options structure to control the verbosity of the output to the command window. Sometimes it's nice to be able to kill this completely.

Best,

Mike

19 Mar 2010 Sam

Hi Karim,

Sorry about that, I know I mentioned the possibility that I might implement binary variables for this toolbox. Over the past few weeks my thesis work has taken me in another direction, so I don't think I'll have the time to do it. The first book listed in the bibliography section of my description: "Swarm Intelligence" by J Kennedy, RC Eberhart and YH Shi, describes in detail how to implement PSO with binary variables, if you're interested.

Sam

14 Mar 2010 karim

Dear Mr sam

I am Eng. Karim, and I want to thank you for this great toolbox. I would also like to ask about any news on binary support for this toolbox, since I am working on my Master's degree and want to use binary PSO, and I haven't found any MATLAB code that supports this. Finally, thank you again.

11 Jan 2010 Sam

Dear Mr Saeed,

Does your fitness function work with other MATLAB optimization solvers? Please see this document (http://www.mathworks.com/access/helpdesk/help/toolbox/gads/brdvur4.html) for how to write a fitness function for the Genetic Algorithm (which should also work for PSO). Note that this PSO code doesn't support binary (where the only possible values are 0 or 1) or discrete variables yet.

Sam

11 Jan 2010 Mohammed Ahmed Saeed

Dear sir, how do you do? I am preparing my Master's on optimal relay coordination, and if you please, I need your help to provide me with a swarm code to determine the optimal relay settings (I cannot write the fitness function for each particle).
Thanks

14 Dec 2009 Sam

Hi Tom,

This webpage (http://www.particleswarm.info/Programs.html) gives a large list of available toolboxes, although I don't know if any of them can handle non-linear constraints, and most of them are not written in MATLAB. Since you're interested, I will work on implementing non-linear constraints for this toolbox in a future release, maybe in about two weeks' time?

Sam

14 Dec 2009 tom

Hi, Sam,
I am interested in PSO with both linear and nonlinear constraints for high-dimensional problems, but I found this package cannot handle them.
Is there any other PSO toolbox that handles problems with nonlinear and linear constraints? Please let me know: dr.xinlivu@gmail.com. I am waiting for your reply.

Thanks,
Tommy

05 Dec 2009 Sam

Glad you found it useful. FYI I just found a bug where the swarm doesn't actually comply with the imposed linear constraints. I'm working to fix it as soon as possible.

05 Dec 2009 Hanlin Zhang

This PSO toolbox is very useful for solving constrained optimization problems. Thanks.

Updates
03 Dec 2009

Small bug fix, more detailed description.

03 Dec 2009

Minor bug fixes, more detailed description. Forgot to update the zip file last time.

05 Dec 2009

Major bug fix. New features, including ability to call a hybrid function to further refine the final result of the swarm algorithm. See release notes for complete details.

06 Dec 2009

Bug fix, minor visual improvements. See release notes for details.

10 Dec 2009

Various bug fixes. Implemented 'absorb' style of boundaries for linear constraints. Social and cognitive attraction parameters can now be adjusted through the options structure. See release notes for details.

15 Dec 2009

Nonlinear inequality constraints can now be used, with 'soft' boundaries only; psodemo now has a 'fast' setting requiring less intensive 3d graphics. See release notes for more details.

30 Dec 2009

New features: nonlinear equality constraints; ability to define initial swarm state.

07 Jan 2010

Updated description. Minor performance and robustness improvements.

25 Jan 2010

Robustness improvements, minor bug fixes.

14 Apr 2010

Output to command window can now be suppressed using the options structure. See release notes for details.

22 May 2010

Added support for problems with binary variables. Minor bug fixes. See release notes for details.

18 Aug 2010

A time limit can now be set, and a custom swarm acceleration function can be defined using the 'AccelerationFcn' option in PSOOPTIMSET (default is PSOITERATE). See release notes for more details.

17 Sep 2010

Minor bug fix. Thanks to Ben for pointing this out.

03 May 2013

Implemented an alternative, penalty-based method of constraint enforcement as described in Perez and Behdinan's 2007 paper. See description for details.

03 May 2013

The previous upload was a zip bomb. Rearranged the contents of the *.zip file to behave nicely when unpacked.

14 May 2013

Updated description.

16 May 2013

Merged previously lost updates from version 20100818. Fixed bugs related to nonlinear constraint handling. See the release notes file included with toolbox for details.

16 May 2013

Fixed namespace problem with one of the *.m files.

17 Jun 2013

Implemented parallel computing capability. A few minor improvements. See the included release notes file for details.

08 Jul 2013

Fixed bug involving verbosity checks before displaying warnings.

20 Dec 2013

Minor bug fix and efficiency improvements, related to nonlinear constraint checking code as well as parallel computing capabilities.

01 Apr 2014

Fixed a typo which caused improper handling of bounded constraints in determining initial particle distribution. Thanks to Erik for pointing out this major bug!
