File Exchange


GODLIKE - A robust single-& multi-objective optimizer

version 1.4 (465 KB) by

GODLIKE combines 4 global optimizers for both single/multi-objective optimizations




GODLIKE (Global Optimum Determination by Linking and Interchanging Kindred Evaluators) is a generalization of various population-based global optimization schemes. It handles both single- and multi-objective optimization, simply by adding additional objective functions.
GODLIKE solves optimization problems using relatively basic implementations of genetic algorithm, differential evolution, particle swarm optimization and adaptive simulated annealing algorithms. Its power comes from the fact that these different algorithms run simultaneously (linked), and members from each population are occasionally swapped (interchanged) to decrease the chances of convergence to a local minimizer.
It is primarily intended to increase ROBUSTNESS, not efficiency, as it usually requires more function evaluations than any of the algorithms separately. It's also intended to do away with the need to fine-tune these algorithms each and every time you encounter an optimization problem, AND to generalize optimization itself (it's both a single- and multi-objective optimizer), AND to generate simple plots to be used in quick reports etc.



% extended Rosenbrock function
rosen = @(X) sum( 100*(X(:, 2:2:end) - X(:, 1:2:end-1).^2).^2 + (1 - X(:, 1:2:end-1)).^2, 2);

% call GODLIKE
GODLIKE(rosen, -10*ones(1,10), 10*ones(1,10), 'ms')

will produce a reasonably accurate approximation to the global minimum of the 10-dimensional Rosenbrock problem
(sol ~ [1,1,1,...], fval ~ 0)

(multi-objective optimization)

% basic Sin-Cos Pareto front
GODLIKE({@sin;@cos}, 0, 2*pi, [], 'display', 'plot')

will generate a nice plot of the problem's Pareto front. Some more examples are included in the GODLIKE_DEMO.m, included in the submission.

(see the changelog for more detailed changes)
- Moved 'popsize' and 'which_ones' from the function signature to a more intuitive place -- the options
- Improved plotting routines ('display', 'plot')
- Started major overhaul of the code base to make it easier for me to maintain and extend upon

- Objective functions can now accept any 2-dimensional input. Your objective function should accept arguments equal in size to either [lb] or [ub], and return a simple scalar.
- I discovered I made some *severe* mistakes in the implementation of the global optimization algorithms. This caused large inefficiencies or inaccurate results. Most (hopefully all) of these mistakes are corrected now.
- Added two more options for the algorithms: NetWorkTopology & ReHeat (see doc)
- Changed the [MinDescent] criterion to the more MATLAB-style 'TolX' and 'TolFun' options

Comments and Ratings (55)


Hi Cengiz,
Thanks so much for your reply and contribution!

Cengiz Gunay

Hi Gary,

If it's a single-objective problem, then you can be quite confident that it's a global optimum if you're reaching it for different runs and parameter sets with a similar output. If the outputs are quite different, then you're still stuck at local optima. It's always good to repeat the optimization to find and report a set of ensemble solutions to increase confidence.
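This repeat-and-compare strategy can be sketched in MATLAB; this is a sketch only, with the two-output [sol, fval] call signature assumed from the Rosenbrock example in the description:

```matlab
% Sketch: repeat the optimization and compare best function values.
% The [sol, fval] output signature is an assumption based on the
% Rosenbrock example in the submission's description.
rosen = @(X) sum(100*(X(:, 2:2:end) - X(:, 1:2:end-1).^2).^2 + ...
                 (1 - X(:, 1:2:end-1)).^2, 2);
nRuns = 5;
fvals = zeros(nRuns, 1);
sols  = zeros(nRuns, 10);
for k = 1:nRuns
    [sol, fval] = GODLIKE(rosen, -10*ones(1,10), 10*ones(1,10));
    sols(k, :) = sol;
    fvals(k)   = fval;
end
% A small spread in fvals (and similar rows of sols) increases
% confidence that the global optimum has been found
spread = max(fvals) - min(fvals);
```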

Michal: there shouldn't be a problem in parallelizing a single-objective problem. What's the error? Always provide details when asking for help.

Michal Pec

I am wondering how I can perform parallel global optimization of a single objective. Are there any specific requirements on the target function when running parallel optimization? I can run serial optimization without problems, but when I run in parallel I get an error. Thank you.



Hi Cengiz,

Thanks for your help! Actually I was solving the single-objective optimization problem with several unknown parameters, and different parameters (or solutions) were obtained when I ran the same function. I do not know whether these solutions can be regarded as Pareto front, as these are in the single-objective optimization problem.


Cengiz Gunay

Hi Gary,
No, you can't treat all the solutions as global optima. There can be only one global optimum. Heuristic methods like these will find you local optima that may be close to a global optimum. If you're finding disparate solutions in the parameter space, but an optimum solution in the output, then you could say you're on the optimal Pareto front.


Hi Oldenhuis!

Thanks for your contribution of this excellent toolbox for solving nonlinear optimization problems. I have a small question. When I use the toolbox, sometimes I cannot reproduce the solution with the same function inputs; that is, I obtain different solutions (sometimes the difference is large) when I run the same function with the same inputs multiple times. I understand that solutions can differ when using meta-heuristic algorithms, but can I regard all of these solutions as global optima? Is it normal for the toolbox to produce different solutions with the same inputs?

Thanks a lot! Best regards,


MR.BOX


Peifeng Yu

Rody Oldenhuis

@Seth and @JohnF: thanks for the feedback, I'll look into it. @Seth: nansum() is part of the statistics toolbox, so I'll rework your code to a version without this dependency. Thanks though!


@John F.

For some reason my first comment did not show up. Anyway, I had the same problem with the "tournament_selection" function getting caught in an infinite loop. I believe it was because one of my objective functions was being met by all of the population. This was causing the crowding distance to be calculated as NaN. I changed lines 77-79 in the "non_dominated_sort" function to the code below.

crowding_dists(guy) = nansum([crowding_dists(guy), ...
    (sorted_fitnesses(guys_ind+1, m) - sorted_fitnesses(guys_ind-1, m))/...
    (max(sorted_fitnesses(:, m)) - min(sorted_fitnesses(:, m)))]);

@John F.

Sorry, the code I changed below is in the "non_dominated_sort" function.

John F.

Is there a max limit to what the upper bound can be? If the upper bound is set too high, it seems that the "tournament_selection" function gets stuck in an infinite loop. Has anyone else had this issue? And if so, how can it be fixed? Thank you in advance.

Hello, Thanks Rody for writing this wonderful code. I want to use non-linear constraints for multi-objective problems. Also, I want to use metamodels as constraints. Comments and suggestions are highly appreciated.

Ander Biguri

John Cee

Forget that... my problem.

Devraj Dutt

Hello, this function works great, thanks for writing it. My question has to do more generally with memory overflows when optimizing things like variables in simulations. I have a short description here and will try to improve it if clarifications are needed. I would be grateful for any suggestions.

Rody Oldenhuis

@Nikolay: sorry for that, will update

Nikolay Tal

Hello Rody,
Please update the manual to note that 'popsize' is now an option. I struggled for some time to discover that this was the reason the optimizer didn't run :)

Thanks a lot

Rody Oldenhuis

@Cengiz Hi Cengiz,

I've seen that you forked GODLIKE, but I haven't yet looked at the changes you made.

I believe you made it possible to have the optimizers run in parallel, right?

I'll take a look today, let you know. Thanks for the efforts so far!

- Rody

Cengiz Gunay

Hi Rody,

Welcome back. Have you seen my fork of your project? I can send you a pull request if you agree with my changes.


Rody Oldenhuis

OK people!

After a (far too long) absence, I'm picking up where I left off here on the file exchange.

I'm currently on a refactoring spree, which should eventually lead to a more manageable code base, and not just for my GODLIKE submission.

Notable changes in the 2016/October/24 submission:

- I've addressed the bug reported by John Cee and Harry, which should be fixed now

- I've improved the plotting routines a lot, and updated the demo accordingly

- Moved the "popsize" input argument, which is now an option and automatically chosen by default

Dave Douglas

@Rody: Quit SPAMMING, mate :P

John Cee

Hi, I get this error fairly often when doing multiobjective minimisation on my target function. Can you explain what it means? Often I run with the same function/data and it works fine.

Undefined function or variable 'best'.

Error in pop_multi/tournament_selection (line 284)
pool(i) = best;

Error in pop_multi/iterate (line 94)

Error in GODLIKE (line 199)

Error in BTEM4 (line 129)

Cengiz Gunay

Harry: if you can replicate the error in my fork, I can help debug it:


Harry

Undefined function or variable "best" occurs for some test functions in the multi-objective optimiser; I'm unable to figure out why, but it seems to work well for many others.

Getting errors for mean error but not for max error. Unsure of why, outputs are in the same format

Thanks for the submission. A robust implementation of the Multi-Objective Optimization Algorithm.

I have to perform a multi-objective optimization. I have a function, my_obj, with 2 decision variables and 3 objective functions. I call this function from another m-file:

[s0, FVAL] = GODLIKE(@my_obj, PS, lb, ub, 'ASA', options);

but I always get this error:

??? Error using ==> cellfun
Non-scalar in Uniform output, at index 1, output 1.
Set 'UniformOutput' to false.

How should I pass the 3 objective functions to GODLIKE? At the moment, in the function my_obj I have written:

of1 = 1 - E;
of2 = C;
of3 = D;

where E, C, D depend on the 2 decision variables.

Who can help me? thanks
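One possible fix, judging from the cell-array syntax in the submission's own multi-objective example ({@sin;@cos}), is to pass the objectives as separate function handles rather than returning them together from one function. A hedged sketch; E_of, C_of and D_of are hypothetical names for the user's expressions:

```matlab
% Sketch only: pass each objective as its own handle in a cell array,
% mirroring the {@sin;@cos} example from the description.
% E_of, C_of, D_of are hypothetical functions of the 2 decision variables.
of1 = @(x) 1 - E_of(x);
of2 = @(x) C_of(x);
of3 = @(x) D_of(x);
[s0, FVAL] = GODLIKE({of1; of2; of3}, lb, ub, 'ASA', options);
```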

Cengiz Gunay

I forked the Github project and added parallel execution with a new "UseParallel" option:

Hi, really great work. My only problem is getting the options to work. For example if I put 'MaxFunEvals',10 or 'Display','On' it seems to just ignore the option. Anyone know why this might be? Other than that it is running well.


Joshua

Hello again,
Just a comment: in addition to the modification I made below, I also require that a significant portion of the individual parameters be zero (which I call MAXNUMPAR), i.e. pop{1}.individuals = [0 0.2 0 0 0.5 0 0.3];. I initially included this change in the line before my previous modification.
This seems to cause problems with the genetic algorithm, as it returns NaNs in some of the individuals.
I correct this by moving my MAXNUMPAR step to before pop{i}.iterate;.
I'm not sure why the genetic algorithm does this (possibly combinations of the individuals result in an individual of all zeros), and it isn't necessarily in need of a fix. Just an FYI in case someone else ever tries to do the same thing.
Again, this download is awesome.


Joshua

Thank you, Mario.
Thank you, Rody.
I managed to get the modification I wanted into GODLIKE, which makes me very glad, because this program seems very awesome.
To do it, I went to line 198 of GODLIKE and inserted the following code:

X = size(pop{i}.individuals, 1);
for j = 1:X
    pop{i}.individuals(j,:) = NORMTOTOTAL(pop{i}.individuals(j,:));
end
clear X

where NORMTOTOTAL is a simple function I wrote to normalize each number in a row vector to the sum of the entire vector.
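Based on that description ("normalize each number in a row vector to the sum of the entire vector"), NORMTOTOTAL could plausibly look something like this sketch; the zero-sum guard is an addition of mine:

```matlab
function v = NORMTOTOTAL(v)
% Sketch of the normalization helper described above: scale a row
% vector so that its elements sum to 1.
s = sum(v);
if s ~= 0   % guard against an all-zero vector (an assumption)
    v = v / s;
end
end
```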
Thank you!

Rody Oldenhuis


No, there is no easy way in GODLIKE to implement linear constraints like the one you have; GODLIKE is restricted to problems with bound constraints only. It is fairly easy to devise a workaround (as suggested by Mario Castro Gama); however, if your problem is not too large, it might be easier to try my minimize() function (just search my author page here on the FEX). That function also supports (non)linear constraints, and has global optimization capability (albeit rudimentary compared to GODLIKE).

Hope this helps, Rody Oldenhuis

Hello Joshua

Maybe you can try to implement it inside the GODLIKE algorithm, at the point where the new population is created.

Best regards.


Joshua

I would like to run this code. My model parameters need to have a combined total of 1. I have tried modifying the code to include a function of mine that does this, but I keep finding new errors.
Is there a simple way to make this code keep the combined total of my model parameters equal to one?


Fabien

nice job !


ahmad

Dear Rody,

Thank you for sharing your great GODLIKE optimisation code on MATLAB Central. I have two questions:
1. The current GODLIKE demo (as set by you) runs the multi-objective optimisation. How can I set it to run just a single-objective optimisation?
2. How can I set it to use just one heuristic technique?

Thanks in advance

Joe Ajay

Hello Rody, does this tool solve discrete optimization problems? Do you have any update for solving discrete problems?

Sebastien PARIS


I've found that this code is quite helpful in generating useful initial conditions for some constrained optimization problems that involve local minima. Are there any plans to handle nonlinear inequality constraints? Also, a useful addition to the code would be an option for a maximum run time. That way I can set it to run over the weekend easily.


Davide

When I try multi-objective minimization I run into some issues.
Take this multi-objective function as an example:

function a = myfunc(x)
a(1) = sin(x);
a(2) = cos(x);

If I call GODLIKE with myfunc directly it works fine, but wrapped like this:

optfunc = @(x) myfunc(x)

it does not.

What am I doing wrong in the second case? How should I use it?


Ben

How to pass parameters to the objective function?
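This is not specific to GODLIKE; the standard MATLAB idiom is to bind the extra parameters into an anonymous function. A sketch (myObjective, a and b are hypothetical):

```matlab
% Standard MATLAB idiom: capture extra parameters via an anonymous function.
% myObjective(x, a, b) is a hypothetical user-defined objective.
a = 2;
b = 5;
obj = @(x) myObjective(x, a, b);   % a and b are fixed at creation time
% then pass obj to GODLIKE as usual, e.g. GODLIKE(obj, lb, ub)
```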

Davoud Safari

Thanks Rody... But I need the code for integer programming. How can I use this code for that purpose, i.e. which part of the code must be changed to achieve my goal?


The function you developed is very nice, but can I handle equality and inequality constraints in GODLIKE? In the user manual, I couldn't find anything related to this.

Is it possible to add more constraints to the optimization in GODLIKE? For example: x(1) + x(2) - 3*x(3) = 10.
How could I realize that in MATLAB?
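Since GODLIKE handles only bound constraints (as Rody notes elsewhere in these comments), one common workaround is a penalty term added to the objective. A sketch, not a built-in GODLIKE feature; myObjective and the penalty weight are hypothetical:

```matlab
% Penalty-method sketch: penalize violations of x(1) + x(2) - 3*x(3) = 10
% directly in the objective. mu needs problem-specific tuning.
mu      = 1e6;                          % penalty weight (assumption)
baseObj = @(x) myObjective(x);          % hypothetical original objective
penObj  = @(x) baseObj(x) + mu*(x(1) + x(2) - 3*x(3) - 10).^2;
% GODLIKE(penObj, lb, ub)
```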


Robert

Trade extra CPU cycles for fewer brain cycles.


John

I would like to use this code to fit a non-linear equation to a set of data. The non-linear equation takes 8 parameters which I would like to optimize; basically, a least-squares approach. Will GODLIKE achieve this? I have tried using it and the parameter guesses are all over the place and do not seem to converge. Thank you!
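A least-squares fit does map onto a scalar objective, so in principle GODLIKE can be applied by minimizing the residual sum of squares. A sketch (model, xdata and ydata are hypothetical):

```matlab
% Sketch: nonlinear least-squares fit as a scalar objective.
% model(p, xdata) is a hypothetical function of the 8 parameters p;
% xdata/ydata are the measured data.
ssq = @(p) sum((ydata - model(p, xdata)).^2);   % residual sum of squares
% lb, ub: sensible bounds for the 8 parameters (tight bounds help a lot)
% [pBest, fval] = GODLIKE(ssq, lb, ub)
```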

A bugfix.
set_options.m, at lines 456-457:

elseif strcmpi(option, 'SkipTest')
if ~isnumeric(value)

should be:

elseif strcmpi(option, 'SkipTest')
if ~ischar(value)

Rody Oldenhuis

Daniel: I am most certainly planning to implement (non)-linear constraints in GODLIKE, that would be pretty convenient indeed! However, I'm VERY busy the next few weeks, so don't expect this change to come anytime soon :)

You could try my other tool OPTIMIZE (also on the file-exchange); it can also optimize problems globally. Perhaps that can give you some results...

Thanks for the 5 stars ! :)


Daniel

Nice tool! Any plans for adding non-linear constraint handling? I'm messing with a Lagrangian barrier function (the method the original C++ implementation of NSGAII uses) inside a custom objective function, but it would be nice to have that kind of functionality implemented directly in the optimizer.

Rody Oldenhuis

Roland: Indeed, I overlooked that issue. It works great, thanks!

About the mex-file: I see no real need for it. Most of the time the real computational cost is in the objective functions, not this optimizer (as it should be)... except maybe for the NSGA-II part, but really I only notice its O(N²) complexity if I use huge population sizes... that, and I'm really short on time these days :) But of course, you are more than welcome to do it!

So what changes did you make to allow MATLAB R2007a to run it? It would be perfect if more users could use GODLIKE! I think it's a great idea if you incorporate those changes, write a mex file and submit it as an improvement upon my version.

Thanks for the feedback.

Hi, you can make it orders of magnitude faster if you replace this in pop_single.m:

for i = 1:pop.dimensions
    % convert column to decimal representation
    temp_pop(:, i) = sum(convert_to_dec.*newpop(:, 1:NumBits), 2);
    % delete entries
    newpop(:, 1:NumBits) = [];
end

with this:

newpop_startcol = 1;
NumBits2 = NumBits - 1;
for i = 1:pop.dimensions
    % convert column to decimal representation
    temp_pop(:, i) = sum(convert_to_dec.*newpop(:, newpop_startcol : newpop_startcol+NumBits2), 2);
    % advance the start column instead of deleting entries
    newpop_startcol = newpop_startcol + NumBits;
end

Column/row deletion is a very slow operation; it's better to avoid it.

There were also some changes I had to make in order for the script to start in Matlab R2007a.

Are you considering porting it to a mex file? I am going to do that, but we may get results sooner if we split the task.


André





Lots of changes (too much to list here). See the changelog.txt file in the ZIP.

MATLAB Release
MATLAB 7.7 (R2008b)
