Randomly multiply-started optimization for global problems
Some optimization problems have very simple surfaces to optimize. The optimizer simply proceeds downhill to the unique minimizer and returns happily - all is good in the world. Sadly, more often the objective function has multiple local minimizers, you as the user provide poor starting values, and the optimization returns what is essentially junk for a solution. My response would typically be that you needed to provide better starting values. At that time, I'd also try to explain the idea of a basin of attraction for any minimum. It's the set of points that, when used as starting values, will allow a given optimizer to converge to a given local minimum.
Starting values that lie in the basin of attraction of the global minimizer are not always easy to choose. One solution is to use a randomly multiply-started optimizer: generate many starting values, then start the optimizer from each such point, or start only from the very best of the sampled points.
RMSEARCH puts a simple framework around this task, automatically generating random samples for you, testing which result in the best initial points, then starting your chosen optimizer at that set of points, finally compiling the results.
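As a rough sketch of the idea (this is not RMSEARCH's actual internals; the objective, bounds, and sample count below are purely illustrative):

```matlab
% Random multistart sketch: sample starting points uniformly in a box,
% run a local optimizer from each, and keep the best result found.
fun = @(x) (x(1)^2 + x(2) - 11)^2 + (x(1) + x(2)^2 - 7)^2; % has 4 local minima
LB = [-5 -5];  UB = [5 5];       % bounds for the random sample
nsamples = 50;
fbest = inf;  xbest = [];
for i = 1:nsamples
    x0 = LB + rand(1,2).*(UB - LB);    % one uniform random start in the box
    [x,f] = fminsearch(fun, x0);       % local optimization from that start
    if f < fbest
        fbest = f;  xbest = x;         % remember the best minimum seen
    end
end
```

RMSEARCH wraps this kind of loop for you, optionally pre-screening the random samples so the optimizer is launched only from the most promising points.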
RMSEARCH can be used with 7 different optimizers:
fminbnd, fmincon, fminsearch, fminsearchbnd, fminsearchcon, fzero, lsqnonlin.
fmincon requires the Optimization Toolbox, while fminsearchbnd and fminsearchcon can be found right here on the File Exchange:
http://www.mathworks.com/matlabcentral/fileexchange/loadFile.do?objectId=8277&objectType=file
http://www.mathworks.com/matlabcentral/fileexchange/loadFile.do?objectId=13469&objectType=FILE#
Bug fixed - Single variable optimizations (fzero or fminbnd) had an extra evaluation at x = 0 and x = 1 applied for some problems. If these points were outside the bounds, this would cause a problem.
Fixed a bug in sampling variables that have only lower or only upper bounds.
Fixed a bug in the use of linear inequality constraints. Also fixed a documentation problem referring to the options structure.
John D'Errico
This is basic MATLAB. You need to do the same thing for ANY optimizer, for any numerical integration method, etc.
What happens when you try passing in fun as an argument? If you call it like this:
[xfinal,ffinal,exitflag,xstart] = rmsearch(fun,'fmincon',...);
Look at it from the point of view of the MATLAB interpreter. MATLAB sees fun as a function that it tries to execute immediately. It would then try to pass the result of that function call to rmsearch.
But that is NOT what you want to do. You want to pass in the function itself into rmsearch. You do this most easily via function handles, the things with an @ symbol in front of a function name. They also let you create a function on the fly. (Read about function handles in the help.)
So in this case, what you had to do was this:
[xfinal,ffinal,exitflag,xstart] = rmsearch(@fun,'fmincon',...);
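For comparison, the two kinds of handle look like this (besselj and the order 1 here are only illustrative):

```matlab
fhandle = @fun;               % a handle to the function fun.m; nothing runs yet
fanon   = @(x) besselj(1,x);  % an anonymous function, created on the fly
y = fanon(2.5);               % a handle is only evaluated when you call it
```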
Pushkarini
Hi, I had a question about the submission. It does not work for me when I use an external function, for example
function d = fun(x)
d = besselj(x);
end
and use
[xfinal,ffinal,exitflag,xstart] = rmsearch(fun,'fmincon',...);
I get this error
"Not enough input arguments." in fun.
Can you please let me know if I'm using the right syntax?
Thanks!
Mouloud
Dear John,
I used your file, but when I supply an x0, rmsearch gives me an ffinal that is worse than the result I get from x0 alone, and an xstart that is very bad compared with my x0. Why?
John D'Errico
Please stop using the comments for this file for your consulting questions. You CAN send me an e-mail, and I will respond as I said, but these questions no longer have anything to do with this piece of code.
John D'Errico
I don't know what is not working well, since "not working well" covers a lot of ground. It could be that you have not called the code properly. It could be an inadequate choice of starting values. It could be that your data simply does not fit the model you have posed. (Just wanting a model to fit does not suffice.) It could be that your data is noisy, so it simply does not support fitting a double exponential model. It could be that the error structure in your data does not fit a normal (Gaussian) error structure as is assumed in any standard fitting tool.
Perhaps you might try sending me your data.
Mr. Sumon
Maybe you are right. I know my data is noisy and my model is not so perfect; if you check my data, you may see whether the (Gaussian) assumption holds.
My data:
x = 1, 2, 4, 5, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 62, 65, 67, 70
y =
38073, 35400, 37861, 36517, 42616, 43543, 37314, 38806, 39158, 41029, 39018, 40157, 37393, 38193, 38283, 36493, 36642, 37155, 36927, 36667, 35587, 35767, 35428, 34861, 34487, 35754, 35510, 34470, 36056, 34080,
31931, 33790, 32996, 33212, 33317, 32024, 32294, 32656, 32329, 29221, 32133, 32203, 31317, 31479, 31461, 29796, 31904, 29912, 26975, 26154, 30428, 35402, 28806, 31860, 29282, 28065, 29256, 26653, 34351, 33510,
32486, 29874,
and my function -
F = 65535*(c(1)*cos(x*pi/180) + c(2)*(exp(((x*pi/180).^2)*(-log10(2))/c(3)^2))./cos(x*pi/180) + c(4))
(the constant 65535 is a guessed value). Please check them and let me know your comments.
John D'Errico
Sums of exponential models are notoriously ill-posed. Even small amounts of noise in your data or lack of fit in the model will cause problems. Poor starting values make things worse of course. And since rmsearch uses random starting values in the hopes that some of them will yield a good result, you should expect that some or even many of those random starting points will fail to converge.
I would strongly suggest the use of a different tool than this. Use a tool like my fminspleas (also on the FEX), which has the ability to reduce the model by effectively eliminating two of the parameters in the search. That reduction of the search space will greatly improve the robustness of your estimation. Good starting values are still imperative of course, since the problem is still an ill-posed one, but here you need only to provide starting values for x(2) and x(4) in your model.
funlist = {@(coef,xdata) exp(-xdata*coef(1)), ...
@(coef,xdata) exp(-xdata*coef(2))};
[X24,X13] = fminspleas(funlist,x24start,x,y);
Mr. Sumon
Dear John,
Actually, I am trying to fit some data points with a model function to determine the parameter values (4 parameters). I am using the lsqcurvefit function, but I always get the "Local minimum possible" message, and if I change the starting guess, the result changes.
That is why I need an algorithm which helps me determine a good starting guess. Will your function help me?
My model is like that-
fun = @(x,xdata)x(1)*exp(-x(2)*xdata) + x(3)*exp(-x(4)*xdata);
John D'Errico
What about line 402? Please include your error. Show how the code was called.
As far as I can see, there is no problem with the code on line 402. It works. IF YOU are having a problem with line 402, you may have an error in how you are calling the code, but I cannot guess what your code did wrong. As with any error, please give the person you are asking enough information to diagnose your problem!
Helping me here will help me make the code better, because I can possibly find a way to improve the handling of an error you have made.
Mr. Sumon
I am having a problem with rmsearch.m. Would you please take a look at line 402?
John D'Errico
Any optimizer that rmsearch can use also can have that limit set, using optimset. Simply define an options structure, then pass it into rmsearch in the appropriate argument. rmsearch will forward these options into the chosen optimizer.
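For example (the property name used to pass the structure is an assumption here; see `help rmsearch` for the exact calling syntax):

```matlab
% Raise the evaluation and iteration limits, then hand the options
% structure to rmsearch, which forwards it to the chosen optimizer.
opts = optimset('MaxFunEvals', 10000, 'MaxIter', 5000);
[xfinal,ffinal,exitflag,xstart] = rmsearch(fun,'fminsearch',x0,LB,UB, ...
    'options',opts);
```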
Stephan Koehler
works great, but there is one problem. I get an error
"Maximum number of function evaluations has been exceeded..."
How do you set the option for function evaluations here?
thanks!
John D'Errico
The issue is that fsolve does not allow bound constraints, and rmsearch works best if it has a bounded region to work from, because it needs to generate random starting points for the solution.
Having said that, this code IS written to work with lsqnonlin. lsqnonlin will try to minimize the sum of squares. If fsolve would have converged to a solution, then lsqnonlin will also converge. If fsolve would have been trapped in a non-solution, then lsqnonlin will "converge", but to a point where the equations are not forced to zero.
So simply check the solutions for true zero results (to within a reasonable tolerance) and discard those that failed to properly converge to your specified tolerance.
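A post-processing filter along these lines would do it (the variable names and the shape of rmsearch's outputs are assumptions here, so adapt this to your own call):

```matlab
% Keep only those lsqnonlin "solutions" whose residuals are essentially
% zero, i.e., true roots of the system rather than trapped local minima.
tol = 1e-8;
resnorms = zeros(size(xfinal,1),1);
for i = 1:size(xfinal,1)
    resnorms(i) = norm(eqns(xfinal(i,:)));  % eqns: your residual function F(x)
end
roots_found = xfinal(resnorms < tol, :);    % discard non-converged results
```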
Patrick Tai
Can this work with fsolve?
I need to have more than one input parameter.
Thanks.
Shen Wang
Woo...
I think I've already found the answer in your program.
Thank you for sharing it!
Shen Wang
This is a great program, thank you so much!
BTW, if I choose "InitialSample" as 100 and none of the parameters have boundaries (assume the number of parameters is 5), will your program generate 100 initial values from a 5-D normal distribution, or 100^5 initial values from 5 independent 1-D normal distributions (each 1-D normal distribution generating 100)?
Thanks for sharing it! :)
Excellent! Thanks, John, for sharing it.