Description
The aim of this simple benchmark is to illustrate the benefit of restarting Nelder-Mead locally, from the last solution found, until no further improvement is obtained (to a given accuracy).
It also shows that fminsearch has great difficulty minimizing even the simplest smooth quadratic objective function used here, and that restarting it locally corrects this. On the other hand, Nick Higham's implementation of Nelder-Mead (nmsmax) works fine, and the accuracy it reaches is further improved by restarting it locally. Note that fminsearch may still perform better than nmsmax on other problems.
In theory, however, among direct search methods one should use not even the restarted NM but rather MADS (see http://www.gerad.ca/nomad/Project/Home.html and references therein), which has guaranteed convergence even on non-smooth, Clarke-subdifferentiable objective functions.
In practice the restarted NM will also lead to locally optimal solutions, although this is not theoretically guaranteed, and it may fail in very particular or difficult situations. The reason for its good practical convergence is that each restart regenerates the search simplex, so that in the end many search directions are covered; this is a crude alternative to the POLL step of MADS (the step that ensures convergence).
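As an illustrative sketch only (the actual benchmark uses Matlab's fminsearch and Higham's nmsmax; the Python function, scipy solver, and tolerances below are assumptions, not the benchmark's code), the restart loop can be written as:

```python
import numpy as np
from scipy.optimize import minimize

def restarted_nelder_mead(f, x0, tol=1e-10, max_restarts=50):
    """Restart Nelder-Mead from the last solution found until the
    objective stops improving by more than tol (the simplex is
    regenerated at each restart, covering new search directions)."""
    x = np.asarray(x0, dtype=float)
    fbest = f(x)
    for _ in range(max_restarts):
        res = minimize(f, x, method="Nelder-Mead",
                       options={"xatol": 1e-12, "fatol": 1e-12})
        if fbest - res.fun <= tol:  # no significant improvement: stop
            return res.x, res.fun
        x, fbest = res.x, res.fun
    return x, fbest

# Example: a simple smooth quadratic in 10 dimensions
f = lambda x: float(np.sum(x**2))
xstar, fstar = restarted_nelder_mead(f, np.ones(10))
```

A single Nelder-Mead run on such a problem typically stalls short of the minimizer in higher dimensions; the restarts drive fstar much closer to zero.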
So the restarted NM performs well even on non-smooth or discontinuous objective functions (not illustrated by this benchmark; other benchmarks are available, e.g. in http://arxiv.org/abs/1104.5369 or in the files hyperlinked below).
We put forward the restarted NM because it is easily available, simple to use, and already works well enough in practice. But again, in theory MADS should be used instead (for a formal convergence guarantee).
For related work on optimization in systems and control, see the Matlab files http://www.mathworks.com/matlabcentral/fileexchange/33022 and http://www.mathworks.com/matlabcentral/fileexchange/33219 (and the papers hyperlinked therein).