File Exchange

## Optimization Tips and Tricks

version 1.2.0.0 (629 KB) by John D'Errico

### John D'Errico (view profile)

Tips and tricks for use of the optimization toolbox, linear and nonlinear regression.

Updated 25 Apr 2016

Editor's Note: This file was a File Exchange Pick of the Week

Both new and experienced users of optimization in MATLAB will find useful tips and tricks in this document, as well as examples they can use as templates for their own problems.
Use this tool by editing the file optimtips.m, then executing blocks of code in cell mode from the editor, or better yet, publishing the file to HTML. Copy and paste also works, of course.

Some readers may find this tool valuable if only for the function pleas - a partitioned least squares solver based on lsqnonlin.

This is a work in progress, as I fully expect to add new topics as I think of them or as suggestions are made. Suggestions for topics I've missed are welcome, as are corrections of my probable numerous errors. The topics currently covered are listed below.

Contents
1. Linear regression basics in matlab
2. Polynomial regression models
3. Weighted regression models
4. Robust estimation
5. Ridge regression
6. Transforming a nonlinear problem to linearity
7. Sums of exponentials
8. Poor starting values
9. Before you have a problem
10. Tolerances & stopping criteria
11. Common optimization problems & mistakes
12. Partitioned least squares estimation
13. Errors in variables regression
14. Passing extra information/variables into an optimization
15. Minimizing the sum of absolute deviations
16. Minimize the maximum absolute deviation
17. Batching small problems into large problems
18. Global solutions & domains of attraction
19. Bound constrained problems
20. Inclusive versus exclusive bound constraints
21. Mixed integer/discrete problems
22. Understanding how they work
23. Wrapping an optimizer around quad
24. Graphical tools for understanding sets of nonlinear equations
25. Optimizing non-smooth or stochastic functions
26. Linear equality constraints
27. Sums of squares surfaces and the geometry of a regression
28. Confidence limits on a regression model
29. Confidence limits on the parameters in a nonlinear regression
30. Quadprog example, unrounding a curve
31. R^2
32. Estimation of the parameters of an implicit function
33. Robust fitting schemes
34. Homotopies
35. Orthogonal polynomial regression
36. Potential topics to be added or expanded in the (near) future

Lionel Juillen AF

Jim

### Jim (view profile)

I too was having trouble uncompressing the zip file (on a mac). Applying the default Archive Utility and then trying Stuffit Expander both gave errors. What did work fine for me was to open a Terminal and type "unzip opt_reg_tips.zip" at the command line prompt. Hope that's helpful to others. --Jim

Aurelien

Sarah B

Sarah B

### Sarah B (view profile)

Hello John,
I would really like to go through your work, but the zip file seems to be broken - an error mentioning headers appears, and it does not unzip fully; only the license txt is accessible. Could you please mend it?
Thanks,
S

Pei-Ann Lin

Jor Jor

Penning Yu

Stefan

### Stefan (view profile)

Yes, this Zip file seems broken (7-Zip). Opening it in Windows 7 Explorer worked for me...

littleblack knifeking

### littleblack knifeking (view profile)

The zip file seems wrong. Please check it out!

arnold

### arnold (view profile)

Hi John,

Maybe I've missed it, but do you explicitly show how to handle errors in x AND y for fitting/regression?
I've been looking for a solution for days and have forgotten how to do it properly.

regards
Arnold

Jason Nicholson

### Jason Nicholson (view profile)

Thanks for the examples. I have a suggestion on orthogonal polynomial fitting. Forsythe suggests a way to solve for both the correct orthogonal polynomials to use and their coefficients. The advantage is that the full normal equations never have to be solved. It seems like an excellent method of data fitting.

Forsythe, George E. "Generation and use of orthogonal polynomials for data-fitting with a digital computer." Journal of the Society for Industrial & Applied Mathematics 5.2 (1957): 74-88.
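The core idea of this kind of method - fit in an orthogonalized basis so the ill-conditioned normal equations are never formed - can be sketched as follows. This is an illustrative Python sketch, not code from the document: `orthofit` is a hypothetical name, and a dense QR factorization stands in for the three-term recurrence of Forsythe's 1957 paper.

```python
import numpy as np

def orthofit(x, y, degree):
    # Fit a polynomial without forming the normal equations:
    # orthogonalize the monomial basis first, then project y onto it.
    V = np.vander(x, degree + 1, increasing=True)  # columns 1, x, x^2, ...
    Q, R = np.linalg.qr(V)                         # Q has orthonormal columns
    c = Q.T @ y                                    # coefficients in the orthogonal basis
    yhat = Q @ c                                   # fitted values
    return c, yhat

# usage: data lying exactly on a quadratic is reproduced exactly
x = np.linspace(0.0, 1.0, 50)
y = 1 + 2*x + 3*x**2
c, yhat = orthofit(x, y, 2)
```

Forsythe's recurrence builds the orthogonal polynomials degree by degree rather than via a dense QR, which is cheaper and lets the degree be raised incrementally without refitting from scratch.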

John Booker

John Booker

John D'Errico

### John D'Errico (view profile)

dav - It is too bad you don't like what I've supplied. It is also too bad that you did not read it, as then you might have learned the answer to your question.

dav

dav

dav

### dav (view profile)

How can I get the code on least absolute deviation, please?

Raymundo Marcos-Martinez

Max

### Max (view profile)

An incredible resource. Very thoughtful.

Weiyang Zhao

### Weiyang Zhao (view profile)

I just finished a Java implementation of weighted least square regression with equality constraints based on the knowledge shared by you. Thank you, John.

Fang

very good

Jia

Khairul Shafie

### Khairul Shafie (view profile)

great work John. Thanks a lot.

Liam Mescall

### Liam Mescall (view profile)

Superb ! Thanks for the work

AMVR

Eric Diaz

Hydroman S

### Hydroman S (view profile)

Great submission. Thanks John.

I have a question on transforming a nonlinear problem to linearity:

If the nonlinear problem is of this form:

y = a* x^3

can we linearize it as follows:

log(y) = log(a) + 3 log (x)
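That transform is valid whenever a, x, and y are all positive: y = a*x^3 becomes log(y) = log(a) + 3*log(x), a straight line in (log x, log y) with known slope 3 and intercept log(a). A minimal plain-Python sketch on hypothetical exact data:

```python
import math

# assumes a > 0, x > 0, y > 0 throughout
a = 2.5
xs = [1.0, 2.0, 4.0, 8.0]
ys = [a * x**3 for x in xs]

# with the slope fixed at 3, the intercept log(a) is just the
# average of log(y) - 3*log(x) over the data
intercept = sum(math.log(y) - 3*math.log(x) for x, y in zip(xs, ys)) / len(xs)
a_hat = math.exp(intercept)
```

Bear in mind that least squares on the logged data weights errors multiplicatively, which is not the same objective as least squares on the original y.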

HARI KRISHNAN

f

John D'Errico

### John D'Errico (view profile)

I've just submitted a new version, with many repairs done.

Jack Young

### Jack Young (view profile)

On running optimtips_21_36.m I get the following error:

??? In an assignment A(:) = B, the number of elements in A and B
must be the same.

Error in ==> callAllOptimOutputFcns at 12
stop(i) = feval(OutputFcn{i},xOutputfcn,optimValues,state,varargin{:});

Error in ==> fminsearch>callOutputAndPlotFcns at 468
stop = callAllOptimOutputFcns(outputfcn,xOutputfcn,optimValues,state,varargin{:}) || stop;

Error in ==> fminsearch at 203
[xOutputfcn, optimValues, stop] = callOutputAndPlotFcns(outputfcn,plotfcns,v(:,1),xOutputfcn,'init',itercount, ...

Error in ==> optimtips_21_36 at 103
Xfinal = fminsearch(rosen,[-6,4],opts);

Does anyone have an idea where it comes from?

Jonas

### Jonas (view profile)

Should have rated this as 5 a long time ago. This is a most excellent resource, and pleas.m has helped me tremendously.

michael scheinfeild

Nitin

Eric Diaz

### Eric Diaz (view profile)

This hasn't been updated since 2006, despite people having told him that there are bugs in the code.

One bug which I have reported to him by email is in the pleas wrapper function, when using multiple exponentials.

Other than that, it is a great code with great examples and explanations.

MOHD

2 thumbs up!

Danila

### Danila (view profile)

Very nice and thorough compilation of tips and tricks.

Shaun

### Shaun (view profile)

Hi John,

As Eric pointed out, I guess you need an update for newer versions.

Shaun

function stop = optimplot(x, optimValues, state)
% plots the current point of a 2-d optimization
stop = false;
hold on;
plot(x(1),x(2),'.');
drawnow

Jan Gläscher

### Jan Gläscher (view profile)

Excellent resource. So very useful.

Eric

### Eric (view profile)

One small bug prevents optimtips.m from running completely, e.g., when publishing optimtips.m:

Change line 3 in optimplot.m from
stop = [];
to
stop = false;

Ben Steiner

### Ben Steiner (view profile)

Echoing the other posts here: this is an excellent intro to optimization in general and MATLAB's capabilities in particular. Thanks, John.

Ben Steiner

### Ben Steiner (view profile)

Ida Westerberg

Super! Just the help I was looking for.

A B

5

jugmendra singh

Bravo! Thank you very much.

pravin katre

good

TULISHETTI SRINIVAS

Björn Wurst

I need robust regression methods for my diploma thesis, and this work gives a very good first impression of regression in MATLAB.

Annamnaidu S

b q

zuduo zheng

Good Job!!!

Sergei Koulayev

John,

I loved this the most:
"% Likewise, reducing the value of TolFun need not reduce the error
% of the fit. If an optimizer has converged to its global optimum,
% reducing these tolerances cannot produce a better fit. Blood cannot
% be obtained from a rock, no matter how hard one squeezes. The rock
% may become bloody, but the blood came from your own hand."

hippo man

I am Thai and love MATLAB. Thanks a lot.

Varun Sakalkar

Nice work!!

Hua Yang

Thanks very much!

ponthep veng

Good, thank you!
From Thailand

Nair SUBRA

thank you

Jorge Martinez

Ok.

John D'Errico

I'll see if I can do something with stochastic optimizers. It is a topic I apparently forgot to cover. Of course, the GADS toolbox is available for genetic algorithms. Please check back in a week or two.
John

thank you

John, could you talk more about simulated annealing and other similar optimization techniques? Or write a general function, as you have done for gridfit. Thank you. I always learn a lot from you.

Vishnuvenkatesh Dhage

very useful

Garrett Barter

Great work! I found the linprog examples for L1 and L_infty regression quite helpful.
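The L1 (sum of absolute deviations) regression reformulation as a linear program can be sketched as follows. This is an illustrative sketch using SciPy's `linprog` rather than the MATLAB linprog the document's examples use, and `l1fit` is a hypothetical helper name.

```python
import numpy as np
from scipy.optimize import linprog

def l1fit(X, y):
    # Least-absolute-deviation fit posed as a linear program.
    # Variables are [coeffs b; slacks t] with t_i >= |y_i - X[i] @ b|;
    # minimizing sum(t) then minimizes the sum of absolute residuals.
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(n)])  # objective: sum of slacks
    I = np.eye(n)
    A_ub = np.block([[-X, -I],                     #  (y - Xb) <= t
                     [ X, -I]])                    # -(y - Xb) <= t
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * p + [(0, None)] * n  # b free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

# usage: five exact points plus one gross outlier; unlike least squares,
# the L1 fit passes through the good points and ignores the outlier
X = np.column_stack([np.ones(6), np.arange(6.0)])
y = 2 + 3*np.arange(6.0)
y[5] = 100.0  # outlier (true value would be 17)
b = l1fit(X, y)
```

The same trick with a single shared slack variable t (minimize t subject to |y_i - X[i] @ b| <= t for all i) gives the minimax, or L-infinity, fit.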

kimi raikkonen

wilmer salazar trujillo

It should be promoted more intensively in educational centers, starting from the earliest levels.

Abdimaged Mussa

Not perfect, though - it has useful information, but not much to explore.
Overall it's a good website.

It is very helpful for solving problems in my work.

Tie Ling

This package is very useful for me. It is excellent. Thank you for your help!

Suman Banerjee

Sir, it's an excellent package. You should publish a book, and please make sure that ordinary students like me from India can buy it. Thanks for helping.

thank you

extremely useful

Sung SOo Kim

This is an excellent package.
Thank you so much.

sione palu

Excellent package.

Wang Qiwen

Very good

Taghi Miri

Thank you, I found it very useful.

Peter Krug

A must-read for beginners like myself. Great work that really helps - unlike the online help of MATLAB.

21st Jocobi

Anthony Clark

An excellent resource. You should publish this as a book; it would be a valuable resource for postgraduates and career professionals! It really improved my routines by answering a lot of technical questions about using the optim toolbox (generally not covered in the help or in other, more general books on the subject). THANK YOU!

Kaushik b