
Thread Subject:
Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: Eric Diaz

Date: 24 Apr, 2010 16:36:05

Message: 1 of 19

Dear Matlab User Community,

The MathWorks has posted a simple example of how to perform separable nonlinear least squares using the LSQCURVEFIT function.

http://www.mathworks.com/support/solutions/en/data/1-18E03/index.html?product=OP&solution=1-18E03

This is a good solution, in the sense that it provides a simple separable lsqcurvefit setup for a simple mono-exponential function such as y = A.*exp(B.*XDATA).
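
For reference, the separable idea there boils down to something like the following sketch (names like B0, xdata and ydata are just illustrative; the linear coefficient A is eliminated by backslash inside the model, so lsqcurvefit only searches over B):

xcol = xdata(:);  ycol = ydata(:);
% Model with A solved by linear least squares for the current B
sepmodel = @(B,x) (exp(B*x)\ycol) * exp(B*x);
Best = lsqcurvefit(sepmodel, B0, xcol, ycol);   % optimize over B only
Aest = exp(Best*xcol)\ycol;                     % recover A at the solution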

However, what if the user would like to supply a user-defined Jacobian (the optimset 'Jacobian' option) by returning [y J] from the function that LSQCURVEFIT calls? The given example returns the linear coefficients as the second output of the function (rather than the Jacobian), which is confusing.

In addition, what if the user would like to use Jacobian information for both parameters, but runs into the fact that, in the separable formulation, LSQCURVEFIT only optimizes over one parameter rather than all of them? Can the linear regression step use the Jacobian information provided for the A parameter? Can confidence intervals for the A parameter be computed? Can nlpredci still be used?

These are issues that I am concerned with but have not been able to come up with any satisfactory solution yet.

Any discourse on these topics would be greatly appreciated and of practical value to me. Thank you in advance!

Eric Diaz

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: John D'Errico

Date: 24 Apr, 2010 18:51:04

Message: 2 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hqv6ll$ltj$1@fred.mathworks.com>...
> Dear Matlab User Community,
>
> The MathWorks has posted a simple example of how to perform separable nonlinear least squares using the LSQCURVEFIT function.
>
> http://www.mathworks.com/support/solutions/en/data/1-18E03/index.html?product=OP&solution=1-18E03
>
> This is a good solution, in the sense that it provides a simple separable lsqcurvefit setup for a simple mono-exponential function such as y = A.*exp(B.*XDATA).
>
> However, what if the user would like to supply a user-defined Jacobian (the optimset 'Jacobian' option) by returning [y J] from the function that LSQCURVEFIT calls? The given example returns the linear coefficients as the second output of the function (rather than the Jacobian), which is confusing.
>
> In addition, what if the user would like to use Jacobian information for both parameters, but runs into the fact that, in the separable formulation, LSQCURVEFIT only optimizes over one parameter rather than all of them? Can the linear regression step use the Jacobian information provided for the A parameter? Can confidence intervals for the A parameter be computed? Can nlpredci still be used?
>
> These are issues that I am concerned with but have not been able to come up with any satisfactory solution yet.
>
> Any discourse on these topics would be greatly appreciated and of practical value to me. Thank you in advance!
>
> Eric Diaz

Yes, you can get confidence intervals, but it will
take some work on your part.

I would suggest the best way to do it is to create
the Jacobian matrix for all of the parameters. This
is easily done here, or for any such problem. Then
form estimates of the parameter variances as I teach
how to do in my optimization tips and tricks.
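
In rough terms, that calculation looks something like
this (just a sketch, with illustrative variable names;
it is not an excerpt from the tips and tricks):

% resnorm, J and parEst as returned by lsqcurvefit (J comes back sparse);
% tinv requires the Statistics Toolbox.
dof  = numel(ydata) - numel(parEst);      % residual degrees of freedom
mse  = resnorm/dof;                       % residual mean square
covB = mse*inv(full(J)'*full(J));         % asymptotic covariance of the estimates
se   = sqrt(diag(covB));                  % standard errors of the estimates
ci   = [parEst(:)-tinv(0.975,dof)*se, ...
        parEst(:)+tinv(0.975,dof)*se];    % 95% confidence intervals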

John

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: Eric Diaz

Date: 25 Apr, 2010 02:38:06

Message: 3 of 19

"John D'Errico" <woodchips@rochester.rr.com> wrote in message
>
> Yes, you can get confidence intervals, but it will
> take some work on your part.
>
> I would suggest the best way to do it is to create
> the Jacobian matrix for all of the parameters. This
> is easily done here, or for any such problem. Then
> form estimates of the parameter variances as I teach
> how to do in my optimization tips and tricks.
>
> John

Dear John,

Thank you for your reply. I know that you are an expert in this area. I already have the Jacobian matrix for all the parameters and am currently returning [y J] as the output of my function.

The way that I currently compute my confidence intervals is with nlpredci, but I also compute my standard errors as sqrt(diag(covariance)), where covariance = mse.*inv(J'*J).

That being said, my goal was to enhance my current approach by using variable projection. Yet, it seems to be confounded by the questions I originally posted.

If there is any way you could be a bit more concrete, I would appreciate it.
I have provided a setup below, for which I would like to implement separable nonlinear least squares (similar to the example), while still providing the Jacobians to lsqcurvefit and being able to predict the confidence intervals.

Setup:

p0 = [A0 B0]; %define these yourself
lb = [0 0];
ub = [inf inf];

options = optimset( 'Display' , 'none','Diagnostics','off',...
                    'MaxIter' , 10,'MaxFunEvals', 500,...
                    'TolFun' , 1e-12,'TolX' ,1e-12,...
                    'Jacobian' , 'on','LargeScale','on');

[parEst,sse,res,dummyVar,dummyVar,dummyVar,J] = lsqcurvefit(@myfunc,p0,xdata,ydata,lb,ub,options);

% estimate standard error
DOF = numel(xdata)-numel(p0); % Degrees of Freedom

function [mse, cov, parSE] = estimateSE(DOF,sse,J)
% This function is called after lsqcurvefit
mse = sse/DOF; % Mean square error
cov = mse.*inv(full(J)'*full(J)); % Estimated covariance matrix of the coefficient estimates
parSE = sqrt(diag(cov)); % Standard errors of the estimates
end

% This section predicts the CI for the nonlinear fit
time = 0:0.01:xdata(end);
FitCurve = myfunc(parEst,time);
[ypred, delta] = nlpredci(@myfunc,time,parEst,res,J);


function [Sig Jacob] = myfunc(p,xdata)
% This function is called by lsqcurvefit.
% p is a vector which contains the coefficients of the
% equation. xdata is the data set
% passed to lsqcurvefit.

A = p(1);
B = p(2);

Sig = A.*exp(-xdata./B);
Jp1 = exp(-xdata./B);
Jp2 = A.*(xdata./(B.^2)).*exp(-xdata./B);
Jacob = [Jp1; Jp2]';
end

Thank you again in advance,

Eric

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: Eric Diaz

Date: 25 Apr, 2010 17:39:06

Message: 4 of 19

Bump...geez there has been a lot of spam on here! I hope that something can be done to these spammers and about the spam, my posts are getting lost in the mix...

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: Eric Diaz

Date: 25 Apr, 2010 20:44:04

Message: 5 of 19

Bumpity bump bump...wow, I thought more people would be interested in this topic. John's answer was woefully too vague and inadequate. Anyone else interested, i.e., TMW?

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians and Confidence Intervals

From: Eric Diaz

Date: 26 Apr, 2010 00:10:21

Message: 6 of 19

Bump

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Walter Roberson

Date: 26 Apr, 2010 01:20:04

Message: 7 of 19

Eric Diaz wrote:
 > Bump...geez there has been a lot of spam on here!

Yes, we've been getting a bunch of spam about Separable Nonlinear Least
Squares.

 > I hope that something can be done to these spammers and about the
 > spam, my posts are getting lost in the mix...

Your posts are not getting lost in the mix.

> Bumpity bump bump...wow, I thought more people would be interested in
> this topic.

You picked a weekend. Have you ever done a historical analysis of who
tends to answer posts on weekends, and of what kinds of posts that
person tends to answer?

The people who tend to answer more than the occasional weekend post,
tend to include myself, ImageAnalyst, John D'Errico, TideMan, Greg
Heath, and Urs; Rune, Bruno, dbp, and Roger Stafford often show up on
weekends, but with perceptibly reduced posting volume compared to the
weekday.

Of those, the ones who tend to deal with issues such as your present
topic are John and occasionally Roger (if the question happens to spark
his interest; he is more likely to deal with geometry and algebra
questions.) TideMan has some fitting experience as well, but prefers to
concentrate on varieties of fitting quite different from what you raise,
such as fitting functions to data or fitting with wavelets.

 > John's answer was woefully too vague and inadequate.

It is rumoured that John has a Time-Share on a Life that he sometimes
uses on weekends. I sent off for a Life from Nigeria, but it's been
caught in paperwork and various customs fees for the last 8 years, and I
don't know when it will arrive.

> Anyone else interested, i.e., TMW?

If you are expecting TMW to respond, then start a support case with TMW.
Any MathWorks employees who visit here do so as volunteers.

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Eric Diaz

Date: 26 Apr, 2010 02:28:06

Message: 8 of 19

Dear Walter,

Respectfully, while you may offer some great advice on occasion, I beg your pardon. You need to go shove your spam comment about my serious post up yours. This forum does not belong to you, nor does it make you important, nor can you go around harassing people about their posts.

As for your comment about it being a weekend post and thus low expectations should be the norm, I think it absurd! Have you ever counted how many posts go onto the forum on a weekend? There have been over 140 since Saturday morning and over 240 since Friday morning.

As for John's comment, I stand by the fact that it was woefully vague and inadequate. 1st he says that it is possible but that it will require work...uh...wow, thanks captain obvious. 2nd, he provides another suggestion to create jacobians for all the parameters...uh...thanks again...I actually said that I wanted to apply user defined Jacobians... another point for captain obvious. 3rd, he points me to his optim:tips and tricks, (which I might add hasn't been updated in over 3 years, despite several posts and emails about errors) and yet he doesn't provide any specifics about where to look. Not that I would particularly want to look at something that hasn't been updated in >3 years. In closing, I believe I posted some specific questions or concerns, of which none were addressed.

So, the moral: if you're going to try and help, don't do a half-a** job at it, or you're going to get comments about woeful inadequacy. I never implied that John didn't have a life, but I do take much offense at the implication that I don't. So go shove it again!

As for TMW being interested, I would hope that you are not so naive as to think that they don't peruse this forum, as volunteers and as a source of R&D.

So, Walter, my message to you is to stop hating! This is supposed to be a serious question and a community forum, not one run by bullies like you.

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Steven Lord

Date: 26 Apr, 2010 04:28:41

Message: 9 of 19


"Eric Diaz" <eric.diaz@gmail.com> wrote in message
news:hr2tnm$q9u$1@fred.mathworks.com...
> Dear Walter,
>
> Respectfully, while you may offer some great advice on occasion, I beg
> your pardon. You need to go shove your spam comment about my serious post
> up yours. This forum does not belong to you, nor does it make you
> important, nor can you go around harassing people about their posts.
> As for your comment about it being a weekend post and thus low
> expectations should be the norm, I think it absurd! Have you ever counted
> how many posts go onto the forum on a weekend? There have been over 140
> since Saturday morning and over 240 since Friday morning.

Compare that with the volume for the average weekday and you'll see that's
relatively light traffic for CSSM.

> As for John's comment, I stand by the fact that it was woefully vague and
> inadequate. 1st he says that it is possible but that it will require
> work...uh...wow, thanks captain obvious. 2nd, he provides another
> suggestion to create jacobians for all the parameters...uh...thanks
> again...I actually said that I wanted to apply user defined Jacobians...
> another point for captain obvious. 3rd, he points me to his optim:tips
> and tricks, (which I might add hasn't been updated in over 3 years,
> despite several posts and emails about errors) and yet he doesn't provide
> any specifics about where to look. Not that I would particularly want to
> look at something that hasn't been updated in >3 years. In closing, I
> believe I posted some specific questions or concerns, of which none were
> addressed.

I can't speak for John, but his message was posted on a Saturday afternoon,
and at least here in Natick it was a pretty nice afternoon. Perhaps John
dashed off his message as a quick response before heading out to enjoy it.

> So, the moral: if you're going to try and help, don't do a half-a** job at
> it, or you're going to get comments about woeful inadequacy. I never
> implied that John didn't have a life, but I do take much offense at the
> implication that I don't. So go shove it again!

John _volunteers_ his time and experience here. He can give whatever help
he sees fit to whomever he sees fit; if you require him to give you specific
advice, I imagine he may be open to doing a bit of consulting work.

But just a quick bit of advice? In general, people tend to be more willing
to help those people they like. Posting messages like this telling a
well-known and respected member of the community to "shove it again"?
Probably not helping your likeability.

If you're going to reply, I ask you to please do something first. Write
your response, walk away and do something else for 5 minutes, then come back
and reread it as though someone else were sending it to you. Also please
contemplate the fact that not all messages that are written for this
newsgroup necessarily need to get posted to the newsgroup -- on more than
one occasion I've written messages that I then deleted unsent after a reread.


Now, going back to your original question about this technical solution
document:

http://www.mathworks.com/support/solutions/en/data/1-18E03/index.html?product=OP&solution=1-18E03

If you have a question about it, or a suggestion for a way it can be
improved or enhanced, please use the link that reads "Please provide
feedback to help us improve this Solution" at the bottom of the page. That
will allow you to provide feedback directly to the technical support staff
responsible for maintaining that document, so that they can incorporate it
into that document.

--
Steve Lord
slord@mathworks.com
comp.soft-sys.matlab (CSSM) FAQ: http://matlabwiki.mathworks.com/MATLAB_FAQ

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Bruno Luong

Date: 26 Apr, 2010 06:49:05

Message: 10 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hr2tnm$q9u$1@fred.mathworks.com>...
>
>
> As for John's comment, I stand by the fact that it was woefully vague and inadequate. 1st he says that it is possible but that it will require work...uh...wow, thanks captain obvious. 2nd, he provides another suggestion to create jacobians for all the parameters...uh...thanks again...I actually said that I wanted to apply user defined Jacobians... another point for captain obvious. 3rd, he points me to his optim:tips and tricks, (which I might add hasn't been updated in over 3 years, despite several posts and emails about errors) and yet he doesn't provide any specifics about where to look. Not that I would particularly want to look at something that hasn't been updated in >3 years. In closing, I believe I posted some specific questions or concerns, of which none were addressed.

Whoa, that's a tough way to acknowledge someone who actually READ your post. John's answer is not complete, but he might have just gone outside to enjoy a sunny day, as Steve pointed out. After all, the people who help here are all volunteers.

I also have some knowledge of this kind of sensitivity analysis and of linearly separable problems, enough to be able to chime in, and I could even point you to a good PhD thesis from the 80s exclusively on this topic. I have not yet replied in this thread because your request is very vague (we don't know what you want to improve or what you have tried). John's reply is vague, but it is actually on a par with the question that was laid out.

Bruno

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Eric Diaz

Date: 26 Apr, 2010 06:51:05

Message: 11 of 19

Dear Steven,

Do you think that I sit around and count the volume of traffic on the CSSM?

I don't think John provided any new or helpful information and he was vague. He would have been better off just enjoying the Saturday afternoon, if that was what he was doing, rather than posting a quick but useless response. I don't require any specific advice from him, but since he clearly is interested in the topic, he can certainly give it if he so desires. There is no point in just repeating what I stated in my question and being vague, though.

I think your advice may be a bit biased...obviously, you side with someone who you claim is "well known and respected", but I ask, well known and respected by whom? Certainly not by me! I don't know this guy and I'm not going to let him call my serious and intellectually challenging post spam! The only reason I was bumping was because there was a lot of spam on the forum. I would hope that TMW wouldn't want "well known and respected" people discouraging regular, everyday MATLAB users from posting to the community forum through excessive cheekiness. I've seen it all too often.

It's unfortunate that TMW hasn't already gotten back to me about the feedback I provided on Friday. Maybe your technical support staff haven't checked their feedback yet... you should let someone know?

Eric Diaz

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Bruno Luong

Date: 26 Apr, 2010 07:04:09

Message: 12 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hr3d4p$jps$1@fred.mathworks.com>...
>
> I don't think John provided any new or helpful information and he was vague.

But at least he clearly opened the door for further discussion. You could, for example, explain what you have tried, why you are not satisfied with the full-Jacobian sensitivity analysis, and what exactly you want to improve. Instead of that, your posts turned into "bump, bump and bump". So now the discussion inevitably turns around the "bump" topic.

Bruno

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Eric Diaz

Date: 26 Apr, 2010 15:43:04

Message: 13 of 19

Dear Bruno,

Clearly, your thoughts are biased as well. There is no turning about a bump...that is preposterous. The purpose of a bump is well known by all; it is to refresh a post's page status (i.e., back to page 1) on a forum which is very active. There was a great deal of religious and merchandise spam on the forum, and thus the reason for bumping. It was unfortunate that someone was tasteless enough to call my post spam for bumping.

I further disagree about John leaving the door open... if by that you mean being vague, then of course he invites questions and interpretation, rather than actually being helpful. The only logical question that should have followed is "where exactly in your optimization tips and tricks did you discuss how to create confidence intervals for this specific problem?" The answer to which, of course, would be nowhere, because this specific problem has not been addressed.

I believe my original post is clear and my questions are clear. I further posted a specific example setup where I stated exactly what I would like to change with respect to that setup using variable projection. If you don't get it, then I'm not sure what to do.

As to your question, I like to use nlpredci's delta to plot finely spaced confidence intervals around my fit curve. I tried looking at the nlpredci code but it was a bit jumbled. Otherwise, if I could compute the delta another way, I would be perfectly satisfied with my sensitivity analysis.
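
To make that concrete, what I mean by computing the delta another way is roughly the following (an untested sketch, reusing the variable names from my setup above):

DOF    = numel(xdata) - numel(parEst);
mse    = sum(res.^2)/DOF;
covB   = mse*inv(full(J)'*full(J));           % parameter covariance
[dummy, Jfine] = myfunc(parEst, time);        % model Jacobian on the fine time grid
sepred = sqrt(sum((Jfine*covB).*Jfine, 2));   % standard error of the fitted curve
delta  = tinv(0.975, DOF)*sepred;             % 95% half-widths (cf. nlpredci's delta)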

Eric Diaz

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Bruno Luong

Date: 26 Apr, 2010 16:58:04

Message: 14 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hr4ca8$m9d$1@fred.mathworks.com>...
> Clearly, your thoughts are biased as well. There is no turning about a bump...that is preposterous. The purpose of a bump is well known by all; it is to refresh a post's page status (i.e., back to page 1) on a forum which is very active.
>There was a great deal of religious and merchandise spam on the forum,
> and thus the reason for bumping.

Correct me if I'm wrong, Eric, but:
- A "bump" is mainly used to say "I'm not happy that my request is being ignored", isn't it?
- Can you imagine the mess created if every real thread got a "bump refresh" every time it was pushed down by a spam post?

You are the first one who has used "bump" for *that* purpose, which might be misinterpreted by many of us, myself included.

But honestly I don't think I misinterpreted: you obviously got frustrated that John's answer did not meet your expectations - as you expressed later - and not only because of the string of spam.

> It was unfortunate that someone was tasteless enough to call my post spam for bumping.

The unfortunate tagging might be just a mistake, not intentional. It has happened to me to inadvertently tag a true post as spam.

>
> I further disagree about John leaving the door open... if by that you mean being vague, then of course he invites questions and interpretation, rather than actually being helpful. The only logical question that should have followed is "where exactly in your optimization tips and tricks did you discuss how to create confidence intervals for this specific problem?" The answer to which, of course, would be nowhere, because this specific problem has not been addressed.
>
> I believe my original post is clear and my questions are clear. I further posted a specific example setup where I stated exactly what I would like to change with respect to that setup using variable projection. If you don't get it, then I'm not sure what to do.

Personally I did not (and still do not) get the GOAL you want to reach with the description. It seems to me you want to feed the full Jacobian somewhere into a specific method of solving a linearly separable least-squares problem. But you have not explained clearly "why".

>
> As to your question, I like to use nlpredci's delta to plot finely spaced confidence intervals around my fit curve. I tried looking at the nlpredci code but it was a bit jumbled. Otherwise, if I could compute the delta another way, I would be perfectly satisfied with my sensitivity analysis.

I do not have the stat toolbox, so I can't comment.

Bruno

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Walter Roberson

Date: 26 Apr, 2010 17:25:18

Message: 15 of 19

Eric Diaz wrote:
> Dear Walter,

 > This forum does not belong to you, nor does it make you
> important, nor can you go around harassing people about their posts.

The wording of Eric Diaz's sentence, although semantically null in its
detailed phrasing, could have been reasonably _interpreted_ by some
people as being an indication that I have been "harassing" people, or
that Eric Diaz believed or believes I have been "harassing" people.

"Harassment" is a crime in Canada, a violation of Canada Criminal Code
section 264,
http://laws.justice.gc.ca/eng/C-46/20100425/page-6.html#codese:264 .

I do not believe that I have "harassed" anyone at any time within the
meaning of Canadian law -- a law that as a principal component requires
behaviour "that causes that other person reasonably, in all the
circumstances, to fear for their safety or the safety of anyone known to
them."

I have, though, engaged in what I believe to be "Fair comment on public
person or work of art", section 310(a),
http://laws.justice.gc.ca/eng/C-46/20100425/page-6.html#codese:310 .


Note: this response is not issued in contravention of section 264(2)(b),
but rather is published under the authority of Canada Criminal Code
section 312(b) "Publication invited or necessary"
http://laws.justice.gc.ca/eng/C-46/20100425/page-6.html#codese:312


I would suggest that correspondence (if any) on this matter proceed
through email, rather than in public.

    Walter Roberson

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Eric Diaz

Date: 26 Apr, 2010 18:48:06

Message: 16 of 19

Dear Bruno,

> Correct me if I'm wrong, Eric, but:
> - A "bump" is mainly used to say "I'm not happy that my request is being ignored", isn't it?
> - Can you imagine the mess created if every real thread got a "bump refresh" every time it was pushed down by a spam post?

You are wrong, Bruno. Consider that my correction. It was used for the purpose that I previously stated, and it is more commonly used in that sense than as a way to whine about posts being ignored.

I was frustrated with the lack of usefulness of John's comment, but I endeavored to reach a more concrete explanation by providing an example. Unfortunately, this thread's main concern was entirely sidetracked by some cheeky comments.

I believe you misunderstood the spam comment. It wasn't that somebody accidentally mistagged my post as spam; it was that Walter called my post spam in his reply, which I took offense to.

> Personally I did not (and still do not) get the GOAL you want to reach with the description. It seems to me you want to feed the full Jacobian somewhere into a specific method of solving a linearly separable least-squares problem. But you have not explained clearly "why".
>
Why I would like to feed the Jacobian into my nonlinear least-squares fit should be obvious, e.g. fewer iterations. The variable projection strategy in the exponential case reduces the computational cost of a nonlinear fit of multiple parameters, and perhaps also the number of iterations, while providing a similar if not smaller residual norm, usually making it a more efficient and robust technique.

Eric Diaz

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Bruno Luong

Date: 26 Apr, 2010 20:25:21

Message: 17 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hr4n56$l34$1@fred.mathworks.com>...

> Why I would like to feed the Jacobian into my nonlinear least-squares fit should be obvious, e.g. fewer iterations. The variable projection strategy in the exponential case reduces the computational cost of a nonlinear fit of multiple parameters, and perhaps also the number of iterations, while providing a similar if not smaller residual norm, usually making it a more efficient and robust technique.
>

It is still not entirely clear to me what the issue is.

1- Are you asking how to compute the Jacobian of the non-linear projection? It is obvious that the Jacobian users must provide is the derivative of the model with respect to the optimized parameter(s), and nothing more.
2- If yes, do you know how to compute the Jacobian in 1 above?
3- If no, do you know how to compute the Jacobian of the original problem (i.e., without projection; I guess the answer is yes at this point, from what I read)?
4- If yes are you able to compute 2 from 3?
5- If no have you heard about Golub/Pereyra formula?
6- If no, try to get this paper: "G. H. Golub and V. Pereyra, The differentiation of pseudoinverses and nonlinear least squares problems whose variables separate, SIAM J. Numer. Anal., 10 (1973), pp. 413–432."

Bruno

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Eric Diaz

Date: 26 Apr, 2010 21:09:04

Message: 18 of 19

Dear Bruno,

> 1- Are you asking how to compute the Jacobian of the non-linear projection? It is obvious that the Jacobian users must provide is the derivative of the model with respect to the optimized parameter(s), and nothing more.
>>
In the example setup, the function called by lsqcurvefit returns my yhat and Jacobian, without the use of separation.

Now, if I were to implement the same approach as given by The MathWorks in their technical solution document, then there is confusion.

The confusion is that the example they give has the linear coefficients as the 2nd output, while my function has the Jacobian as the 2nd output. If I implement backslash for the linear terms, would I then have 3 outputs for my function?

Using the setup example: would I get?

function [Sig, lincoeff, Jacob] = myfuncSep(p,xdata,ydata)
% This function is called by lsqcurvefit.
% p is a vector which contains the coefficients of the
% equation. xdata is the data set passed to lsqcurvefit.

B=p;

% Separately solve for the linear coefficients
systemeq = [exp(-xdata(:)/B)];
lincoeff = systemeq\ydata(:);
A=lincoeff(1);

Sig = A.*exp(-xdata./B);
Jp1 = exp(-xdata./B);
Jp2 = A.*(xdata./(B.^2)).*exp(-xdata./B);
Jacob = [Jp1; Jp2]';
end


> 2- If yes, do you know how to compute the Jacobian in 1 above?
>>Isn't it just as you mention in 1), where it is the derivative of the model with respect to the optimized parameters?

> 3- If no, do you know how to compute the Jacobian of the original problem (i.e., without projection; I guess the answer is yes at this point, from what I read)?
>>Yes

> 4- If yes are you able to compute 2 from 3?
>>Well, I would assume so, but perhaps my assumption is incorrect.

> 5- If no have you heard about Golub/Pereyra formula?
>>Yes, however, not from the original paper. I have been looking at a more recent publication, Osborne (2007), "Separable least squares, variable projection, and the Gauss-Newton algorithm", dedicated to Golub for his 75th birthday.



> 6- If no, try to get this paper: "G. H. Golub and V. Pereyra, The differentiation of pseudoinverses and nonlinear least squares problems whose variables separate, SIAM J. Numer. Anal., 10 (1973), pp. 413–432."
>>I will look into it

Thanks for your interest and pointers,

Regards,

Eric Diaz

Subject: Separable (VarPro) Nonlinear Least Squares, LSQCURVEFIT, Jacobians

From: Bruno Luong

Date: 26 Apr, 2010 21:39:04

Message: 19 of 19

"Eric Diaz" <eric.diaz@gmail.com> wrote in message <hr4vdg$h41$1@fred.mathworks.com>...

>
> Using the setup example: would I get?
>
> function [Sig, lincoeff, Jacob] = myfuncSep(p,xdata,ydata)
> % This function is called by lsqcurvefit.
> % p is a vector which contains the coefficients of the
> % equation. xdata is the data set passed to lsqcurvefit.
>
> B=p;
>
> % Separately solve for the linear coefficients
> systemeq = [exp(-xdata(:)/B)];
> lincoeff = systemeq\ydata(:);
> A=lincoeff(1);
>
> Sig = A.*exp(-xdata./B);
> Jp1 = exp(-xdata./B);
> Jp2 = A.*(xdata./(B.^2)).*exp(-xdata./B);
> Jacob = [Jp1; Jp2]';
> end

I have no idea why they return lincoeff as the second output, but obviously, as they do not turn on the Jacobian option, it does not interfere. In your case, if you turn the Jacobian option "on", you should return the Jacobian as the second output. In this specific example the Jacobian is just a column vector, since there is only one (1) parameter.

>
>
> > 2- If yes, do you know how to compute the Jacobian in 1 above?
> >>Isn't it just as you mention in 1), where it is the derivative of the model with respect to the optimized parameters?

Yes,

>
> > 3- If no, do you know how to compute the Jacobian of the original problem (i.e., without projection; I guess the answer is yes at this point, from what I read)?
> >>Yes
>
> > 4- If yes are you able to compute 2 from 3?
> >>Well, I would assume so, but perhaps my assumption is incorrect.

You can, by the Golub/Pereyra formula. In some cases this formula is too complicated for practical use, and people often prefer approximating the Jacobian by finite differences. I believe this is more or less equivalent to turning the Jacobian option 'off' and letting the minimizer take charge of estimating the gradient.
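
For this mono-exponential example the total derivative is simple enough to write out by hand. Roughly (an illustrative, untested sketch; note that the Jacobian comes back as the second output, as required when 'Jacobian' is 'on', and pass ydata as a column so the sizes match):

function [Sig, Jacob, lincoeff] = myfuncSepGP(B, xdata, ydata)
% Reduced model for y ~ A*exp(-x/B): A is eliminated by backslash, and
% the TOTAL derivative dSig/dB includes the dA/dB term that the
% Golub/Pereyra differentiation supplies.
x        = xdata(:);  y = ydata(:);
phi      = exp(-x./B);                    % basis vector phi(B)
dphi     = (x./B.^2).*exp(-x./B);         % d(phi)/dB
den      = phi.'*phi;
lincoeff = (phi.'*y)/den;                 % A(B), solved exactly
dA       = ((dphi.'*y)*den - (phi.'*y)*(2*(phi.'*dphi)))/den^2;   % dA/dB
Sig      = lincoeff*phi;                  % projected model
Jacob    = lincoeff*dphi + dA*phi;        % full dSig/dB (column vector)
end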

>
> > 5- If no have you heard about Golub/Pereyra formula?
> >>Yes, however, not from the original paper. I have been looking at a more recent publication, Osborne (2007), "Separable least squares, variable projection, and the Gauss-Newton algorithm", dedicated to Golub for his 75th birthday.

Yep, there are a ton of papers on this topic; whenever linearly separable problems come up, GP's formula and GN's algorithm are often used. I have used them in my free-knot spline FEX submission.
 
Good luck,

Bruno
