Using lsqnonlin on large-scale problems - Question about "JacobMult" option

Hi,
I am trying to use lsqnonlin() on a very large problem set, in which the Jacobian has a size of 31457280x393216 and is sparse. I am able to calculate the Jacobian myself and pass it to lsqnonlin() using
options = optimoptions('lsqnonlin','Jacobian','on')
The problem is that lsqnonlin() is taking a very long time. I stepped through the code to find the bottleneck and tracked it down to the preconditioner step in the trust-region algorithm. That is, the function trdog() calls a preconditioner function, aprecon(), which performs a QR decomposition on the Jacobian. Specifically, it is this section of aprecon() that takes an incredibly long time:
p = colamd(TM);
RPCMTX = qr(TM(:,p));
My question is, will using the JacobMult option of lsqnonlin() avoid this? According to the documentation:
For large-scale structured problems, this function computes the Jacobian matrix product J*Y, J'*Y, or J'*(J*Y) without actually forming J.
which to me sounds like it is only useful for large-scale problems in which the Jacobian is too large to even be stored, which is not the issue in my case. Additionally, I have read this: Jacobian Multiply Function with Linear Least Squares, and am confused about what the Jacobian multiply function is actually doing, and therefore am unsure how to implement it.
To summarize: will the JacobMult option avoid the extremely time-consuming QR decomposition in lsqnonlin() for large-scale problems, and can anyone give a clear demonstration of what the JacobMult option is actually doing and how to implement it?
Thanks.

Accepted Answer

Matt J
Matt J on 3 Feb 2014
Edited: Matt J on 3 Feb 2014
If you use JacobMult, I imagine it will try to solve for the Newton steps iteratively instead of with a QR decomposition, which could be faster, depending on how sparse your Jacobian really is.
Basically, the Newton step amounts to solving a linear equation
J*x = F
where J is your Jacobian. There are iterative algorithms that can do this using only a series of multiplications by J and J.'. In your case, the JacobMult function could be implemented simply as
function W = jmfun(J, Y, flag)
% First argument is Jinfo; here it is simply the sparse Jacobian itself.
if flag == 0
    W = J'*(J*Y);
elseif flag > 0
    W = J*Y;
else % flag < 0
    W = J.'*Y;
end
end
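To hook this into lsqnonlin, you would pass the handle through the options. A sketch, with some assumptions: 'JacobMult' is the option name in 2014-era releases (newer releases call it 'JacobianMultiplyFcn'), and @myObjective and x0 are placeholders for your own residual function and starting point. With JacobMult set, your objective function returns Jinfo (here, the sparse Jacobian itself) as its second output:
% Sketch: register the multiply function; 'Jacobian','on' tells
% lsqnonlin that the objective supplies the Jacobian (Jinfo).
options = optimoptions('lsqnonlin', ...
    'Algorithm', 'trust-region-reflective', ...
    'Jacobian', 'on', ...
    'JacobMult', @jmfun);
x = lsqnonlin(@myObjective, x0, [], [], options);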
  3 Comments
Matt J
Matt J on 3 Feb 2014
Edited: Matt J on 3 Feb 2014
That sounds weird, if it's true. I suppose you could trick aprecon by computing Jinfo as
[m,n]=size(Jacobian);
[i,j,s]=find(Jacobian);
Jinfo={i,j,s,m,n};
and then rebuilding the Jacobian inside jmfun
J=sparse(Jinfo{:});
W=...
You could also perhaps just reshape() the Jacobian into a different-sized matrix Jinfo; undoing that in jmfun might be faster.
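A minimal sketch of that reshape trick (hypothetical; assumes m is even so the reshape is valid, and that m and n reach jmfun somehow, e.g. via an anonymous function such as @(Jinfo,Y,flag) jmfun(Jinfo,Y,flag,m,n)):
% Disguise the true Jacobian so the preconditioner cannot factorize it.
[m,n] = size(Jacobian);
Jinfo = reshape(Jacobian, m/2, 2*n);  % different shape, same nonzeros

function W = jmfun(Jinfo, Y, flag, m, n)
J = reshape(Jinfo, m, n);  % recover the true Jacobian
if flag == 0
    W = J'*(J*Y);
elseif flag > 0
    W = J*Y;
else % flag < 0
    W = J.'*Y;
end
end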
In any case, bear in mind that not using a pre-conditioner could end up slowing convergence to a solution, with no clear net gains in speed.
Shayan
Shayan on 3 Feb 2014
Edited: Shayan on 3 Feb 2014
aprecon() still calculates a preconditioner, but it now appears to compute a diagonal preconditioner from the results of the Jacobian multiply function instead of from a QR decomposition. Passing Jinfo as a cell array produces an error, since lsqnonlin() expects it to be of type double, but I tried your idea of reshaping the Jacobian and undoing it in jmfun. It seems to work: the iterations proceed without getting stuck in the QR decomposition.
Thanks!


More Answers (1)

françois anquez
françois anquez on 9 Apr 2020
Dear Shayan, Dear Matt,
I do not know whether this will be useful given the time that has passed since you posted.
I was facing the same problem.
Something that helped in my case was setting 'SubproblemAlgorithm' to 'cg'.
This seems to considerably reduce the time spent in aprecon, with similar results.
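For reference, a sketch of the options (assumes a release where lsqnonlin supports 'SubproblemAlgorithm'; 'SpecifyObjectiveGradient' is the newer name for the old 'Jacobian' option):
% Keep the user-supplied Jacobian, but solve the trust-region
% subproblem with conjugate gradients instead of factorization.
options = optimoptions('lsqnonlin', ...
    'Algorithm', 'trust-region-reflective', ...
    'SpecifyObjectiveGradient', true, ...
    'SubproblemAlgorithm', 'cg');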
François.
