Gains from providing sparsity patterns of derivatives?

Hi,
I am solving an optimization problem using fmincon with the interior-point method. The problem has about 5000 variables and 5000 nonlinear constraints. By a very large margin, most of the computational time is spent in the routine 'factorKKTmatrix', which I guess does what its name suggests. As the trust-region version of the interior-point method does not appear to be more efficient, an idea that comes to mind to speed things up is to provide sparsity patterns for the Jacobian of the constraints and the Hessian of the Lagrangian (both are very sparse). To my knowledge, however, this is not possible with fmincon. My question is thus whether I can expect significant gains in CPU time from providing said sparsity patterns, making it worthwhile to try other solvers (e.g. Knitro) as well?
Kind regards
Arne

Answers (1)

Matt J on 9 Oct 2012
There would definitely be significant gains from taking advantage of Hessian sparsity, but I don't know why you think that means abandoning FMINCON. The FMINCON documentation describes several options for customizing how the Hessian is computed.
The trust-region-reflective algorithm offers HessPattern (to specify the sparsity pattern, as you mentioned) and HessMult. With HessMult, you can customize multiplication with the Hessian in any way you want, including using a sparse matrix to represent the Hessian.
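For example, here is a minimal sketch of the HessPattern route on a hypothetical bound-constrained problem with a tridiagonal Hessian (not your problem; note that trust-region-reflective requires the objective gradient and does not accept nonlinear constraints):

function x = tridiag_hesspattern_demo
% Sketch: HessPattern with fmincon's trust-region-reflective algorithm on a
% hypothetical bound-constrained problem whose Hessian is tridiagonal.
n  = 5000;
x0 = zeros(n,1);
lb = -ones(n,1);  ub = ones(n,1);

Hstr = spdiags(ones(n,3), -1:1, n, n);   % tridiagonal sparsity pattern

opts = optimset('Algorithm','trust-region-reflective', ...
                'GradObj','on', 'HessPattern', Hstr);

x = fmincon(@objfun, x0, [],[],[],[], lb, ub, [], opts);
end

function [f,g] = objfun(x)
% f(x) = sum((x(2:end)-x(1:end-1).^2).^2) + sum((x-1).^2), gradient supplied
d = x(2:end) - x(1:end-1).^2;
f = sum(d.^2) + sum((x-1).^2);
g = 2*(x-1);
g(1:end-1) = g(1:end-1) - 4*x(1:end-1).*d;
g(2:end)   = g(2:end)   + 2*d;
end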
In the interior-point algorithm, you have both HessMult and HessFcn, which allow similar customizations.
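A minimal sketch of the HessFcn route with the interior-point algorithm (again a hypothetical objective and constraint), returning the Hessian of the Lagrangian as a sparse matrix:

function x = sparse_hessfcn_demo
% Sketch: supplying a sparse Hessian of the Lagrangian to fmincon's
% interior-point algorithm via the HessFcn option (hypothetical problem).
n  = 5000;
x0 = 0.5*ones(n,1);

opts = optimset('Algorithm','interior-point', ...
                'GradObj','on', 'GradConstr','on', ...
                'Hessian','user-supplied', 'HessFcn', @hessfcn);

x = fmincon(@objfun, x0, [],[],[],[],[],[], @confun, opts);
end

function [f,g] = objfun(x)
% f(x) = 0.5*||x||^2, gradient supplied
f = 0.5*(x.'*x);
g = x;
end

function [c,ceq,gc,gceq] = confun(x)
% single nonlinear inequality x'*x - 1 <= 0, gradient as one column per constraint
c    = x.'*x - 1;
ceq  = [];
gc   = 2*x;
gceq = [];
end

function H = hessfcn(x,lambda)
% Hessian of the Lagrangian, returned SPARSE:
%   H = d2f + lambda.ineqnonlin(1)*d2c = I + 2*lambda*I
n = numel(x);
H = (1 + 2*lambda.ineqnonlin(1))*speye(n);
end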
  9 Comments
Arne Vagren on 10 Oct 2012
Ok, maybe somebody else out there knows?
Regarding your question, it seems that if one does not supply sparsity patterns, Knitro treats the Jacobian and Hessian as dense matrices (even if my functions return them as sparse), making things very slow for anything but very small problems. Unless there is some setting in Knitro that I'm missing, the comparison you suggest can't really be done.
Steve Grikschat on 12 Nov 2012
Knitro is compiled, while fmincon is written primarily in MATLAB. Better memory management (as you mentioned) and faster calls to external libraries such as MA57 generally lead to faster execution.
The sparsity pattern of the Hessian or Jacobian is not used in the interior-point algorithm of fmincon. For Knitro, it is only used for memory allocation purposes.
Some algorithms, like the trust-region-reflective algorithm in fmincon, use a sparsity pattern to perform sparse finite differences for Jacobian/Hessian approximations. However, the trust-region-reflective algorithm doesn't solve problems with nonlinear constraints.
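Purely as an illustration of why a sparsity pattern helps finite differencing (a conceptual sketch of column grouping, not fmincon's internal code): with a bidiagonal Jacobian pattern, two perturbed evaluations recover the whole Jacobian, independent of n.

% Conceptual sketch of sparse finite differencing via column grouping.
% Hypothetical map R^n -> R^n with a bidiagonal Jacobian: two groups of
% columns suffice, regardless of n.
n  = 8;
f  = @(x) [x(1); x(2:end).^2 + x(1:end-1)];
x  = rand(n,1);  fx = f(x);  h = 1e-7;
Jpattern = spdiags(ones(n,2), [-1 0], n, n);   % known sparsity pattern
groups   = mod(0:n-1, 2) + 1;                  % 2-coloring of the columns
J = spalloc(n, n, nnz(Jpattern));
for g = 1:max(groups)
    d  = h*(groups == g).';                    % perturb all columns in group g at once
    df = (f(x + d) - fx)/h;
    for c = find(groups == g)
        rows = find(Jpattern(:,c));            % rows hit only by column c within this group
        J(rows, c) = df(rows);
    end
end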
