fsolve and GPU Computation

Sven on 16 Sep 2017
Edited: Matt J on 17 Sep 2017
Can fsolve be used with GPU computation, or can it profit from GPU computation internally?
If not, is it known whether fsolve will be made usable with GPU computation in the future?
Does it use the GPU automatically under certain conditions?
Or is there a way I could modify fsolve so that it runs on a GPU?
I am asking because I have multidimensional equations on substantial grids that should be highly parallelizable. I could try to parallelize within the function which is to be solved, but I suppose it should be more efficient if GPU usage can be established at the fsolve level.
Thank you in advance for any advice.

Answers (1)

Matt J on 16 Sep 2017
Edited: Matt J on 16 Sep 2017
I could try to parallelize within the function which is to be solved, but I suppose it should be more efficient if GPU usage can be established at the fsolve level.
No. The greatest benefit comes from GPU-optimizing your objective function and Jacobian calculations. The heavy internal computations done by FSOLVE are mainly linear equation solving and other matrix algebra operations, and those benefit most when you compute your Jacobian in sparse form, if applicable.
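A minimal sketch of the sparse-Jacobian route. The tridiagonal system below is a made-up placeholder, not the asker's problem; the point is that `SpecifyObjectiveGradient` lets the objective return the Jacobian as a sparse second output, which fsolve's internal linear algebra can then exploit:

```matlab
function sparseJacobianDemo
    % Illustrative n-point grid problem with a tridiagonal Jacobian
    n  = 1000;
    x0 = zeros(n, 1);
    opts = optimoptions('fsolve', ...
        'SpecifyObjectiveGradient', true);   % fun returns [F, J]
    x = fsolve(@fun, x0, opts);
end

function [F, J] = fun(x)
    n = numel(x);
    e = ones(n, 1);
    % Discrete Laplacian plus a mild nonlinearity (placeholder system)
    A = spdiags([e, -2*e, e], -1:1, n, n);
    F = A*x + 0.1*x.^3 - 1;
    if nargout > 1
        % Sparse Jacobian: A plus the diagonal derivative of 0.1*x.^3
        J = A + spdiags(0.3*x.^2, 0, n, n);
    end
end
```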
If you are using the trust-region algorithm, then you can also use the 'JacobianMultiplyFcn' option. You can implement that with your own gpuArray operations, but I think sparsity, where it can be applied, will have more of an impact.
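A sketch of the 'JacobianMultiplyFcn' route, assuming the trust-region algorithm and the Parallel Computing Toolbox for gpuArray; the diagonal system is again a placeholder. With this option set, the objective's second output is passed through to the multiply function as `Jinfo`, and `flag` tells you which product fsolve wants:

```matlab
function jacMultDemo
    n = 1000;
    opts = optimoptions('fsolve', ...
        'Algorithm', 'trust-region', ...
        'SpecifyObjectiveGradient', true, ...
        'JacobianMultiplyFcn', @jmfun);
    x = fsolve(@fun, ones(n, 1), opts);
end

function [F, Jinfo] = fun(x)
    % Placeholder system with a diagonal Jacobian; Jinfo holds just
    % enough data for jmfun to form Jacobian products
    F     = x.^3 - 1;
    Jinfo = 3*x.^2;                  % diagonal of the Jacobian
end

function W = jmfun(Jinfo, Y, flag)
    d = gpuArray(Jinfo);             % run the multiplies on the GPU
    if flag > 0
        W = gather(d .* Y);          % J*Y
    elseif flag < 0
        W = gather(d .* Y);          % J'*Y (J is symmetric here)
    else
        W = gather(d.^2 .* Y);       % J'*(J*Y)
    end
end
```

For a diagonal Jacobian this is overkill, of course; the pattern pays off when J*Y can be formed on the GPU much faster than a full Jacobian could be built and factored on the CPU.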
  3 Comments
Matt J on 17 Sep 2017
It would be better if fsolve allowed you to return gpuArrays from the objective function code. That way there would be no need to do any CPU-GPU transfers. On the other hand, you would have to have a huge number of equations for the transfer of the objective function vector to significantly slow you down.
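A sketch of the workaround this implies today: do the heavy elementwise work on the GPU inside the objective, then gather() so fsolve receives an ordinary double array. This assumes the Parallel Computing Toolbox, and the system F is only an illustrative placeholder:

```matlab
function F = gpuObjective(x)
    xg = gpuArray(x);              % one CPU-to-GPU transfer in
    Fg = xg.^2 - sin(xg) - 1;      % elementwise work runs on the GPU
    F  = gather(Fg);               % one GPU-to-CPU transfer back out
end
```

Used as, e.g., `x = fsolve(@gpuObjective, zeros(1000,1))`. The gather() at the end is exactly the transfer discussed above, which only becomes a bottleneck for very large equation counts.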


