Can fsolve be used with GPU computation, or can it internally benefit from a GPU?
If not, is it known whether GPU support for fsolve is planned?
Does it automatically use the GPU under certain conditions?
Or is there a way I could modify fsolve so that it runs on a GPU?
I am asking because I solve multidimensional equations on substantial grids, which should in principle be highly parallelizable. I could parallelize within the function being solved, but I suspect it would be more efficient if GPU usage could be established at the fsolve level itself.
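For context, here is a minimal sketch of what I mean by "parallelizing within the function". The problem and grid size are made up for illustration: a finite-difference discretization of a Bratu-type boundary-value problem, where the residual is fully vectorized. In a setup like this, the array operations inside the residual could presumably be moved to a GPU array library such as CuPy, while fsolve itself (which wraps MINPACK, a CPU Fortran code) keeps exchanging NumPy arrays with the host on every call:

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative problem: u''(x) + exp(u(x)) = 0 on (0, 1), u(0) = u(1) = 0,
# discretized with central differences on an interior grid of n points.
n = 100
h = 1.0 / (n + 1)

def residual(u):
    # Fully vectorized residual: this is the part that could be offloaded
    # to the GPU (e.g. by swapping NumPy for CuPy), while fsolve still
    # drives the iteration on the CPU and passes host arrays back and forth.
    upad = np.concatenate(([0.0], u, [0.0]))  # apply boundary conditions
    return (upad[2:] - 2 * upad[1:-1] + upad[:-2]) / h**2 + np.exp(u)

u0 = np.zeros(n)          # initial guess
sol = fsolve(residual, u0)
print(np.max(np.abs(residual(sol))))  # residual norm at the solution
```

My concern with this approach is the host-device transfer at every residual (and finite-difference Jacobian) evaluation, which is why I am asking whether the solver iteration itself can live on the GPU.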
Thank you in advance for any advice.