lsqnonlin and backpropagation?

I would like to build a simple neural network for nonlinear regression, but I don't have the NN toolbox. I noticed that the Optimization Toolbox function lsqnonlin(fun,x0) has an option to use the Levenberg-Marquardt algorithm. However, I'm not sure what lsqnonlin actually does. If I use lsqnonlin with a cost function J (MSE, for example) as the input function, isn't that just a feed-forward network? Or is lsqnonlin actually performing backpropagation to update the weights? If not, is it possible to combine lsqnonlin with backpropagation: can I somehow get lsqnonlin to use my backpropagation gradients in the optimization process?

Answers (1)

Matt J
Matt J on 28 Dec 2016
Edited: Matt J on 29 Dec 2016
Or is lsqnonlin actually performing the backpropagation to update the weights?...If not, is it possible to use lsqnonlin and perform backpropagation:
My background is not in neural networks, but if I am not mistaken, "backpropagation" is NN terminology for steepest descent? Levenberg-Marquardt is not steepest descent. The internals of what it does are detailed here. I'm not sure that it matters, for practical purposes, whether the optimization algorithm is traditional NN backpropagation or something else. The bottom line is that lsqnonlin will minimize the cost function and achieve the same thing.
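One point worth noting when going this route: lsqnonlin expects your function to return the vector of residuals, not a scalar MSE; it forms the sum of squares internally. A minimal, untested sketch of what that might look like for a tiny one-hidden-layer tanh network (the data, network size, and weight packing here are all made-up placeholders, not anything from the original thread):

```matlab
% Sketch (untested): fit a 1-hidden-layer tanh network with lsqnonlin.
x = linspace(-1, 1, 50);           % inputs (1 x 50), placeholder data
t = sin(pi*x) + 0.05*randn(1,50);  % noisy targets, placeholder data
nh = 5;                            % number of hidden units (arbitrary)

% Pack all weights into one vector w = [W1; b1; W2'; b2]
net = @(w, x) w(2*nh+1:3*nh)' * tanh(w(1:nh)*x + w(nh+1:2*nh)) + w(end);

% lsqnonlin wants a VECTOR of residuals, not a scalar MSE --
% it computes sum(res(w).^2) internally.
res = @(w) net(w, x) - t;

w0   = 0.1*randn(3*nh + 1, 1);
opts = optimoptions('lsqnonlin', 'Algorithm', 'levenberg-marquardt');
w    = lsqnonlin(res, w0, [], [], opts);
```

Note the empty `[]` arguments for the bounds: the Levenberg-Marquardt algorithm in lsqnonlin does not accept bound constraints.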
I do think, however, that lsqcurvefit would be slightly more appropriate than lsqnonlin for this problem, because the error is linear in the network output data.
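With lsqcurvefit the residual subtraction is handled for you: you pass the model, the input data, and the target data separately. An untested sketch, using an illustrative one-unit placeholder model rather than a real network:

```matlab
% Sketch (untested): lsqcurvefit forms model(w,x) - t internally.
model = @(w, x) w(3)*tanh(w(1)*x + w(2)) + w(4);  % tiny placeholder "network"
x = linspace(-1, 1, 50);
t = tanh(2*x - 0.5);                              % synthetic targets
w = lsqcurvefit(model, 0.1*randn(4,1), x, t);
```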
can I somehow get lsqnonlin to use my backpropagation gradients for the optimization process?
By "backpropagation gradients", do you mean the Jacobian of the network output with respect to the weights? If so, then you can indeed feed them to lsqnonlin using the 'SpecifyObjectiveGradient' option, or the 'Jacobian' option, depending on how recent your MATLAB version is.
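Concretely, you return the Jacobian of the residual vector as a second output of your objective function. An untested sketch with a placeholder one-unit model (the model and sizes are illustrative, not from the thread):

```matlab
% resjac.m -- residuals and Jacobian for r(w) = w(2)*tanh(w(1)*x) - t
function [r, J] = resjac(w, x, t)
h = tanh(w(1)*x);
r = w(2)*h - t;                  % residual vector, n x 1
J = [w(2)*(1 - h.^2).*x, h];     % Jacobian dr/dw, n x 2
end
```

Then tell lsqnonlin that your function supplies the Jacobian:

```matlab
x = linspace(-1, 1, 20)';  t = tanh(3*x);   % placeholder data
opts = optimoptions('lsqnonlin', 'SpecifyObjectiveGradient', true);
% In older releases, use: optimset('Jacobian', 'on') instead.
w = lsqnonlin(@(w) resjac(w, x, t), [1; 1], [], [], opts);
```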

