Aydin Buluc

Comments and Ratings by Aydin Buluc
26 Jan 2011: Adaptive Robust Numerical Differentiation by John D'Errico (numerical derivative of an analytically supplied function; also gradient, Jacobian & Hessian)

Hi John,
Thanks much for the quick response.
jacobianest was indeed what I needed; I should have looked into the whole package first.
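
For anyone finding this later, the call I had in mind looks something like this (A and x0 are made-up stand-ins; I'm assuming the two-output [jac, err] = jacobianest(fun, x0) form from the package):

A = rand(3, 5);                  % made-up linear map
x0 = rand(5, 1);
f = @(x) A*x;                    % vector-valued function of a vector
[jac, err] = jacobianest(f, x0); % jac should recover A up to numerical error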

I was looking at the hessian function, which doesn't allow me to use a vector function (perhaps understandably, as the output would be a tensor otherwise).

Proceeding with the matrix-vector example
f = @(x) A*x;
should I loop over the rows of A and call hessian m times to get all the Hessians?

for i = 1:m
    fi = @(x) A(i,:)*x;                 % returns a scalar
    [H{i}, Herr{i}] = hessian(fi, x0);  % each Hessian is n-by-n, so collect in cells
end
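
(If a cell array is awkward downstream, something like cat(3, H{:}) should stack the results into an n-by-n-by-m array.)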

Thanks,
- Aydin

25 Jan 2011: Adaptive Robust Numerical Differentiation by John D'Errico (numerical derivative of an analytically supplied function; also gradient, Jacobian & Hessian)

Hi John,
Thanks for the lovely tool.
I couldn't figure out how to differentiate a vector-valued function with it, though. For example,
f = @(x) A*x

where A is a matrix and x is a vector.

Your code seems to do the following:

% Loop over the elements of x0, reducing it to
% a scalar problem. Sorry, vectorization is not
% complete here, but this IS only a single loop.
der = zeros(nx0);
errest = der;
finaldelta = der;
for i = 1:n
...
end

whereas A*x doesn't really reduce to a single scalar problem, as far as I can see. Any suggestions?
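
The only workaround I can think of is to treat each output component as its own scalar problem and build the Jacobian row by row with gradest from the same package (just a sketch, with A and x0 as stand-ins for my actual problem):

m = size(A, 1);
J = zeros(m, numel(x0));
for i = 1:m
    fi = @(x) A(i,:)*x;        % i-th component, a scalar function of x
    J(i,:) = gradest(fi, x0);  % its gradient is the i-th row of the Jacobian
end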
