Path: news.mathworks.com!not-for-mail
From: <HIDDEN>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Non-linear optimization
Date: Wed, 6 Mar 2013 20:00:08 +0000 (UTC)
Organization: Xoran Technologies
Lines: 22
Message-ID: <kh8788$q8e$1@newscl01ah.mathworks.com>
References: <kh2m44$4eh$1@newscl01ah.mathworks.com> <kh2ni9$9ar$1@newscl01ah.mathworks.com> <kh3p76$mhd$1@newscl01ah.mathworks.com> <kh52bl$jbp$1@newscl01ah.mathworks.com> <kh5at7$iv2$1@newscl01ah.mathworks.com> <kh7ljt$oss$1@newscl01ah.mathworks.com> <kh7njk$2io$1@newscl01ah.mathworks.com> <kh84cj$ge4$1@newscl01ah.mathworks.com>
Reply-To: <HIDDEN>
NNTP-Posting-Host: www-03-blr.mathworks.com
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: newscl01ah.mathworks.com 1362600008 26894 172.30.248.48 (6 Mar 2013 20:00:08 GMT)
X-Complaints-To: news@mathworks.com
NNTP-Posting-Date: Wed, 6 Mar 2013 20:00:08 +0000 (UTC)
X-Newsreader: MATLAB Central Newsreader 1440443
Xref: news.mathworks.com comp.soft-sys.matlab:790463

"Toan Cao" <toancv3010@gmail.com> wrote in message <kh84cj$ge4$1@newscl01ah.mathworks.com>...
>
> Hi Matt J,
> For my case, each point has its own transformation described by a rotation matrix and a translation vector. Actually, my cost function has another term (a third term) which constrains the movement of neighboring points to be smooth (the equation of the third term is complex; it is not easy for me to write it in plain text).
===================

Even with a third smoothing term for neighboring points, it makes no sense to apply both a rotation and a translation to a single point. A translation alone is enough to specify the movement of a point to a new location. There is no reason to give the movement of a point 6 degrees of freedom when 3 are enough, and the redundant parameters will ill-condition the optimization.
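To see the redundancy concretely: whatever rotation R and translation t you apply to a point p, the pure translation d = R*p + t - p moves p to exactly the same place. A quick numerical check (the values below are arbitrary, just for illustration):

 p = rand(3,1);                        % the point
 theta = pi/7;
 R = [cos(theta) -sin(theta) 0; sin(theta) cos(theta) 0; 0 0 1];
 t = rand(3,1);
 d = R*p + t - p;                      % equivalent pure translation
 norm((R*p + t) - (p + d))             % zero: R adds nothing per-point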


> I read a paper on finding the minimum of this cost function. It mentioned that it applied Levenberg-Marquardt to obtain the minimum. Following this direction, I am now stuck in finding the Jacobian of f(x) (where F(x) = f(x)'*f(x)).
> I think you are an expert in math; I hope you can give me some suggestions.
=============

All of the cost function terms that you've shown us so far are squared terms. Unless the third term is not of this form, it's not clear what trouble you're having writing your cost function as F(x) = f(x)'*f(x).
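As a sketch of what I mean (r1, r2, r3 below are placeholders for the residual vectors of your individual terms, and x0 for your starting guess): if every term is a sum of squares, you just stack the residuals into one vector f(x) and hand it to LSQNONLIN, which runs Levenberg-Marquardt for you.

 % minimal sketch, assuming each term of the cost is sum(rk(x).^2)
 fvec = @(x) [r1(x); r2(x); r3(x)];          % stacked residual vector f(x)
 opts = optimset('Algorithm','levenberg-marquardt');
 x = lsqnonlin(fvec, x0, [], [], opts);      % minimizes fvec(x)'*fvec(x)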

If the third term is not a sum of squares, we need to see it in order to know how to deal with it. I already gave you a suggestion, which requires a lower bound f_low (not necessarily a tight bound). Not showing us the full cost function makes it hard to advise you on how to find this lower bound.
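For concreteness, one standard way to use such a bound (g and f_low below are placeholders for your non-square third term and its lower bound): since g(x) - f_low >= 0, the quantity sqrt(g(x) - f_low) is real and can simply be appended to the residual vector, and dropping the constant f_low does not change the minimizer.

 % sketch only: fold a bounded-below term g(x) into the residual vector
 r_extra = @(x) sqrt(g(x) - f_low);          % well-defined because g(x) >= f_low
 fvec    = @(x) [r1(x); r2(x); r_extra(x)];  % r1, r2 as before, g replaced by its square root
 x = lsqnonlin(fvec, x0);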

Also, as Bruno hinted, Levenberg-Marquardt is really applicable to any twice-differentiable cost function, not just those that are sums of squares. The parameter update step from x_n to x_{n+1} is obtained by solving

 (Hessian(x_n) + lambda*eye(N)) * (x_{n+1} - x_n) = -gradient(x_n)

However, you would have to code that yourself, including the adaptive selection of the lambda parameter.
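A bare-bones version of that loop might look like the following (costfun and x0 are placeholders: costfun is assumed to return the cost, gradient, and Hessian of your full F(x), and a real implementation needs more safeguards than this sketch has):

 % minimal damped-Newton / Levenberg-Marquardt-style loop
 % [F, g, H] = costfun(x) returns cost, gradient, Hessian at x
 x = x0;
 lambda = 1e-3;
 for iter = 1:200
     [F, g, H] = costfun(x);
     if norm(g) < 1e-6, break; end
     step = -((H + lambda*eye(numel(x))) \ g);
     if costfun(x + step) < F          % step accepted: relax the damping
         x = x + step;
         lambda = lambda/10;
     else                              % step rejected: increase the damping
         lambda = lambda*10;
     end
 end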