From: <HIDDEN>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Quadratic Cost Function x^T Q x
Date: Thu, 20 May 2010 21:47:14 +0000 (UTC)
Organization: Xoran Technologies
Lines: 18
Message-ID: <ht4al2$jcc$>
References: <ht3lnt$92i$> <ht3oj8$gpr$> <ht3tl0$mbi$> <ht3vtl$o1d$> <ht41ak$at$> <ht42pg$8he$> <ht495s$eq2$>
Reply-To: <HIDDEN>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: 1274392034 19852 (20 May 2010 21:47:14 GMT)
NNTP-Posting-Date: Thu, 20 May 2010 21:47:14 +0000 (UTC)
X-Newsreader: MATLAB Central Newsreader 1440443
Xref: comp.soft-sys.matlab:637824

"Jason" <> wrote in message <ht495s$eq2$>...
> "Roger Stafford" <> wrote in message 

> Think of the vector x as a line in geometrical 2-D space. I want to estimate the parameters of this line, i.e. x(1)*x + x(2)*y + x(3).
> Pardon my ignorance, but even without placing any constraints on x you get a non-trivial solution (i.e. not x = 0).

But the cost function for this estimation problem bears no resemblance to the one you posed.

> My main question is if I can use something different than lsqnonlin which takes a very long time to run and is prone to fall into suboptimal minima.

You might try running lsqnonlin with the Algorithm option set to 'levenberg-marquardt'. My intuition about the default trust-region method is that, even if you initialize in the correct capture basin, the algorithm can crawl out of it into something less optimal. Levenberg-Marquardt, again by my intuition, may be more basin-preserving. Also, if you supply an analytical Jacobian then, according to doc lsqnonlin, each iteration can be computationally cheaper.
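A minimal sketch of that setup, where residFun and x0 stand in for your own residual function and initial guess:

```matlab
% residFun should return the residual vector F, and (since Jacobian
% is 'on') the Jacobian J as a second output.
opts = optimset('Algorithm','levenberg-marquardt', ...
                'Jacobian','on');
% levenberg-marquardt does not handle bound constraints,
% so pass [] for lb and ub.
x = lsqnonlin(@residFun, x0, [], [], opts);
```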

All of this assumes that you have at least a reasonable guess of the initial solution, but there's no escaping that when it comes to non-convex minimization.