Path: news.mathworks.com!not-for-mail
From: <HIDDEN>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Quadratic Cost Function x^T Q x
Date: Sat, 22 May 2010 23:42:04 +0000 (UTC)
Organization: Imperial College London
Lines: 79
Message-ID: <ht9q4c$jbj$1@fred.mathworks.com>
References: <ht3lnt$92i$1@fred.mathworks.com> <ht3oj8$gpr$1@fred.mathworks.com> <ht3tl0$mbi$1@fred.mathworks.com> <ht3vtl$o1d$1@fred.mathworks.com> <ht41ak$at$1@fred.mathworks.com> <ht42pg$8he$1@fred.mathworks.com> <ht495s$eq2$1@fred.mathworks.com> <ht4al2$jcc$1@fred.mathworks.com> <ht4bq9$4t9$1@fred.mathworks.com> <ht578a$n72$1@fred.mathworks.com> <ht5ngv$mot$1@fred.mathworks.com> <ht5pkg$8o4$1@fred.mathworks.com> <ht5qeo$26u$1@fred.mathworks.com> <ht5si9$ipb$1@fred.mathworks.com> <ht624t$p6b$1@fred.mathworks.com> <ht68q6$mcu$1@fred.mathworks.com> <ht8tug$sjj$1@fred.mathworks.com>
Reply-To: <HIDDEN>
NNTP-Posting-Host: webapp-02-blr.mathworks.com
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: fred.mathworks.com 1274571724 19827 172.30.248.37 (22 May 2010 23:42:04 GMT)
X-Complaints-To: news@mathworks.com
NNTP-Posting-Date: Sat, 22 May 2010 23:42:04 +0000 (UTC)
X-Newsreader: MATLAB Central Newsreader 1192337
Xref: news.mathworks.com comp.soft-sys.matlab:638391

"Matt J " <mattjacREMOVE@THISieee.spam> wrote in message <ht8tug$sjj$1@fred.mathworks.com>...
> "Jason" <jf203@ic.ac.uk> wrote in message <ht68q6$mcu$1@fred.mathworks.com>...
> 
> > Well, it depends on how you define your cost function. lsqnonlin will automatically sum the squares; other methods might not.
> =====================
> 
> That's true, but the cost function you've defined here, and the only one we've been talking about in this thread, is the one used by lsqnonlin. This is for good reason. It would be highly non-standard and take considerable ingenuity to find a simpler cost function for this root-finding problem than what lsqnonlin uses. 
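> 
> For reference, the residual vector that lsqnonlin squares and sums could be written like this (an untested sketch, assuming Q is a cell array of 3x3 matrices, as below):
> 
>   f = @(x) cellfun(@(Qi) x'*Qi*x, Q);  % residuals f_i(x) = x'*Q{i}*x
>   % lsqnonlin(f, x0) then minimizes sum(f(x).^2)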
> 
>  
> > In any case, Bruno, I wanted to ask you something else.
> ==============
> 
> Not Bruno. Me.
> 
> 
> > I did what you said, i.e. setting x_3 = 1 and x_3 = 0 and then computing two different cost functions.
> >
> > Also you said we can set x_1 = t * cosd(alpha) and x_2 = t * sind(alpha).
> > 
> > This means that x_1 = x_2 * cosd(alpha)/sind(alpha), right?
> ================
> 
> This manipulation requires a division operation that is only valid for t ~= 0 and sind(alpha) ~= 0 (i.e. alpha not equal to 0 or 180). 
> 
> It was not my intention that you do this. My intention was that you substitute both
> x_1 = t * cosd(alpha) and x_2 = t * sind(alpha)
> into the cost function so that, with x_3 and alpha held fixed, the cost function would become a function of t, not x_2.
> 
> The result is virtually the same - you end up with a 1D 4th order polynomial to minimize.
> However, in your approach, you have to worry about beta being infinite...
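> 
> Concretely, for fixed alpha and x_3 you can build the quartic in t with something like this (an untested sketch, assuming Q is a cell array of 3x3 matrices):
> 
>   u = [cosd(alpha); sind(alpha); 0];
>   v = [0; 0; x3];                       % x3 is the fixed value of x_3, so x = t*u + v
>   p = zeros(1,5);                       % coefficients of the quartic J(t)
>   for i = 1:numel(Q)
>       a = u'*Q{i}*u;                    % t^2 coefficient of x'*Q{i}*x
>       b = u'*Q{i}*v + v'*Q{i}*u;        % t^1 coefficient
>       c = v'*Q{i}*v;                    % constant term
>       p = p + conv([a b c], [a b c]);   % add (a*t^2 + b*t + c)^2
>   end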
> 
> 
> > So if we set beta = cosd(alpha)/sind(alpha) then we have in the first case (x_3 = 0):
> > 
> > J_1(x_2) = [beta*x_2 x_2 0] * Q * [beta*x_2 x_2 0]^T
> > 
> > If I do the multiplications correctly, that gives me a polynomial of degree 2?! Of course lsqnonlin will sum the squares, which would make this degree 4, but what I am saying is correct, right?
> ===================
> 
> You shouldn't be using lsqnonlin for this. lsqnonlin does not know how to minimize anything analytically, even a simple 1D polynomial. You should be writing your own routine to do this.
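> 
> Such a routine is only a few lines; a sketch (untested) that minimizes a 1D polynomial with coefficient vector p analytically:
> 
>   tc = roots(polyder(p));              % stationary points
>   tc = tc(imag(tc) == 0);              % keep the real ones
>   [Jbest, k] = min(polyval(p, tc));    % evaluate and pick the minimum
>   tbest = tc(k);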
> 
> To keep this discussion clear, it would also be a good idea if you stopped treating lsqnonlin as the thing that's defining the cost function. WE are defining the cost function for this problem, and it is the multi-variable 4th-order polynomial
> 
> J(x) = sum_i (x' * Q{i} * x)^2
> 
> There is no obvious, more tractable alternative to this cost function for the task you've described.
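> 
> In MATLAB terms (a sketch, same cell-array assumption as above):
> 
>   J = @(x) sum(cellfun(@(Qi) (x'*Qi*x)^2, Q));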
> 
> 
> > Also you said I should compute J_1 for each value of alpha from 0:0.1:180, which would make the problem about 1800 iterations for finding x_2. I can then simply find x_1 by computing beta*x_2, and we know that x_3 in this case is 0. Repeat all this for the case that x_3 is 1, which means 3600 iterations in total.
> ===============
> 
> One other thing: it has occurred to me that you don't really have to do this for the case x_3 = 0.
> 
> In this case, the remaining unknowns x_1 and x_2 can also be normalized with impunity, so that x_1 = 1 or x_2 = 1, as long as one of these is non-zero (the cost is homogeneous, J(c*x) = c^4 * J(x), so only the direction of x matters). So, you can subdivide the case x_3 = 0 into 2 more subcases:
> 
> (1) x_1=1, x_3=0
> (2) x_2=1, x_3=0
> 
> In each of these sub-cases, the cost function reduces to a 1D polynomial again, which you can minimize trivially.
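> 
> For sub-case (1), for instance, x = [1; s; 0] with s = x_2 the only unknown, and the quartic in s follows the same pattern (untested sketch, same Q assumption):
> 
>   p = zeros(1,5);
>   for i = 1:numel(Q)
>       a = Q{i}(2,2);                   % s^2 coefficient of x'*Q{i}*x
>       b = Q{i}(1,2) + Q{i}(2,1);       % s^1 coefficient
>       c = Q{i}(1,1);                   % constant term
>       p = p + conv([a b c], [a b c]);  % add the squared residual
>   end
>   % then minimize the quartic p analytically, as above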

Matt, yes, sorry about this!

Re:
> My intention was that you substitute both
> x_1 = t * cosd(alpha) and x_2 = t * sind(alpha)
> into the cost function

Yes, I just realized that before you answered. Got you now!

However, I do not understand your last point:

> One other thing: it has occurred to me that you don't really have to do this for the case x_3 = 0.
> 
> In this case, the remaining unknowns x_1 and x_2 can also be normalized with impunity, so that x_1 = 1 or x_2 = 1, as long as one of these is non-zero. So, you can subdivide the case x_3 = 0 into 2 more subcases.

Could you elaborate on this a little, please? I'm not sure I get you. Are you basically saying that when x_3 = 0, then x_1 and x_2 ~= 0?

I don't really understand.

Thanks!