From: <HIDDEN>
Newsgroups: comp.soft-sys.matlab
Subject: Re: Question on the derivative/calculus of a 2-norm matrix. Thanks a lot
Date: Mon, 26 Jul 2010 03:33:05 +0000 (UTC)
Organization: Anhui University
Lines: 23
Message-ID: <i2ivlh$dka$>
References: <i2he90$21a$> <i2iran$deb$> <i2isv7$pi7$> <i2itv4$rr8$>
Reply-To: <HIDDEN>
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
X-Trace: 1280115185 13962 (26 Jul 2010 03:33:05 GMT)
NNTP-Posting-Date: Mon, 26 Jul 2010 03:33:05 +0000 (UTC)
X-Newsreader: MATLAB Central Newsreader 2419503
Xref: comp.soft-sys.matlab:656064

"Matt J " <mattjacREMOVE@THISieee.spam> wrote in message <i2itv4$rr8$>...
> "Antony " <> wrote in message <i2isv7$pi7$>...
> > But, according to the chain rule, may I apply it to f(X)=||KX-B||^0.6 and obtain the derivative as 0.6*K.'*(K*X-B)^{-0.4}? 
> ======================
> No, this wouldn't be the correct expression. From my last post, I get, after some simplification
> Gradient = 0.6*K.'*(K*X-B)/||K*X-B||^(1.4)
> >This result seems rather complex for some numerical optimization.
> =======================
> Well, your objective function f(X)=||KX-B||^0.6 is unusually complex...
> For one thing, this function is not differentiable at points where K*X=B, which means that if the minimum lies there, you cannot use gradient-based approaches to find it.
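For what it's worth, the gradient expression above can be checked against a central finite-difference approximation. A minimal sketch (the problem sizes and random data below are purely illustrative, not from the original thread):

```matlab
% Numerically verify the gradient of f(X) = norm(K*X - B)^0.6
% at a point where K*X ~= B (the function is not differentiable at K*X = B).
rng(0);
K = randn(5,3);  B = randn(5,1);  X = randn(3,1);

f = @(X) norm(K*X - B)^0.6;

% Analytic gradient via the chain rule: 0.6*K.'*(K*X-B)/||K*X-B||^1.4
r = K*X - B;
g = 0.6 * (K.' * r) / norm(r)^1.4;

% Central finite differences, one coordinate at a time
h = 1e-6;
g_fd = zeros(size(X));
for i = 1:numel(X)
    e = zeros(size(X));  e(i) = h;
    g_fd(i) = (f(X + e) - f(X - e)) / (2*h);
end

max(abs(g - g_fd))   % should be tiny if the analytic gradient is right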

Dear Matt, thanks a lot for your time on my question. I appreciate your help! I now understand the difficulties of this type of optimization problem. This might be why papers often develop other efficient solutions to such non-convex problems. Thanks again!

Also, thanks a lot to everyone else for their kind and patient help, especially Roger Stafford and Brian Borchers.