"Michal Kolaj" wrote in message <jttd71$6fi$1@newscl01ah.mathworks.com>...
> Hello All,
>
> Looking for some help regarding some linear algebra in Matlab (Linear Least Square Inversion).
>
> I have a problem of the form Ax=b+e where A is full rank and sparse and e is some error. I also have a regularization matrix (R) and a vector of standard-deviation errors, w.
> I am trying to minimize:
>
> (b - A*x)'*diag(1/w)*(b - A*x) + y*(x'*R')*(R*x)
>
> (i.e. least square with regularization and known error (weights) where y is the scaling factor for the regularization component)
>
> This should be equal to solving the normal equation:
>
> x = inv(A'*diag(1/w)*A + y*(R'*R))*A'*diag(1/w)*b
>
> I read somewhere that computing an explicit inverse is not the best way, so it would be better to reformulate the above problem as (Ax=b):
>
> x = [A'*diag(1/w)*A + y*(R'*R)] \ [A'*diag(1/w)*b]
>
> But then I read elsewhere that it's better to avoid the normal equations altogether, which would make the solution something like:
>
> x=[A*diag(1/w);yR]\[b*diag(1/w);zeros]
>
> However, the solution from the normal equations and the one above do not match. I would be glad if someone could point out my error.
>
> Thanks in advance.
In addition to what Bruno has said: IF your matrix
is sparse, then do NOT use diag to build the diagonal
weighting matrix, because diag returns a full matrix
and throws away the sparsity.
Use spdiags instead.
John
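For what it's worth, a sketch of both formulations using spdiags, assuming A, b, R, w, and y as defined in your post (the names W, Ws, x1, x2 are mine). Note that in the stacked form the square roots of the weights must multiply A and b on the LEFT, and R is scaled by sqrt(y), so that its normal equations reproduce the regularized system:

```matlab
m = size(A,1);

% Sparse diagonal weight matrices -- diag(1./w) here would be full.
W  = spdiags(1./w, 0, m, m);        % inverse-variance weights
Ws = spdiags(1./sqrt(w), 0, m, m);  % square roots of the weights

% Normal-equation form (backslash, no explicit inverse):
x1 = (A'*W*A + y*(R'*R)) \ (A'*W*b);

% Augmented (stacked) least-squares form:
x2 = [Ws*A; sqrt(y)*R] \ [Ws*b; zeros(size(R,1),1)];
```

Since (Ws*A)'*(Ws*A) = A'*W*A and (Ws*A)'*(Ws*b) = A'*W*b, the two solves target the same minimizer, up to rounding.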
