
Thread Subject:
Least square inversion (normal equation vs Ax=b solution)

Subject: Least square inversion (normal equation vs Ax=b solution)

From: Michal Kolaj

Date: 15 Jul, 2012 03:25:21

Message: 1 of 4

Hello All,

Looking for some help regarding some linear algebra in MATLAB (linear least-squares inversion).

I have a problem of the form Ax = b + e, where A is full rank and sparse and e is some error. I also have a regularization matrix R and a standard-deviation error vector w.
I am trying to minimize:

(b - A*x)'*diag(1./w)*(b - A*x) + y*(R*x)'*(R*x)

(i.e., least squares with regularization and known error (weights), where y is the scaling factor for the regularization term)

This should be equivalent to solving the normal equations:

x = inv(A'*diag(1./w)*A + y*(R'*R))*A'*diag(1./w)*b

I read somewhere that forming the explicit inverse is not the best approach, so it would be better to reformulate the above as Ax = b and use backslash:

x = (A'*diag(1./w)*A + y*(R'*R)) \ (A'*diag(1./w)*b)

But then I read elsewhere that it is better to avoid the normal equations altogether, which would make the solution something like:

x = [A*diag(1./w); y*R] \ [b*diag(1./w); zeros]

However, the solution from the normal equations and the one above do not match. I would be glad if someone could point out my error.

Thanks in advance.

Subject: Least square inversion (normal equation vs Ax=b solution)

From: Bruno Luong

Date: 15 Jul, 2012 07:03:28

Message: 2 of 4

You forgot the square root.

When minimizing

f(x) := (A*x - b)'*W*(A*x - b) + || R*x ||^2
with W an SPD matrix (W = diag(1./w) in your case; your scale factor y is absorbed here by replacing R with sqrt(y)*R)

The solution is

x = (A'*W*A + R'*R) \ (A'*W*b)

% or, avoiding the normal equations:
sqrtW = sqrtm(W); % diag(1./sqrt(w)) in your case
x = [sqrtW*A;
     R] \ [sqrtW*b; zeros(size(R,1),1)]

% Bruno
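[A quick numerical check of the equivalence above, with made-up toy data; the sizes and values of A, b, w, R, y here are placeholders, not from the original problem:]

```matlab
% Check that the normal-equation solve and the stacked
% (square-root-weighted) solve give the same x.
rng(0);                          % reproducible toy data
m = 20; n = 5;
A = randn(m, n);                 % full-rank system matrix
b = randn(m, 1);
w = rand(m, 1) + 0.5;            % positive variances
R = eye(n);                      % simple Tikhonov regularizer
y = 0.1;                         % regularization scale factor

W  = diag(1./w);
Ry = sqrt(y)*R;                  % fold y into R

% Normal equations
x1 = (A'*W*A + Ry'*Ry) \ (A'*W*b);

% Stacked least squares with square-root weights
sqrtW = diag(1./sqrt(w));
x2 = [sqrtW*A; Ry] \ [sqrtW*b; zeros(size(Ry,1), 1)];

norm(x1 - x2)                    % should be near machine precision
```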

Subject: Least square inversion (normal equation vs Ax=b solution)

From: John D'Errico

Date: 15 Jul, 2012 10:07:07

Message: 3 of 4

"Michal Kolaj" wrote in message <jttd71$6fi$1@newscl01ah.mathworks.com>...
> [snip]

In addition to what Bruno has said, IF your matrix
is sparse, then do NOT use diag to create a diagonal
matrix.

Use spdiags instead.

John
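[For example, with a sparse A, the weight matrices can be built as sparse diagonals; this is a sketch assuming w is an m-by-1 vector of variances:]

```matlab
% Sparse diagonal weight matrix from an m-by-1 vector w.
% spdiags(v, 0, m, m) places v on the main diagonal of an
% m-by-m sparse matrix, so the off-diagonal zeros are never stored.
m = numel(w);
W     = spdiags(1./w, 0, m, m);        % sparse equivalent of diag(1./w)
sqrtW = spdiags(1./sqrt(w), 0, m, m);  % sparse square-root weights
```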

Subject: Least square inversion (normal equation vs Ax=b solution)

From: Michal Kolaj

Date: 15 Jul, 2012 16:01:13

Message: 4 of 4

"John D'Errico" <woodchips@rochester.rr.com> wrote in message <jtu4ob$oki$1@newscl01ah.mathworks.com>...
> "Michal Kolaj" wrote in message <jttd71$6fi$1@newscl01ah.mathworks.com>...
> > [snip]
>
> In addition to what Bruno has said, IF your matrix
> is sparse, then do NOT use diag to create a diagonal
> matrix.
>
> Use spdiags instead.
>
> John

Thanks Bruno and John! I knew it would be an embarrassing mistake!
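[For the archive, putting both suggestions together; a sketch assuming A is a sparse m-by-n matrix, w an m-by-1 variance vector, R a regularization matrix with n columns, and y a scalar:]

```matlab
% Weighted, regularized least squares without forming the
% normal equations, using a sparse diagonal weight matrix.
m     = size(A, 1);
sqrtW = spdiags(1./sqrt(w), 0, m, m);  % square-root weights (spdiags, not diag)
Ry    = sqrt(y)*R;                     % fold the scale factor y into R
x     = [sqrtW*A; Ry] \ [sqrtW*b; zeros(size(Ry,1), 1)];
```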
