Thread Subject:
linear regression, X\Y vs. Y\X

Subject: linear regression, X\Y vs. Y\X

From: KC

Date: 13 Mar, 2009 18:20:18

Message: 1 of 4

I can't figure out why X\Y is not the reciprocal of Y\X, given the following:

X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
X\Y
ans =
   10.8900
Y\X
ans =
    0.0733
1/(Y\X)
ans =
   13.6391
I am not sure I know why. I'd appreciate your explanations.
Thanks.

Subject: linear regression, X\Y vs. Y\X

From: Bruno Luong

Date: 13 Mar, 2009 18:42:02

Message: 2 of 4

"KC " <kctung75034@gmail.com> wrote in message <gpe852$lsc$1@fred.mathworks.com>...
> I can't figure out why X\Y is not the reciprocal of Y\X, given the following:
>
> X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
> X\Y
> ans =
> 10.8900

This is least squares on Y, i.e., a := X\Y minimizes norm(Y - a*X) over all real numbers a.

> Y\X
> ans =
> 0.0733

This is least squares on X, i.e., b := Y\X minimizes norm(X - b*Y) over all real numbers b.

Not the same thing.

a = X\Y;          % slope of the least-squares fit of Y on X
b = Y\X;          % slope of the least-squares fit of X on Y

% Least squares in Y: a gives the smaller residual
norm(Y - X*a)
norm(Y - X/b)     % larger (1/b is not the minimizer here)

% Least squares in X: b gives the smaller residual
norm(Y*b - X)
norm(Y/a - X)     % larger (1/a is not the minimizer here)

Bruno

Subject: linear regression, X\Y vs. Y\X

From: Roger Stafford

Date: 14 Mar, 2009 01:54:00

Message: 3 of 4

"KC " <kctung75034@gmail.com> wrote in message <gpe852$lsc$1@fred.mathworks.com>...
> I can't figure out why X\Y is not the reciprocal of Y\X, given the following:
>
> X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
> ........

  I think what is bothering you, KC, is why you don't get the same (homogeneous) regression line in the two methods. If the one answer were the reciprocal of the other, then the two regression lines would be the same line.

  It should interest you to learn that there is only one way these lines can be the same, and that is for all the given points to be collinear with some line through the origin; in other words, all the x and y coordinates must be proportional.
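
  For instance, with exactly proportional data the two results are exact reciprocals:

x = (1:5)';  y = 3*x;
x\y        % 3
1/(y\x)    % 3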

  The two kinds of regression are based on differing assumptions: in one it is assumed that all the errors lie in the x-coordinates, while the other assumes that all the errors are in the y-coordinates. It is therefore not surprising that they produce differing approximating lines.

  There is another method using eigenvectors which assumes that both x and y are subject to equal amounts of errors, and this produces yet a third line positioned between the other two. None of the three lines is the same except when the original data lies strictly along a line through the origin.

Roger Stafford

Subject: linear regression, X\Y vs. Y\X

From: shinchan

Date: 24 Jun, 2009 23:07:01

Message: 4 of 4

Hi Roger,
I would like to know how to solve this using eigenvectors, as you mentioned. Do you mean that the eigenvalue is the constant term that projects X to Y, i.e., the proportionality between X and Y? I am not sure how to conceptualize this as an eigenvector problem or how to put it into MATLAB code; my rough attempt is below. Please help.
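
Here is my rough attempt, assuming a line y = m*x through the origin as elsewhere in this thread, and taking the eigenvector of the 2-by-2 scatter matrix with the largest eigenvalue as the direction of the line (I am not sure this is the formulation you mean):

X = [1 2 4 5 7 9 11 13 14 16]';
Y = [101 105 109 112 117 116 122 123 129 130]';
M = [X Y]'*[X Y];           % 2-by-2 scatter matrix about the origin
[V,D] = eig(M);             % eigenvectors in the columns of V
[dmax,k] = max(diag(D));    % index of the largest eigenvalue
u = V(:,k);                 % direction of the fitted line
m = u(2)/u(1)               % slope of y = m*x
[X\Y, m, 1/(Y\X)]           % this slope falls between the other two
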
Thanks.


"Roger Stafford" <ellieandrogerxyzzy@mindspring.com.invalid> wrote in message <gpf2no$hp5$1@fred.mathworks.com>...
> "KC " <kctung75034@gmail.com> wrote in message <gpe852$lsc$1@fred.mathworks.com>...
> > I can't figure out why X\Y is not reciprocal of Y\X, given the following:
> >
> > X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
> > ........
>
> I think what is bothering you, KC, is why you don't get the same (homogeneous) regression line in the two methods. If the one answer were the reciprocal of the other, then the two regression lines would be the same line.
>
> It should interest you to learn that there is only way these lines can be the same, and that is for all the given points to be colinear with some line through the origin; in other words all the x and y coordinates must be proportional.
>
> The two kinds of regression are based on differing assumptions. In the one it is assumed that all the errors lie in the x-coordinates, while the other kind assumes that all the errors are in the y-coordinates. It is therefore not surprising that one will obtain differing approximating lines.
>
> There is another method using eigenvectors which assumes that both x and y are subject to equal amounts of errors, and this produces yet a third line positioned between the other two. None of the three lines is the same except when the original data lies strictly along a line through the origin.
>
> Roger Stafford
