http://www.mathworks.com/matlabcentral/newsreader/view_thread/246620
MATLAB Central Newsreader: linear regression, X\Y vs. Y\X
Feed for thread: linear regression, X\Y vs. Y\X
©1994-2014 by MathWorks, Inc.

Fri, 13 Mar 2009 18:20:18 +0000
linear regression, X\Y vs. Y\X
http://www.mathworks.com/matlabcentral/newsreader/view_thread/246620#634690
KC
I can't figure out why X\Y is not the reciprocal of Y\X, given the following:

X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
X\Y
ans =
10.8900
Y\X
ans =
0.0733
1/(Y\X)
ans =
13.6391

I am not sure I know why. I'd appreciate your explanations.
Thanks.

Fri, 13 Mar 2009 18:42:02 +0000
Re: linear regression, X\Y vs. Y\X
http://www.mathworks.com/matlabcentral/newsreader/view_thread/246620#634696
Bruno Luong
"KC " <kctung75034@gmail.com> wrote in message <gpe852$lsc$1@fred.mathworks.com>...
> I can't figure out why X\Y is not reciprocal of Y\X, given the following:
>
> X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
> X\Y
> ans =
> 10.8900

This is least-square on Y, i.e., a := X\Y minimizes norm(Y - a*X) (over all real numbers a).

> Y\X
> ans =
> 0.0733

This is least-square on X, i.e., b := Y\X minimizes norm(X - b*Y) (over all real numbers b).

Not the same thing.

% Least square on Y
norm(Y - X*a)
norm(Y - X/b) % larger

% Least square on X
norm(Y*b - X)
norm(Y/a - X) % larger

Bruno
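Bruno's check can be reproduced outside MATLAB as well. Below is a plain-Python sketch (my translation, not from the thread) using the closed-form slopes sum(x*y)/sum(x*x) and sum(x*y)/sum(y*y), which is what the backslash operator computes for a single-column X or Y:

```python
# Plain-Python sketch of Bruno's point: a = X\Y minimizes
# sum((y - a*x)^2), while b = Y\X minimizes sum((x - b*y)^2),
# so a and 1/b need not agree.
X = [1, 2, 4, 5, 7, 9, 11, 13, 14, 16]
Y = [101, 105, 109, 112, 117, 116, 122, 123, 129, 130]

sxx = sum(x * x for x in X)
syy = sum(y * y for y in Y)
sxy = sum(x * y for x, y in zip(X, Y))

a = sxy / sxx   # X\Y slope (MATLAB printed 10.8900)
b = sxy / syy   # Y\X slope (MATLAB printed 0.0733)

# Squared residuals for each objective, as functions of the slope:
res_y = lambda s: sum((y - s * x) ** 2 for x, y in zip(X, Y))
res_x = lambda s: sum((x - s * y) ** 2 for x, y in zip(X, Y))

print(round(a, 4), round(b, 4), round(1 / b, 4))
assert res_y(a) < res_y(1 / b)   # a beats 1/b on the Y-residual
assert res_x(b) < res_x(1 / a)   # b beats 1/a on the X-residual
```

Each slope is optimal only for its own objective, which is exactly why the two `norm(...)` comparisons in the MATLAB snippet come out "larger" the way Bruno indicates.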

Sat, 14 Mar 2009 01:54:00 +0000
Re: linear regression, X\Y vs. Y\X
http://www.mathworks.com/matlabcentral/newsreader/view_thread/246620#634770
Roger Stafford
"KC " <kctung75034@gmail.com> wrote in message <gpe852$lsc$1@fred.mathworks.com>...
> I can't figure out why X\Y is not reciprocal of Y\X, given the following:
>
> X = [1 2 4 5 7 9 11 13 14 16]'; Y = [101 105 109 112 117 116 122 123 129 130]';
> ........

I think what is bothering you, KC, is why you don't get the same (homogeneous) regression line from the two methods. If the one answer were the reciprocal of the other, the two regression lines would be the same line.

It should interest you to learn that there is only one way these lines can be the same, and that is for all the given points to be collinear with some line through the origin; in other words, all the x and y coordinates must be proportional.

The two kinds of regression are based on differing assumptions. In the one it is assumed that all the errors lie in the x-coordinates, while the other assumes that all the errors are in the y-coordinates. It is therefore not surprising that one obtains differing approximating lines.

There is another method, using eigenvectors, which assumes that x and y are subject to equal amounts of error, and this produces yet a third line positioned between the other two. None of the three lines is the same except when the original data lie strictly along a line through the origin.

Roger Stafford
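Roger's eigenvector method can be made concrete. Assuming he means orthogonal (total) least squares through the origin, the fitted line's direction is the principal eigenvector of the 2x2 scatter matrix [[sum(x^2), sum(x*y)], [sum(x*y), sum(y^2)]]; for a symmetric 2x2 matrix that eigenpair has a closed form, so no linear-algebra library is needed. A plain-Python sketch (variable names are mine, not from the thread):

```python
import math

X = [1, 2, 4, 5, 7, 9, 11, 13, 14, 16]
Y = [101, 105, 109, 112, 117, 116, 122, 123, 129, 130]

sxx = sum(x * x for x in X)
syy = sum(y * y for y in Y)
sxy = sum(x * y for x, y in zip(X, Y))

# Largest eigenvalue of the symmetric matrix [[sxx, sxy], [sxy, syy]]:
lam = (sxx + syy) / 2 + math.hypot((sxx - syy) / 2, sxy)
m = (lam - sxx) / sxy   # slope of the orthogonal-fit line y = m*x

a = sxy / sxx           # X\Y slope
inv_b = syy / sxy       # 1/(Y\X) slope
print(a, m, inv_b)
assert min(a, inv_b) < m < max(a, inv_b)   # third line lies between the other two
```

Note that this fit is scale-dependent: "equal amounts of error in x and y" only means something for a chosen set of units. Here the y-values are an order of magnitude larger than the x-values, so the orthogonal slope lands much closer to the 1/(Y\X) line than to the X\Y line.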

Wed, 24 Jun 2009 23:07:01 +0000
Re: linear regression, X\Y vs. Y\X
http://www.mathworks.com/matlabcentral/newsreader/view_thread/246620#660300
shinchan
Hi Roger,
I would like to know how to solve this using eigenvectors as you mentioned. Do you mean that the eigenvalue is the constant term that projects X to Y, i.e., the proportionality between X and Y? I am not sure how to conceptualize this as an eigenvector problem, or how to put it in MATLAB code. Please help.
Thanks.

"Roger Stafford" <ellieandrogerxyzzy@mindspring.com.invalid> wrote in message <gpf2no$hp5$1@fred.mathworks.com>...
> ........
> There is another method using eigenvectors which assumes that both x and y are subject to equal amounts of errors, and this produces yet a third line positioned between the other two.
> ........