ORTHLLS2D returns the Orthogonal Linear Least Squares estimate for parameters of line a x + b y + c = 0
function f = OrthLLS2D(x, y)
Inputs x and y must be real vectors of equal size.
Output f is the real vector [a b c] where a, b and c are the estimated parameters of the linear equation.
Since a more general function, LINORTFITN (File Exchange ID 16800), was already submitted in October 2007 by Mr. F. Carr, my file is meant as a brief introduction to the analytical problem in an extremely simple case.
Orthogonal least squares estimation on a plane, in the simple case of a linear equation, is in fact a problem that can be easily solved analytically with no approximation (see the PDF file for a detailed explanation). Notice that in the general multidimensional case an analytical solution may not exist (although Mr. Carr's function is an efficient approximation of the solution).
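As a rough sketch of the analytic idea (my own reconstruction, not necessarily the exact code inside OrthLLS2D), given data vectors x and y: the normal [a; b] of the best-fitting line is the eigenvector of the centred data's 2-by-2 scatter matrix associated with its smallest eigenvalue, and c is chosen so the line passes through the centroid.

```matlab
% Sketch of the analytic 2D orthogonal fit (assumed equivalent to OrthLLS2D,
% not taken from its source). Assumes x and y are real vectors of equal size.
xm = mean(x);  ym = mean(y);          % centroid of the data
S  = cov([x(:) - xm, y(:) - ym]);     % 2-by-2 scatter (covariance) matrix
[V, D] = eig(S);                      % eigen-decomposition
[~, k] = min(diag(D));                % index of the smallest eigenvalue
n  = V(:, k);                         % normal direction of the line
a = n(1);  b = n(2);
c = -(a * xm + b * ym);               % line passes through the centroid
f = [a b c];                          % same layout as OrthLLS2D's output
```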
% ====================================================
% EXAMPLE: HOW TO USE THE FUNCTION
% ====================================================
Build two series from the linear relation y = mx + q + error:
T = 1000; % number of points
m = -1; % slope
q = 1; % intercept
x = randn(T, 1); % random x values
u = randn(T, 1); % random error
y = m * x + q + u; % y = mx + q + error
f = OrthLLS2D(x, y); % estimate [a, b, c] for equation ax + by + c = 0
plot(x, y, '.') % scatter plot for empirical points
hold on
plot(x, [ones(T,1) x] * [-f(3); -f(1)], '*r'); % plot orthogonal linear least squares
b1 = regress(y, [ones(T,1) x]); % parameters of the ordinary least squares (y as a function of x)
b2 = regress(x, [ones(T,1) y]); % parameters of the ordinary least squares (x as a function of y)
plot(x, [ones(T,1) x] * b1, '.g'); % plot ordinary least squares (y as a function of x)
plot([ones(T,1) y] * b2, y, '.c'); % plot ordinary least squares (x as a function of y)
% ====================================================
See also LINORTFIT2, LINORTFITN.
Version 1.1 | minor edits
Inspired by: Orthogonal Linear Regression
Inspired: Weighted Orthogonal Least Squares
Liber Eleutherios
@Tharindu,
the command
plot(x, [ones(T,1) x] * [-f(3); -f(1)], 'r')
plots the line by using the values in x.
You could just as well do the opposite and plot the line using the values in y: it would still be the same line.
The function does not estimate [Xhat, Yhat] but, yes, I agree that an acceptable way to estimate them would be to project [x, y] orthogonally onto the line.
If all the elements of x share a single value (a vertical line), the function will correctly return infinity for the first and third parameters and 1 for the second.
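For what it's worth, the orthogonal projection mentioned above has a simple closed form; a minimal sketch, where Xhat and Yhat are just illustrative names and f is the output of OrthLLS2D:

```matlab
% Orthogonal projection of the points (x, y) onto the line a x + b y + c = 0.
a = f(1);  b = f(2);  c = f(3);
d = (a * x + b * y + c) / (a^2 + b^2); % per-point offset along the normal
Xhat = x - a * d;                      % foot of the perpendicular, x-coordinate
Yhat = y - b * d;                      % foot of the perpendicular, y-coordinate
```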
Then I reckon an acceptable way to plot the line would just be:
plot(x(1:2), [min(y), max(y)])
Note that this is beyond the scope of the function though.
Tharindu Weerakoon
Hi Liber,
What will happen to the output of this function if the gradient of the computed line is _infinity_ ?
For example:
x=[6.2 6.0 6.2 6.0 6.2]
y=[12 10 8 6 4 ]
This should give a vertical line limited to the range of the y values.
But it gives a vertical line of infinite length.
So how can I solve this issue?
Thanks
Tharindu Weerakoon
Hi Liber,
I have a question about the outputs of the function.
You have plotted the line using
plot(x, [ones(T,1) x] * [-f(3); -f(1)], 'r')
There x is the independent variable, isn't it?
If I want to compute the estimated values for both x and y, as [Xhat, Yhat], how can I do that?
How can I calculate the most accurate data set?
Are [Xhat, Yhat] the orthogonal projections of the original points onto the line drawn using orthogonal LR?
Thanks
Liber Eleutherios
Hi Royi, your remark is certainly correct. See LINORTFITN (FE16800 by F. Carr) for the general case.
My function is only meant to find the analytic solution to the 2D problem.
Royi Avital
Hi,
I think it could be done much faster and easier using the SVD.
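For context, the SVD approach might look like the following sketch (not code from any File Exchange submission): the right singular vector associated with the smallest singular value of the centred data matrix is the line's normal.

```matlab
% SVD-based orthogonal fit (a sketch of the suggested approach). Assumes
% x and y are real vectors of equal size.
M = [x(:) - mean(x), y(:) - mean(y)]; % centred T-by-2 data matrix
[~, ~, V] = svd(M, 0);                % economy-size SVD
n = V(:, end);                        % normal = last right singular vector
a = n(1);  b = n(2);
c = -(a * mean(x) + b * mean(y));     % line through the centroid
f = [a b c];
```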