
Thread Subject:
robust weighted least squares

Subject: robust weighted least squares

From: Mike

Date: 19 Nov, 2010 21:11:04

Message: 1 of 4


I want to do weighted linear least squares, but it seems the stat toolbox only allows weights for non-linear regression. Is there a way to do this besides increasing the number of points and just having repeats that approximate the desired weights?

I actually want to do weighted robust linear least squares - is this asking too much?


thanks

Subject: robust weighted least squares

From: Miroslav Balda

Date: 20 Nov, 2010 23:13:03

Message: 2 of 4

"Mike " <mike.curran@rbccm.com> wrote in message <ic6p58$bnb$1@fred.mathworks.com>...
>
> I want to do weighted linear least squares, but it seems the stat toolbox only allows weights for non-linear regression. Is there a way to do this besides increasing the number of points and just have repeats that approximate the desired weights?
>
> I actually want to do weighted robust linear least squares - is this asking too much?
>
>
> thanks

Hi Mike,

The problem of linear regression is rather simple. It is based on the solution of an overdetermined system of linear equations
     A*c = b,
where
A is a matrix whose columns contain known function values, say f_k(x),
b is a vector of measured values of the function of x to be approximated by the product A*c, and
c is a vector of coefficients of the linear combination of the functions f_k(x).
The solution in the least squares sense is
     c = A\b;
Values of the regression function are A*c.
For example, polynomial regression may be solved by setting the matrix
     A = [ones(m,1), x, x.^2, x.^3, ... , x.^n ],
where m is the length of the vector b, and n is the polynomial degree.
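As a concrete illustration (the data here are made up, and exact, so the coefficients are recovered):

```matlab
% Fit y ~ c1 + c2*x + c3*x^2 by ordinary least squares with backslash
x = (0:0.5:5)';
b = 1 + 2*x - 0.5*x.^2;            % exact values of the target function
A = [ones(length(x),1), x, x.^2];  % polynomial design matrix
c = A\b;                           % c recovers [1; 2; -0.5]
```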

The weighted problem is obtained by multiplying both the left and right hand sides by the same diagonal matrix of chosen weights W:
     W*A*c = W*b;
The solution is formally just the same:
      c = (W*A)\(W*b);
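For instance (weights chosen arbitrarily here for illustration; note that if the w_i are statistical weights minimizing sum(w_i*r_i^2), the rows should be scaled by sqrt(w_i)):

```matlab
% Weighted least squares: emphasize the first half of the data
x = (1:10)';
b = 0.2 + 3*x;                    % exact line, so weighting leaves c unchanged
A = [ones(10,1), x];
w = [2*ones(5,1); ones(5,1)];     % chosen weights
W = diag(sqrt(w));                % sqrt so that sum(w.*r.^2) is minimized
c = (W*A)\(W*b);                  % c recovers [0.2; 3]
```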

You do not need any toolbox to solve a linear regression problem.
Good luck.

Mira

Subject: robust weighted least squares

From: Mike

Date: 22 Nov, 2010 14:42:05

Message: 3 of 4


OK, now I remember how to do weighting, but robustfit adds the constant term itself rather than taking it as a column of the X matrix, so can I still accommodate weights in a consistent way?

thanks


"Miroslav Balda" <miroslav.nospam@balda.cz> wrote in message <ic9klv$aec$1@fred.mathworks.com>...
> "Mike " <mike.curran@rbccm.com> wrote in message <ic6p58$bnb$1@fred.mathworks.com>...
> >
> > I want to do weighted linear least squares, but it seems the stat toolbox only allows weights for non-linear regression. Is there a way to do this besides increasing the number of points and just have repeats that approximate the desired weights?
> >
> > I actually want to do weighted robust linear least squares - is this asking too much?
> >
> >
> > thanks
>
> Hi Mike,
>
> The problem of linear regression is rather simple. It is based on the solution of a system of overdetermined system of linear equations
> A*c = b,
> where
> A is a matrix containing columns of known function values, say f_k(x),
> b is a vector of values of measured function of x to be approximated by a product A*c, and
> c is a vector of coefficients linear combination of the functions f_k(x).
> The solution in the least squares sense is
> c = A\b;
> Values of the regression function are A*c.
> For example, polynomial regression may be solved by setting matrix
> A = [ones(m,1), x, x.^2, x.^3, ... , x.^n ],
> where m is the length of the vector b, and n is a polynom degree.
>
> The weighted problem is generated by weighing left an right hand sides by the same diagonal matrix of chosen weights W:
> W*A*c = W*b;
> The solution is formally just the same:
> c = (W*A)\(W*b);
>
> You do not need any toolboxes for solution of linear regression.
> Good luck.
>
> Mira

Subject: robust weighted least squares

From: Mike

Date: 22 Nov, 2010 16:48:04

Message: 4 of 4


I think the answer is to turn off the constant term in robustfit, put a column of ones in the X matrix myself, and then apply the weights to X and y... but thanks
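A minimal sketch of that approach (assuming the Statistics Toolbox robustfit; the data and weights below are made up for illustration):

```matlab
% Synthetic data: y = 2 + 3*x with noise and one outlier
x = (1:20)';
y = 2 + 3*x + 0.1*randn(20,1);
y(5) = 60;                        % outlier for the robust fit to resist
w = [4*ones(10,1); ones(10,1)];   % chosen weights

% Weight the design matrix (including the ones column) and the response,
% then tell robustfit not to add its own constant term.
W = diag(sqrt(w));                % sqrt(w) so that sum(w.*r.^2) is minimized
A = [ones(20,1), x];              % constant column supplied explicitly
c = robustfit(W*A, W*y, 'bisquare', 4.685, 'off');
% c(1) ~ intercept, c(2) ~ slope
```

The 'off' flag in the fifth argument suppresses robustfit's automatically added constant, which would otherwise not be weighted consistently with the rest of the columns.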

"Mike " <mike.curran@rbccm.com> wrote in message <icdvft$1vd$1@fred.mathworks.com>...
>
> Ok, now I remember how to do weighting, but there is no constant in the robustfit argument list, so can I still accomodate weights in a consistent way?
>
> thanks
>
>
> "Miroslav Balda" <miroslav.nospam@balda.cz> wrote in message <ic9klv$aec$1@fred.mathworks.com>...
> > "Mike " <mike.curran@rbccm.com> wrote in message <ic6p58$bnb$1@fred.mathworks.com>...
> > >
> > > I want to do weighted linear least squares, but it seems the stat toolbox only allows weights for non-linear regression. Is there a way to do this besides increasing the number of points and just have repeats that approximate the desired weights?
> > >
> > > I actually want to do weighted robust linear least squares - is this asking too much?
> > >
> > >
> > > thanks
> >
> > Hi Mike,
> >
> > The problem of linear regression is rather simple. It is based on the solution of a system of overdetermined system of linear equations
> > A*c = b,
> > where
> > A is a matrix containing columns of known function values, say f_k(x),
> > b is a vector of values of measured function of x to be approximated by a product A*c, and
> > c is a vector of coefficients linear combination of the functions f_k(x).
> > The solution in the least squares sense is
> > c = A\b;
> > Values of the regression function are A*c.
> > For example, polynomial regression may be solved by setting matrix
> > A = [ones(m,1), x, x.^2, x.^3, ... , x.^n ],
> > where m is the length of the vector b, and n is a polynom degree.
> >
> > The weighted problem is generated by weighing left an right hand sides by the same diagonal matrix of chosen weights W:
> > W*A*c = W*b;
> > The solution is formally just the same:
> > c = (W*A)\(W*b);
> >
> > You do not need any toolboxes for solution of linear regression.
> > Good luck.
> >
> > Mira
