I think the answer is to turn off the constant in the robustfit call, put a column of ones in the X matrix, and then apply the weights to both X and y. ...but thanks
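That trick can be sketched outside MATLAB too. Below is a NumPy illustration of the same idea: build the constant column yourself, pre-weight the rows of X and y, and then run a plain (unweighted) fit. An ordinary least-squares solve stands in for robustfit here, and the data and weights are made up for illustration:

```python
import numpy as np

# Made-up sample: a noisy line y = 0.5 + 2*x (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 0.5 + 2.0 * x + 0.05 * rng.standard_normal(x.size)

X = np.column_stack([np.ones_like(x), x])   # explicit column of ones
w = np.linspace(1.0, 2.0, x.size)           # chosen weights (assumption)

# Pre-weight the rows: minimizing sum(w_i * r_i^2) is the same as an
# unweighted fit on rows scaled by sqrt(w_i).
sw = np.sqrt(w)
c, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)

# Sanity check against the weighted normal equations
# X' * diag(w) * X * c = X' * diag(w) * y
c_direct = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
```

The pre-weighted rows can then be handed to a robust fitter in the same way, which is what the robustfit trick above does.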
"Mike " <mike.curran@rbccm.com> wrote in message <icdvft$1vd$1@fred.mathworks.com>...
>
> Ok, now I remember how to do weighting, but there is no constant in the robustfit argument list, so can I still accommodate weights in a consistent way?
>
> thanks
>
>
> "Miroslav Balda" <miroslav.nospam@balda.cz> wrote in message <ic9klv$aec$1@fred.mathworks.com>...
> > "Mike " <mike.curran@rbccm.com> wrote in message <ic6p58$bnb$1@fred.mathworks.com>...
> > >
> > > I want to do weighted linear least squares, but it seems the stat toolbox only allows weights for nonlinear regression. Is there a way to do this besides increasing the number of points and just have repeats that approximate the desired weights?
> > >
> > > I actually want to do weighted robust linear least squares -- is this asking too much?
> > >
> > >
> > > thanks
> >
> > Hi Mike,
> >
> > The problem of linear regression is rather simple. It is based on the solution of an overdetermined system of linear equations
> > A*c = b,
> > where
> > A is a matrix whose columns contain known function values, say f_k(x),
> > b is a vector of measured values of the function of x, to be approximated by the product A*c, and
> > c is a vector of coefficients of the linear combination of the functions f_k(x).
> > The solution in the least squares sense is
> > c = A\b;
> > Values of the regression function are A*c.
> > For example, polynomial regression may be solved by setting matrix
> > A = [ones(m,1), x, x.^2, x.^3, ... , x.^n ],
> > where m is the length of the vector b, and n is the polynomial degree.
> >
> > The weighted problem is obtained by multiplying the left and right-hand sides by the same diagonal matrix of chosen weights W:
> > W*A*c = W*b;
> > The solution is formally just the same:
> > c = (W*A)\(W*b);
> >
> > You do not need any toolboxes to solve linear regression.
> > Good luck.
> >
> > Mira
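Mira's recipe -- the polynomial design matrix A = [ones, x, x.^2, ...] together with the weighted solve c = (W*A)\(W*b) -- can be written in a few lines of NumPy as well. The data, polynomial degree, and weights below are made up for illustration:

```python
import numpy as np

# Made-up sample: values measured at points x, exactly quadratic here,
# so the fit recovers the coefficients exactly (illustrative only)
x = np.linspace(0.0, 1.0, 9)
b = 1.0 + 2.0 * x + 3.0 * x**2

# Design matrix for a degree-2 polynomial: A = [ones, x, x.^2]
A = np.column_stack([np.ones_like(x), x, x**2])

# Diagonal matrix of chosen weights W (here: trust later points more)
W = np.diag(np.linspace(1.0, 3.0, x.size))

# Weighted least squares, the analogue of  c = (W*A)\(W*b)  in MATLAB
c, *_ = np.linalg.lstsq(W @ A, W @ b, rcond=None)
# c is approximately [1, 2, 3], the coefficients used to build b
```

Values of the regression function are then A @ c, just as in the MATLAB version.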
