
Thread Subject:
least-squares fit with constraint

Subject: least-squares fit with constraint

From: gkk gkk

Date: 2 Dec, 2010 01:28:05

Message: 1 of 9

Hi, is there a way in MATLAB to perform a least-squares fit on a column vector, with the constraint that the slope is zero?

I have a column vector of periods extracted from a periodic waveform, and I'm trying to find the average period, where "average period" is defined as the period having the best least-squares fit.

If I run p=polyfit(x,y,1), it will return a line having a slope (albeit a small one). Is there a way to do something similar but constrain the least-squares fit so that the slope must be zero?

Thanks in advance

Subject: least-squares fit with constraint

From: Richard Startz

Date: 2 Dec, 2010 01:44:54

Message: 2 of 9

On Thu, 2 Dec 2010 01:28:05 +0000 (UTC), "gkk gkk"
<gkkmath@comcast.net> wrote:


If the slope is zero, then that's the same as leaving the RHS variable
out of the regression. The intercept is just the mean of the dependent
variable.
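
In MATLAB terms, the constrained fit corresponds to a design matrix that is just a column of ones. A minimal sketch (the data here is illustrative, not from the thread):

```matlab
% With the slope constrained to zero, the model is y = a, so the
% least-squares problem is ones(n,1)*a = y.
y = [2.1; 1.9; 2.0; 2.2; 1.8];   % illustrative data
a = ones(size(y)) \ y;           % least-squares intercept
% a agrees with mean(y) to within floating-point rounding
```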

Subject: least-squares fit with constraint

From: gkk gkk

Date: 2 Dec, 2010 01:59:05

Message: 3 of 9

Richard Startz <richardstartz@comcast.net> wrote in message <6fudf6d8e8njopfrs3gv76vq97ant76kms@4ax.com>...
> If the slope is zero, then that's the same as leaving the RHS variable
> out of the regression. The intercept is just the mean of the dependent
> variable.

Thanks Richard, I'm not sure I understand. Are you saying if I enter:

p=polyfit(x,y,1)

and it returns:

p(1) = 1.323e-12 (i.e. slope)
p(2) = 9.582e-8 (i.e. intercept)

then p(2) is equivalent to performing a least-squares fit on (x,y) with the constraint that the slope must be zero? That seems counter-intuitive, since the fit is made by minimizing the difference between the line and the data over the whole curve (e.g., I wouldn't necessarily expect the intercept from a fitted line with a non-zero slope to be the best fit for a zero-slope line).

Subject: least-squares fit with constraint

From: gkk gkk

Date: 2 Dec, 2010 02:34:05

Message: 4 of 9


Some more info:

If my period array is "per", I can use the corrcoef() function to compare the fits of:
1. the line obtained from p=polyfit(x,y,1), using just the intercept p(2), and
2. simply computing mean(per).

The polyfit intercept shows a higher correlation, so the least-squares routine appears to be doing something better than simply taking the mean of the array.
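
For what it's worth, a more direct way to compare the two constants is by their residual sum of squares (a sketch with stand-in data; "per" is assumed to be the posted column vector of periods):

```matlab
per = 2 + 1e-3*randn(100, 1);           % stand-in data
x   = (1:numel(per))';
p   = polyfit(x, per, 1);               % unconstrained line
sse_p2   = sum((per - p(2)).^2);        % intercept used as a constant
sse_mean = sum((per - mean(per)).^2);   % the mean as a constant
% sse_mean <= sse_p2 for any data: among all constants, the mean
% minimizes the sum of squared residuals.
```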

Subject: least-squares fit with constraint

From: John D'Errico

Date: 2 Dec, 2010 11:19:05

Message: 5 of 9

"gkk gkk" <gkkmath@comcast.net> wrote in message <id6uh9$3hn$1@fred.mathworks.com>...

No. He said that the least squares fit to a process
where the slope is constrained to be zero is found
by taking the mean of the process.

Thus, if you wish to find the model that best
approximates the function

   y = a + b*x

where b is constrained to be zero, then you really
have the model

   y = a

since b was ZERO!!!!!!!

What is the value of a in a least squares sense? This
is simple.

  a = mean(y);

John
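
Why the mean is the least-squares constant, in one line (added for completeness, not part of the original post): setting the derivative of the sum of squared residuals to zero gives

   d/da sum( (y - a).^2 ) = -2 * sum( y - a ) = 0

so

   a = sum(y)/n = mean(y)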

Subject: least-squares fit with constraint

From: Richard Startz

Date: 2 Dec, 2010 15:06:16

Message: 6 of 9

On Thu, 2 Dec 2010 11:19:05 +0000 (UTC), "John D'Errico"
<woodchips@rochester.rr.com> wrote:


John said it rather better than I did. :)

Subject: least-squares fit with constraint

From: gkk gkk

Date: 2 Dec, 2010 16:19:05

Message: 7 of 9


Thanks so much, Richard and John. So I guess the improvement in correlation for p(2) over mean() that I observed must be due to machine rounding or other numerical artifacts, since there's no technical reason it should be better, right? The difference was only in the last 2 or 3 digits.

Subject: least-squares fit with constraint

From: John D'Errico

Date: 2 Dec, 2010 16:43:07

Message: 8 of 9

"gkk gkk" <gkkmath@comcast.net> wrote in message <id8gtp$inv$1@fred.mathworks.com>...

A difference in the last couple of significant digits?

Don't waste your time worrying about differences in the
least significant bits when you work in floating point
arithmetic.

The mean IS the least squares solution.

John

Subject: least-squares fit with constraint

From: gkk gkk

Date: 2 Dec, 2010 17:01:05

Message: 9 of 9

"John D'Errico" <woodchips@rochester.rr.com> wrote in message <id8iar$mcf$1@fred.mathworks.com>...

THANKS JOHN!
