An object comprising training data, model description, diagnostic information, and fitted coefficients for a linear regression. Predict model responses with the `predict` or `feval` methods.

`mdl = fitlm(tbl)` or `mdl = fitlm(X,y)` create a linear model of a table or dataset array `tbl`, or of the responses `y` to a data matrix `X`. For details, see `fitlm`.

`mdl = stepwiselm(tbl)` or `mdl = stepwiselm(X,y)` create a linear model of a table or dataset array `tbl`, or of the responses `y` to a data matrix `X`, with unimportant predictors excluded. For details, see `stepwiselm`.
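A minimal sketch of both calling forms, assuming the `carsmall` example data set that ships with Statistics and Machine Learning Toolbox:

```matlab
% Fit a linear model from a table; the last variable (MPG) is the response.
load carsmall
tbl = table(Weight, Acceleration, MPG);
mdl = fitlm(tbl);

% Equivalent fit from a data matrix X and response vector y.
X = [Weight, Acceleration];
y = MPG;
mdl2 = fitlm(X, y);

% Stepwise variant: adds and removes terms, keeping only useful predictors.
mdl3 = stepwiselm(tbl);
```

Each call returns a `LinearModel` object whose properties (coefficients, residuals, diagnostics) are described in the sections below.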

| Method | Description |
| --- | --- |
| addTerms | Add terms to linear regression model |
| compact | Compact linear regression model |
| dwtest | Durbin-Watson test of linear model |
| fit | Create linear regression model |
| plot | Scatter plot or added variable plot of linear model |
| plotAdded | Added variable plot or leverage plot for linear model |
| plotAdjustedResponse | Adjusted response plot for linear regression model |
| plotDiagnostics | Plot diagnostics of linear regression model |
| plotResiduals | Plot residuals of linear regression model |
| removeTerms | Remove terms from linear model |
| step | Improve linear regression model by adding or removing terms |
| stepwise | Create linear regression model by stepwise regression |

| Method | Description |
| --- | --- |
| anova | Analysis of variance for linear model |
| coefCI | Confidence intervals of coefficient estimates of linear model |
| coefTest | Linear hypothesis test on linear regression model coefficients |
| disp | Display linear regression model |
| feval | Evaluate linear regression model prediction |
| plotEffects | Plot main effects of each predictor in linear regression model |
| plotInteraction | Plot interaction effects of two predictors in linear regression model |
| plotSlice | Plot of slices through fitted linear regression surface |
| predict | Predict response of linear regression model |
| random | Simulate responses for linear regression model |

Value. To learn how value classes affect copy operations, see Copying Objects in the MATLAB® documentation.

The *hat matrix* *H* is
defined in terms of the data matrix *X*:

$$H=X{\left({X}^{T}X\right)}^{-1}{X}^{T}.$$

The diagonal elements *h _{ii}* satisfy

$$\begin{array}{l}0\le {h}_{ii}\le 1\\ {\displaystyle \sum _{i=1}^{n}{h}_{ii}}=p,\end{array}$$

where *n* is the number of observations (rows
of *X*), and *p* is the number of
coefficients in the regression model.

The *leverage* of observation *i* is
the value of the *i*th diagonal term, *h*_{ii},
of the hat matrix *H*. Because the sum of the leverage
values is *p* (the number of coefficients in the
regression model), an observation *i* can be considered
to be an outlier if its leverage substantially exceeds *p*/*n*,
where *n* is the number of observations.
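The identities above can be checked numerically. A sketch assuming the `carsmall` example data set; the 2*p*/*n* cutoff used here is a common rule of thumb for "substantially exceeds," not a value fixed by the definition:

```matlab
% Build a design matrix with an intercept column and form the hat matrix.
load carsmall
X = [ones(size(Weight)) Weight Acceleration];

H = X / (X' * X) * X';      % H = X (X'X)^{-1} X'
h = diag(H);                % leverage values h_ii

n = size(X, 1);             % number of observations
p = size(X, 2);             % number of coefficients

% sum(h) equals p up to rounding, matching the identity above.
% Flag observations whose leverage substantially exceeds p/n.
highLeverage = find(h > 2 * p / n);
```

In practice the same values are available as `mdl.Diagnostics.Leverage` on a fitted `LinearModel`, so the hat matrix never needs to be formed explicitly.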

Cook's distance is the scaled change in fitted values. Each element in `CooksDistance` is the normalized change in the vector of coefficients due to the deletion of an observation. The Cook's distance, *D*_{i}, of observation *i* is

$${D}_{i}=\frac{{\displaystyle \sum _{j=1}^{n}{\left({\widehat{y}}_{j}-{\widehat{y}}_{j(i)}\right)}^{2}}}{p\text{\hspace{0.17em}}MSE},$$

where

- $${\widehat{y}}_{j}$$ is the *j*th fitted response value.
- $${\widehat{y}}_{j(i)}$$ is the *j*th fitted response value, where the fit does not include observation *i*.
- *MSE* is the mean squared error.
- *p* is the number of coefficients in the regression model.

Cook's distance is algebraically equivalent to the following expression:

$${D}_{i}=\frac{{r}_{i}^{2}}{p\text{\hspace{0.17em}}MSE}\left(\frac{{h}_{ii}}{{\left(1-{h}_{ii}\right)}^{2}}\right),$$

where *r*_{i} is
the *i*th residual, and *h*_{ii} is
the *i*th leverage value.

`CooksDistance` is an *n*-by-1 column vector in the `Diagnostics` table of the `LinearModel` object.
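The algebraic form above can be checked directly against the `Diagnostics` table. A sketch assuming the `carsmall` example data set:

```matlab
% Recompute Cook's distance from residuals and leverage values.
load carsmall
mdl = fitlm([Weight, Acceleration], MPG);

r   = mdl.Residuals.Raw;          % i-th raw residual r_i
h   = mdl.Diagnostics.Leverage;   % i-th leverage value h_ii
p   = mdl.NumCoefficients;        % coefficients, including the intercept
MSE = mdl.MSE;                    % mean squared error

D = (r.^2 ./ (p * MSE)) .* (h ./ (1 - h).^2);
% D agrees with mdl.Diagnostics.CooksDistance for rows without missing data.
```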

The main fitting algorithm is QR decomposition. For robust fitting, the algorithm is `robustfit`.

To remove redundant predictors in linear regression using lasso or elastic net, use the `lasso` function.

To regularize a regression with correlated terms using ridge regression, use the `ridge` or `lasso` functions.

To regularize a regression with correlated terms using partial least squares, use the `plsregress` function.
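A sketch of the lasso alternative mentioned above, assuming the `carsmall` example data set; `Index1SE` selects the sparsest fit within one standard error of the cross-validation minimum:

```matlab
% Cross-validated lasso to drop redundant predictors.
load carsmall
X = [Weight, Acceleration, Displacement];
y = MPG;

[B, FitInfo] = lasso(X, y, 'CV', 10);
idx  = FitInfo.Index1SE;          % sparsest model within 1 SE of the minimum
kept = find(B(:, idx) ~= 0);      % predictors retained at that lambda
```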
