
Heteroscedasticity and autocorrelation consistent covariance estimators

`EstCov = hac(X,y)`

`EstCov = hac(Tbl)`

`EstCov = hac(Mdl)`

`EstCov = hac(___,Name,Value)`

`[EstCov,se,coeff] = hac(___)`

`EstCov = hac(X,y)` returns robust covariance estimates, `EstCov`, for ordinary least squares (OLS) coefficient estimates of the multiple linear regression model *y* = *X**β* + *ε* under general forms of heteroscedasticity and autocorrelation in the innovations process *ε*.

`NaN`s in the data indicate missing values, which `hac` removes using list-wise deletion. `hac` sets `Data = [X y]`, then removes any row in `Data` containing at least one `NaN`. This reduces the effective sample size and changes the time base of the series.
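As a minimal sketch of the `hac(X,y)` syntax, the following simulates a regression with AR(1) innovations and requests HAC-robust covariance, standard errors, and OLS coefficients (assumes Econometrics Toolbox is installed; the data and coefficients are illustrative, not from the documentation):

```matlab
% Hedged sketch: HAC covariance for OLS estimates when the innovations
% follow an AR(1) process.
rng(1);                               % reproducibility
T = 100;
X = randn(T,2);                       % two predictors; hac adds an intercept by default
e = filter(1,[1 -0.6],randn(T,1));    % AR(1) innovations induce autocorrelation
y = 1 + X*[2; -1] + e;
[EstCov,se,coeff] = hac(X,y);         % 3-by-3 covariance, standard errors, coefficients
```

With the default intercept, `EstCov` is 3-by-3 and `se` and `coeff` are 3-by-1, ordered intercept first.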

`EstCov = hac(Tbl)` returns robust covariance estimates for OLS coefficient estimates of multiple linear regression models, with predictor data, `X`, in the first `numPreds` columns of the tabular array, `Tbl`, and response data, `y`, in the last column.

`hac` removes all missing values in `Tbl`, indicated by `NaN`s, using list-wise deletion. In other words, `hac` removes all rows in `Tbl` containing at least one `NaN`. This reduces the effective sample size and changes the time base of the series.
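A minimal sketch of the table syntax, with hypothetical variable names `x1`, `x2`, and `y`: the predictors occupy the leading columns and the response is the last variable, as the description above requires.

```matlab
% Hedged sketch: hac with tabular input. Predictors come first; the
% response is the last column.
rng(1);
x1 = randn(50,1);
x2 = randn(50,1);
y = 1 + 2*x1 - x2 + randn(50,1);
Tbl = table(x1,x2,y);    % [x1 x2] are predictors, y is the response
EstCov = hac(Tbl);       % rows containing any NaN would be removed first
```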

`EstCov = hac(___,Name,Value)` uses any of the input arguments in the previous syntaxes and additional options that you specify by one or more `Name,Value` pair arguments.

For example, use `Name,Value`

pair arguments
to choose weights for HAC or HC estimators, set a bandwidth for a
HAC estimator, or prewhiten the residuals.
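For instance, the options above can be combined in one call. This sketch requests a quadratic spectral HAC estimator with prewhitened residuals, in the spirit of the recommendation from [2] discussed below (simulated data; illustrative only):

```matlab
% Hedged sketch: quadratic spectral ('QS') HAC weights with prewhitening.
rng(1);
T = 100;
X = randn(T,2);
e = filter(1,[1 -0.7],randn(T,1));   % autocorrelated innovations
y = 1 + X*[2; -1] + e;
EstCov = hac(X,y,'type','HAC','weights','QS','prewhiten',true);
```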

[2] recommends prewhitening for HAC estimators
to reduce bias. The procedure tends to increase estimator variance
and mean-squared error, but can improve confidence interval coverage
probabilities and reduce the over-rejection of *t* statistics.

The original White HC estimator, specified by `'type','HC','weights','HC0'`, is justified asymptotically. The other `weights` values, `HC1`, `HC2`, `HC3`, and `HC4`, are meant to improve small-sample performance. [6] and [3] recommend using `HC3` and `HC4`, respectively, in the presence of influential observations.

HAC estimators formed using the truncated kernel might not be positive semidefinite in finite samples. [10] proposes using the Bartlett kernel as a remedy, but the resulting estimator is suboptimal in terms of its rate of consistency. The quadratic spectral kernel achieves an optimal rate of consistency.

The default estimation method for HAC bandwidth selection is `AR1MLE`. It is generally more accurate, but slower, than the AR(1) alternative, `AR1OLS`. If you specify `'bandwidth','ARMA11'`, then `hac` estimates the model using maximum likelihood.

Bandwidth selection models might exhibit sensitivity to the relative scale of the predictors in `X`.

[1] Andrews, D. W. K. “Heteroskedasticity
and Autocorrelation Consistent Covariance Matrix Estimation.” *Econometrica*.
Vol. 59, 1991, pp. 817–858.

[2] Andrews, D. W. K., and J. C. Monahan. “An
Improved Heteroskedasticity and Autocorrelation Consistent Covariance
Matrix Estimator.” *Econometrica*. Vol.
60, 1992, pp. 953–966.

[3] Cribari-Neto, F. "Asymptotic Inference
Under Heteroskedasticity of Unknown Form." *Computational
Statistics & Data Analysis*. Vol. 45, 2004, pp. 215–233.

[4] den Haan, W. J., and A. Levin. "A Practitioner's
Guide to Robust Covariance Matrix Estimation." In *Handbook
of Statistics*. Edited by G. S. Maddala and C. R. Rao.
Amsterdam: Elsevier, 1997.

[5] Frank, A., and A. Asuncion. UCI Machine Learning Repository. Irvine, CA: University of California, School of Information and Computer Science. https://archive.ics.uci.edu/ml, 2012.

[6] Gallant, A. R. *Nonlinear Statistical
Models*. Hoboken, NJ: John Wiley & Sons, Inc., 1987.

[7] Kutner, M. H., C. J. Nachtsheim, J. Neter,
and W. Li. *Applied Linear Statistical Models*.
5th ed. New York: McGraw-Hill/Irwin, 2005.

[8] Long, J. S., and L. H. Ervin. "Using Heteroscedasticity-Consistent
Standard Errors in the Linear Regression Model." *The American
Statistician*. Vol. 54, 2000, pp. 217–224.

[9] MacKinnon, J. G., and H. White. "Some Heteroskedasticity-Consistent
Covariance Matrix Estimators with Improved Finite Sample Properties." *Journal
of Econometrics*. Vol. 29, 1985, pp. 305–325.

[10] Newey, W. K., and K. D. West. "A Simple,
Positive-Definite, Heteroskedasticity and Autocorrelation Consistent
Covariance Matrix." *Econometrica*. Vol. 55,
1987, pp. 703–708.

[11] Newey, W. K, and K. D. West. “Automatic
Lag Selection in Covariance Matrix Estimation.” *The
Review of Economic Studies*. Vol. 61 No. 4, 1994, pp. 631–653.

[12] White, H. "A Heteroskedasticity-Consistent
Covariance Matrix and a Direct Test for Heteroskedasticity." *Econometrica*.
Vol. 48, 1980, pp. 817–838.

[13] White, H. *Asymptotic Theory for
Econometricians*. New York: Academic Press, 1984.