Many observed time series exhibit serial autocorrelation; that is, linear association between lagged observations. This suggests past observations might predict current observations. The autoregressive (AR) process models the conditional mean of y_t as a function of past observations, y_{t-1}, y_{t-2}, …, y_{t-p}. An AR process that depends on p past observations is called an AR model of degree p, denoted by AR(p).

The form of the AR(p) model in Econometrics Toolbox is

    y_t = c + φ_1 y_{t-1} + ⋯ + φ_p y_{t-p} + ε_t,    (Equation 5-6)

where ε_t is an uncorrelated innovation process with mean zero.
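As an illustrative sketch (in NumPy rather than Econometrics Toolbox, with assumed coefficient values), the following simulates an AR(2) process of the form in Equation 5-6:

```python
import numpy as np

# Sketch, not toolbox code: simulate an AR(2) process
#     y_t = c + phi_1*y_{t-1} + phi_2*y_{t-2} + eps_t
# with Gaussian innovations. c and phi are assumed example values.
rng = np.random.default_rng(0)

c = 0.5
phi = np.array([0.6, 0.2])        # phi_1, phi_2; a stable choice
n, burn = 5000, 200

y = np.zeros(n + burn)
eps = rng.standard_normal(n + burn)   # uncorrelated, mean-zero innovations
for t in range(2, n + burn):
    y[t] = c + phi[0] * y[t - 1] + phi[1] * y[t - 2] + eps[t]
y = y[burn:]                      # discard burn-in transient
```

For this stable choice of coefficients, the sample mean of the simulated path settles near the unconditional mean c / (1 − φ_1 − φ_2) rather than near c.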
The signs of the coefficients in the AR lag operator polynomial,

    φ(L) = 1 - φ_1 L - φ_2 L^2 - ⋯ - φ_p L^p,

are opposite to those on the right side of Equation 5-6. When specifying and interpreting AR coefficients in Econometrics Toolbox, use the form in Equation 5-6.
Consider the AR(p) model in lag operator notation,

    φ(L) y_t = c + ε_t.

In this form, you can solve for y_t,

    y_t = μ + φ⁻¹(L) ε_t,    (Equation 5-8)

where

    μ = c / (1 - φ_1 - ⋯ - φ_p)

is the unconditional mean of the process, and φ⁻¹(L) is an infinite-degree lag operator polynomial, 1 + ψ_1 L + ψ_2 L^2 + ⋯.
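The ψ weights of φ⁻¹(L) can be computed by long division of 1 by φ(L), which gives the recursion ψ_j = φ_1 ψ_{j-1} + ⋯ + φ_p ψ_{j-p} with ψ_0 = 1. A sketch with assumed AR(2) coefficients (not toolbox code):

```python
import numpy as np

# Sketch with assumed example values: compute the unconditional mean mu
# and the psi weights of phi^{-1}(L) = 1 + psi_1*L + psi_2*L^2 + ...
c = 0.5
phi = np.array([0.6, 0.2])           # phi_1, phi_2 (a stable polynomial)

mu = c / (1 - phi.sum())             # unconditional mean in Equation 5-8

# Recursion from long division of 1 by phi(L):
#   psi_0 = 1,  psi_j = phi_1*psi_{j-1} + phi_2*psi_{j-2}  (psi_j = 0, j < 0)
psi = np.zeros(200)
psi[0] = 1.0
for j in range(1, len(psi)):
    for i, p in enumerate(phi, start=1):
        if j - i >= 0:
            psi[j] += p * psi[j - i]
```

Evaluating φ⁻¹(L) at L = 1 shows that the ψ weights sum to 1 / (1 − φ_1 − ⋯ − φ_p), so μ = c · Σψ_j; the partial sums converge because the ψ_j decay geometrically for a stable polynomial.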
Note: The Constant property of an arima model object corresponds to c, and not the unconditional mean μ.
By Wold's decomposition (Wold 1938), Equation 5-8 corresponds to a stationary stochastic process provided the coefficients ψ_i are absolutely summable. This is the case when the AR polynomial, φ(L), is stable, meaning all of its roots lie outside the unit circle.
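The root condition is straightforward to check numerically. A sketch of such a check (this is an illustration, not the toolbox's implementation):

```python
import numpy as np

# Sketch: test whether the AR polynomial
#     phi(L) = 1 - phi_1*L - ... - phi_p*L^p
# is stable, i.e., all of its roots lie outside the unit circle.
def is_stable(phi):
    # np.roots expects coefficients from the highest power down, so
    # phi(L) becomes [-phi_p, ..., -phi_1, 1].
    coeffs = np.concatenate((-np.asarray(phi, dtype=float)[::-1], [1.0]))
    return bool(np.all(np.abs(np.roots(coeffs)) > 1.0))
```

For example, φ = (0.6, 0.2) is stable, while a single coefficient of 1.1 is not, and (0.5, 0.5) fails because φ(L) then has a unit root at L = 1.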
Econometrics Toolbox enforces stability of the AR polynomial. When you specify an AR model using arima, you get an error if you enter coefficients that do not correspond to a stable polynomial. Similarly, estimate imposes stationarity constraints during estimation.
References

Wold, H. A Study in the Analysis of Stationary Time Series. Uppsala, Sweden: Almqvist & Wiksell, 1938.