predict the future variance of returns as a function of past returns
(i.e., conditional on previous returns).
However, it has been demonstrated that a sequence can be conditionally
heteroscedastic even though it is stationary, and this is a key point.
Though it might seem counterintuitive, a sequence might be stationary, with
constant unconditional variance, and still its observed values (referred to as
realizations) might exhibit periods when fluctuations are larger and periods
when fluctuations are smaller. That is, the variance changes over time
conditional on previous values of the series.
Modeling ARCH Behavior
Robert Engle first introduced a model of conditional heteroscedasticity
in 1982. Engle's choice was to model conditional heteroscedasticity as an
autoregressive process; hence the model is called the autoregressive
conditional heteroscedasticity (ARCH) model.
To explain how to model ARCH behavior, we will use asset returns
(although the modeling can be applied to any financial variable), denoting
the return at time t by $R_t$. Assume that the behavior of asset returns $R_t$ is
described by the following process:
$$R_t = \sigma_t \varepsilon_t \qquad (11.1)$$
where $\sigma_t$ is the standard deviation of the return at time t and $\varepsilon_t$ is a sequence
of independent normal variables with mean zero and variance one. In equation
(11.1) we assume that returns have mean zero or, realistically, that a
constant mean has been subtracted from returns.
The simplest ARCH model requires that the following relationship
holds:
$$\sigma_t^2 = c + a_1 R_{t-1}^2 \qquad (11.2)$$
where $\sigma_t^2$ is the variance of the return at time t and $c$ and $a_1$ are constants
to be determined via estimation. In plain English, this model states that the
variance of an asset’s return at any time t depends on a constant term plus
the product of a constant term and the square of the previous time period’s
return. Because this model involves the return for one prior time period (i.e.,
one-period lag), it is referred to as an ARCH(1) model where (1) denotes a
one-period lag.
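To make the mechanics concrete, the following is a minimal simulation sketch of an ARCH(1) process built directly from equations (11.1) and (11.2). The function name simulate_arch1 and the parameter values c = 0.1 and a1 = 0.6 are illustrative assumptions, not values taken from the text or estimated from data.

```python
import numpy as np

def simulate_arch1(c, a1, n, seed=0):
    """Simulate n returns from an ARCH(1) process:
    R_t = sigma_t * eps_t,  sigma_t^2 = c + a1 * R_{t-1}^2."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)           # i.i.d. N(0, 1) shocks
    returns = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = c / (1.0 - a1)             # start at the unconditional variance
    returns[0] = np.sqrt(sigma2[0]) * eps[0]
    for t in range(1, n):
        sigma2[t] = c + a1 * returns[t - 1] ** 2   # equation (11.2)
        returns[t] = np.sqrt(sigma2[t]) * eps[t]   # equation (11.1)
    return returns, sigma2

# Illustrative (hypothetical) parameter values
R, s2 = simulate_arch1(c=0.1, a1=0.6, n=5000)
print("sample variance of returns:", R.var())        # should be near c/(1 - a1) = 0.25
print("conditional variance range:", s2.min(), s2.max())
```

In a run of this sketch, the sample variance of the simulated returns should stay close to the constant unconditional level, while the conditional variance series alternates between calm and turbulent stretches, which is precisely the behavior described above.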
We need to impose conditions on the parameters $c$ and $a_1$ to ensure that
the variance $\sigma_t^2$ is greater than 0 and that the returns $R_t$ are stationary. A
process that assumes only values greater than 0 is called a positive process.