however, cases where the sum of an infinite sequence of numbers is actually
a finite value.³ Let us denote the value of the time series at instant t as


y_t = e_t + α e_{t-1} + α^2 e_{t-2} + …    (2.5)

The infinite moving average representation above is called the MA(∞) representation. To simplify Equation 2.5, consider the value of the time series at t – 1. It is given as


y_{t-1} = e_{t-1} + α e_{t-2} + α^2 e_{t-3} + …    (2.6)

Examining the two equations, note that we can write y_t in terms of y_{t-1} as follows:


y_t = α y_{t-1} + e_t    (2.7)
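As a quick sanity check on the equivalence of Equations 2.5 and 2.7, the sketch below (the coefficient value and series length are illustrative choices, not taken from the text) builds a series recursively via the AR(1) recursion and compares one of its values against the truncated MA(∞) sum:

```python
import numpy as np

# Sketch: verify that the AR(1) recursion y_t = a*y_{t-1} + e_t (Equation 2.7)
# reproduces the truncated MA(inf) sum y_t = e_t + a*e_{t-1} + a^2*e_{t-2} + ...
# (Equation 2.5). The values a = 0.5 and n = 500 are illustrative.
rng = np.random.default_rng(0)
a = 0.5                       # |a| < 1 so the infinite sum converges
e = rng.standard_normal(500)  # white noise realizations e_0 ... e_499

# Build the series recursively (Equation 2.7).
y = np.zeros_like(e)
y[0] = e[0]
for t in range(1, len(e)):
    y[t] = a * y[t - 1] + e[t]

# Build the final value directly from the MA representation (Equation 2.5),
# truncating the infinite sum at the start of the sample; the omitted tail
# terms carry a factor a^500 and are negligible.
t = len(e) - 1
ma_sum = sum(a**k * e[t - k] for k in range(t + 1))

print(abs(y[t] - ma_sum))  # agreement up to floating-point error
```

The recursion and the truncated sum agree to floating-point precision, which is precisely the algebraic identity used to go from Equation 2.5 to Equation 2.7.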

In words, the value at time t is alpha times the value at time t – 1 plus a white noise term. Note that alpha may be viewed as the slope of the regression between two consecutive values of the time series. Since the next value in the
time series is obtained by multiplying the past value with the slope of the
regression, it is called an autoregressive (AR) series. Figure 2.3a is the plot of
the AR time series, generated using the white noise values seen in Figure 2.1.
The corresponding correlogram is shown in Figure 2.3b. Notice that the
correlation values fall off gradually with increasing lag values; that is, there
is not much of a sharp drop. To get an insight into why that is, let us apply
the same kind of reasoning as we did for the MA model. The value at each time step contains additive contributions from all the previous white noise realizations.
Therefore, there will always be white noise realizations that are common be-
tween two values of the time series however far apart they may be. Natu-
rally, we can expect there to be some correlation between any two values in
the time series regardless of the time interval between them. It is therefore
not surprising that the correlation exhibits a slow decay.
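This slow decay can be seen numerically. The sketch below (the coefficient 0.7, sample size, and the helper `sample_autocorr` are illustrative, not from the text) simulates an AR(1) series and compares its sample autocorrelation at several lags against the geometric pattern α^lag:

```python
import numpy as np

# Sketch: the sample autocorrelation of an AR(1) series decays gradually,
# roughly as a**lag, rather than dropping sharply as for an MA series.
# a = 0.7 and n = 20_000 are illustrative choices.
rng = np.random.default_rng(1)
a, n = 0.7, 20_000
e = rng.standard_normal(n)

y = np.zeros(n)
for t in range(1, n):
    y[t] = a * y[t - 1] + e[t]

def sample_autocorr(x, lag):
    """Correlation between the series and itself shifted by `lag`."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Compare the measured autocorrelation with the geometric decay a**lag.
for lag in (1, 2, 5, 10):
    print(lag, round(sample_autocorr(y, lag), 3), round(a**lag, 3))
```

The measured values shrink gradually with the lag instead of cutting off, matching the correlogram behavior described above.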
To answer the predictability question, here, too, as in the moving average case, knowledge of the past values of the time series is helpful in predicting what the next value is likely to be. In this case we have

y_t^pred = α y_{t-1}

The conditional variance of the predicted value would be the variance of e_t, which is the same as the variance of the white noise used to construct the time series.
The one-step autoregressive series may be extended to an autoregressive (AR) series of order p, denoted as AR(p). The value at time t is given as

y_t = α_1 y_{t-1} + α_2 y_{t-2} + … + α_p y_{t-p} + e_t
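The prediction rule and its conditional variance can be checked empirically. In the sketch below (the values α = 0.5 and σ = 2.0 are illustrative assumptions), the one-step predictions are formed as α times the previous value, and the variance of the prediction errors is compared with the white noise variance:

```python
import numpy as np

# Sketch: for an AR(1) series the one-step prediction is y_pred = a * y[t-1],
# and the prediction errors are exactly the white noise terms, so their
# variance matches the white noise variance. a = 0.5, sigma = 2.0 illustrative.
rng = np.random.default_rng(2)
a, sigma, n = 0.5, 2.0, 50_000
e = sigma * rng.standard_normal(n)

y = np.zeros(n)
for t in range(1, n):
    y[t] = a * y[t - 1] + e[t]

pred = a * y[:-1]       # one-step-ahead predictions for y[1:]
resid = y[1:] - pred    # prediction errors; by construction these are e[1:]

# conditional variance of the prediction ~= variance of the white noise
print(resid.var(), sigma**2)
```

Since y_t − α y_{t-1} = e_t by Equation 2.7, the residuals recover the noise terms exactly, and their sample variance sits near σ² = 4.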



³ We touch upon this topic very briefly in the appendix. However, for a full-blown discussion on stability analysis, we recommend that the reader follow up with the references.
