
The Akaike information criterion (AIC) quantifies the preceding trade-off argument.^5 In general, every model with k parameters is associated with an AIC number as follows:


\mathrm{AIC} = n \log\left( \frac{1}{n} \sum_{i=1}^{n} e_i^2 \right) + 2k \qquad (2.13)


where e_i is the forecast error on the ith data point. Here, the first term represents the goodness of fit, and the second term is the bias correction, a penalty for model complexity. For every additional variable, the second term increases by a value of 2. However, when a variable is added, we expect the fit to improve and the variance of the forecast error to go down. If the resulting reduction in the first term is more than 2, then the AIC value for the model with the additional variable will be lower, and we will have got our proverbial bang for the buck. If it is higher, then the trade-off is not worth it, and we stick with the current model.
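As a minimal sketch of this keep-or-reject decision (the forecast-error arrays and parameter counts below are hypothetical, and numpy is assumed), the AIC of equation (2.13) can be computed directly from the forecast errors:

```python
import numpy as np

def aic(e, k):
    """AIC per equation (2.13): n * log(mean squared forecast error) + 2k."""
    n = len(e)
    return n * np.log(np.sum(e**2) / n) + 2 * k

# Hypothetical forecast errors from two nested models.
rng = np.random.default_rng(1)
e_small = rng.normal(scale=1.0, size=100)  # k = 2 parameters
e_big   = rng.normal(scale=0.9, size=100)  # k = 3: one extra variable, tighter errors

# Keep the extra variable only if it lowers the AIC.
print(aic(e_small, 2), aic(e_big, 3))
```

The penalty of 2 per parameter only matters when the error variance barely moves; a genuine reduction in forecast error dominates the n log(·) term.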
The rationale for the AIC formula and the quantitative value used for the trade-off have a strong foundation in information theory and are far from arbitrary. Further follow-up material on this can be found in the reference section.


Example


The application of the AIC idea is illustrated in the following exercise. An AR(3) time series that was generated is shown in Figure 2.5a. AR models of various orders were fit to it, and the AIC values were calculated. The result is plotted in Figure 2.5b. The x-axis denotes the number of parameters in the AR



Time Series 27


FIGURE 2.5A AR(3) Series.

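The exercise above can be sketched in Python. This is a hypothetical reconstruction, not the book's code: the generating coefficients, series length, and range of orders tried are assumptions, and the AR fits are done by plain least squares with numpy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generate an AR(3) series (coefficients are illustrative, not the book's).
n = 500
phi = np.array([0.5, -0.3, 0.2])
x = np.zeros(n)
eps = rng.normal(size=n)
for t in range(3, n):
    x[t] = phi @ x[t-3:t][::-1] + eps[t]  # phi applied to lags 1, 2, 3

def fit_ar_aic(x, p):
    """Fit AR(p) by least squares and return the AIC of its residuals."""
    n = len(x)
    # Design matrix of lagged values: column j holds lag j+1.
    X = np.column_stack([x[p - j - 1:n - j - 1] for j in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ coef
    m = len(y)
    return m * np.log(np.sum(e**2) / m) + 2 * p  # equation (2.13)

aics = {p: fit_ar_aic(x, p) for p in range(1, 7)}
best = min(aics, key=aics.get)
print(aics)
print("order chosen by AIC:", best)
```

Plotting the values of `aics` against p reproduces the shape of Figure 2.5b: the AIC falls sharply while genuine lags are being added, then creeps up as the 2k penalty outweighs the marginal fit.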

(^5) AIC is but one of many cost functions. The Schwarz information criterion (SIC) and the Bayesian information criterion (BIC) are also popular.
