Palgrave Handbook of Econometrics: Applied Econometrics


Editors’ Introduction


by parametric restrictions seeking to classify some variables as “exogenous,” a task
that some have regarded as misguided (or indeed even “impossible”). Further, a
failure to assess the validity of the reduction process in going from the (unknown)
data-generating process to a statistical representation, notwithstanding criticisms
related to structural identification, stored up nascent empirical failure awaiting the
macroeconometric model. Developments in cointegration theory and practice have
“tightened” up the specification of empirical macromodels, and DSGE models, pre-
ferred theoretically by some, have provided an alternative “modellus operandi.”
Subsequently, the quasi-independence of some central banks has heightened the
practical importance of questions such as “How should a central bank respond to
shocks in macroeconomic variables?” (Favero, Chapter 16). In practice, although
DSGE models are favored for policy analysis, in their empirical form the VAR
reappears, albeit with its own set of issues. Favero considers such practical
developments as calibration and model evaluation, the identification of shocks, impulse
responses, structural stability of the parameters, VAR misspecification and factor
augmented VARs. A summary and analysis of Sims’ (2002) small macroeconomic
model (Appendix A) helps the reader to understand the relationship between an
optimizing specification and the resultant VAR model.
In Chapter 17, Gunnar Bårdsen and Ragnar Nymoen provide a paradigm for
the construction of a dynamic macroeconometric model, which is then illus-
trated with a small econometric model of the Norwegian economy that is used
for policy analysis. Bårdsen and Nymoen note the two central critiques of “failed”
macroeconometric models: the Lucas (1976) critique and the Clements and Hendry
(1999) analysis of forecast failure involving “location” shifts (rather than behavioral
parameter shifts). These critiques have led to different responses: first, the
move to explicit optimizing models (see Chapter 16); and, alternatively, to greater
attention to the effects of regime shifts, viewing the Lucas critique as a possibility
theorem rather than a truism (Ericsson and Irons, 1995). Whilst it is de rigueur
to accept that theory is important, Bårdsen and Nymoen consider whether “the-
ory” provides the (completely) correct specification or whether it simply provides a
guideline for the specification of an empirical model. In their approach, the under-
lying economic model is nonlinear and specified in continuous time; hence, the
first practical steps are linearization and discretization, which result in an equilib-
rium correction model (EqCM). Rather than remove the data trends, for example
by applying the HP filter, the common trends are accounted for through a cointe-
gration analysis. The approach is illustrated step by step by building a small-scale
econometric model of the Norwegian economy, which supports monetary policy
analysis; for example, tracing the effects of an increase in the market rate reveals
the channels through which monetary policy operates. Further empirical analysis of the
New Keynesian Phillips curve provides an opportunity to illustrate their approach
in another context. In summary, Bårdsen and Nymoen note that cointegration
analysis takes into account non-stationarities that arise through unit roots, so that
forecast failures are unlikely to be attributable to misspecification for that reason.
In contrast to the econometric models of the 1970s, the real challenges arise from
non-stationarities in functional relationships due to structural breaks; however,
