
and so empirical results are open to question; however, as Hendry shows, it is
possible to formalize a theory of applied econometrics which provides a coher-
ent basis for empirical work. Chapter 1 is a masterful and accessible synthesis and
extension of Hendry’s previous ideas and is likely to become compulsory reading
for courses in econometrics, both theory and applied; moreover, it is complemented by
two applications using the Autometrics software (Doornik, 2007). The first appli-
cation extends the work of Magnus and Morgan (1999) on US food expenditure,
which was itself an update of a key study by Tobin (1950) estimating a demand
function for food. This application shows the Autometrics algorithm at work in a
simple context. The second application extends the context to a multiple equation
setting relating industrial output, the number of bankruptcies and patents, and real
equity prices. These examples illustrate the previously outlined theory of applied
econometrics combined with the power of the Autometrics software.
In Chapter 2, Fabio Canova addresses the question of how much structure there
should be in empirical models. This has long been a key issue in econometrics, and
some old questions, particularly those of identification and the meaning of struc-
ture, resurface here in a modern context. The last twenty years or so have seen
two key developments in macroeconometrics. One has been the development of
dynamic stochastic general equilibrium (DSGE) models. Initially, such models were
calibrated rather than estimated, with the emphasis on “strong” theory in their
specification; however, as Canova documents, more recently likelihood-based esti-
mation has become the dominant practice. The other key development has been
that of extending the (simple) vector autoregression (VAR) to the structural VAR
(SVAR) model. Although both approaches involve some structure, DSGE models,
under the presumption that the model is correct, rely more on an underlying the-
ory than do SVARs. So which should be used to analyze a particular set of problems?
As Canova notes: “When addressing an empirical problem with a finite amount of
data, one has ... to take a stand on how much theory one wants to use to structure
the available data prior to estimation.” Canova takes the reader through the advan-
tages and shortcomings of these methodologies; he provides guidance on what to
do, and what not to do, and outlines a methodology that combines elements of
both approaches.
In Chapter 3, John DiNardo addresses some philosophical issues that are at
the heart of statistics and econometrics, but which rarely surface in economet-
ric textbooks. As econometricians, we are, for example, used to working within
a probabilistic framework, but we rarely consider issues related to what probabil-
ity actually is. To some degree, we have been prepared to accept the axiomatic
or measure-theoretic approach to probability, due to Kolmogorov, and this has
provided us with a consistent framework that most are happy to work within.
Nevertheless, there is one well-known exception to this unanimity: the assignment
and interpretation of probability measures and, in particular, the interpretation
of some key conditional probabilities, where much depends on whether one adopts a
Bayesian or non-Bayesian perspective. In part, the debate that DiNardo discusses
relates to the role of the Bayesian approach, but it is more than this; it concerns
metastatistics and philosophy, because, in a sense, it relates to a discussion of the
