Palgrave Handbook of Econometrics: Applied Econometrics


28 Methodology of Empirical Econometric Modeling


an empirical model corresponds to its encompassing the LDGP (so not deviating significantly from it in any of the first five directions) (see Bontemps and Mizon, 2003). Testing the selected model against all extant models of the same variables allows a rigorous evaluation of its “closeness” to the LDGP (see, inter alia, White, 1990; Mayo and Spanos, 2006).


Parameter dependence. Fourth, the resulting coefficients in (1.7) or (1.9) remain dependent on the initial DGP parameters. If those DGP parameters change, induced shifts can occur in the parameters of the LDGP. The extent to which these shifts occur, and when they do so, whether they can be anticipated, modeled or even understood, will depend on how usefully the reduced representation captures the structure of the relevant sub-set of the economy under analysis. Here, “structure” denotes invariance under extensions of the information set over (i) time (i.e., constancy), (ii) regimes (i.e., changes to marginal distributions or policy variables) and (iii) variables (so the reductions did not eliminate any important explanatory factors). When the initial economic analysis that led to the specification of {x_t} (i.e., the transformed sub-set of data under analysis) actually captured the main features of the behavior of the agents involved, then ρ_0, or κ_1, should be an invariant that also throws light on the agents’ decision parameters underlying φ^1_T in (1.2). Thus, properly embedded in a general congruent model, the economics should carry through.
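The failure of invariance under a regime change can be illustrated with a deliberately stylized simulation. Everything here is hypothetical and not from the chapter: a toy DGP y = x + 0.5z + e with x correlated with z, where the "reduction" marginalizes z out. Shifting the mean of z's marginal distribution (a regime change of type (ii) above) leaves the DGP parameters intact, yet shifts the reduced model's intercept, whereas the model that conditions on both variables keeps constant coefficients:

```python
import numpy as np

def simulate(mu_z, n=50_000, seed=0):
    """Hypothetical DGP: y = 1.0*x + 0.5*z + e, with x correlated
    with z.  Shifting z's marginal mean mu_z (a 'regime change')
    leaves the DGP parameters (1.0, 0.5) unchanged."""
    rng = np.random.default_rng(seed)
    z = mu_z + rng.standard_normal(n)
    x = 0.8 * z + rng.standard_normal(n)      # x depends on z
    y = 1.0 * x + 0.5 * z + rng.standard_normal(n)
    return x, y, z

def ols(X, y):
    """Least-squares coefficients of y on the columns of X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def fit_regimes():
    """Fit reduced (z marginalized out) and full models in two regimes."""
    results = {}
    for mu_z in (0.0, 2.0):                   # two regimes for z's mean
        x, y, z = simulate(mu_z)
        const = np.ones_like(x)
        # Reduced model: intercept and x only.
        reduced = ols(np.column_stack([const, x]), y)
        # Full model: conditions on both x and z.
        full = ols(np.column_stack([const, x, z]), y)
        results[mu_z] = (reduced, full)
    return results

if __name__ == "__main__":
    for mu_z, (reduced, full) in fit_regimes().items():
        print(f"mu_z={mu_z}: reduced={reduced.round(2)}, full={full.round(2)}")
```

Across the two regimes the full model recovers (0, 1.0, 0.5) both times, while the reduced model's intercept moves with the omitted variable's mean — a location shift of exactly the kind that produces non-constant models and forecast failure.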


Minimizing reductions. Finally, given the inertial dynamics of a high dimensional,
interdependent and non-stationary system like an economy, reductions seem likely
to be costly in practice and involve real information losses. These will manifest
themselves through non-constant models, empirical “puzzles” and poor forecasts,
so general systems seem generically preferable. “Errors” on empirical models are
created by reductions, so will be highly composite, reflecting many components.
It is unclear whether that also favors disaggregation, given problems posed by
measurement errors and heterogeneity difficulties as disaggregation increases, or
whether a “law of large numbers” may induce substantial offsets (as discussed
above).


1.4.2.5 Evaluating the three main approaches


We now consider how the three basic approaches fare against the above analysis.
Given their assumptions, each would of course work well; and with sufficiently
rigorous testing, the choice of approach becomes a matter of research efficiency (see
White, 1990). But efficiency is important, as the assumptions may not adequately
characterize reality, and rigorous attempts to reject are not always undertaken.


Imposing economic theory. First, if one simply imposes an a priori theory on the data,
then the outcome will be excellent when the theory is complete (relative to the
issue under analysis) and “correct” (in that all omissions are relatively negligible).
Otherwise, it is difficult to ascertain in general how poor the outcome will be (see,
e.g., Juselius and Franchi, 2007). If no testing occurs, that strategy is both highly
