Fourth, computer-intensive resampling methods, such as the bootstrap and jackknife, are increasingly used as substitutes for analytically complex computations such as sample estimates of asymptotic variances (a minimal bootstrap sketch follows this overview).
Finally, cross-validation is another computer-intensive tool that is useful not
only for parameter tuning (as illustrated in section 15.4) but also for model
evaluation and comparison.
In the remainder of this section we define and outline each type of application.
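As a concrete illustration of the resampling idea mentioned above, the following is a minimal sketch of a nonparametric bootstrap estimate of a standard error, written in Python with NumPy; the simulated data, the statistic (a sample mean), and the number of replications are assumptions made here for illustration only, not choices taken from this chapter.

```python
import numpy as np

def bootstrap_se(data, statistic, B=999, seed=42):
    """Nonparametric bootstrap standard error of `statistic`.

    Resamples the data with replacement B times and returns the
    standard deviation of the statistic across replications.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.empty(B)
    for b in range(B):
        resample = rng.choice(data, size=n, replace=True)
        reps[b] = statistic(resample)
    return reps.std(ddof=1)

# Illustrative use: bootstrap SE of the sample mean of simulated data.
x = np.random.default_rng(0).exponential(scale=2.0, size=200)
print(bootstrap_se(x, np.mean))          # bootstrap estimate
print(x.std(ddof=1) / np.sqrt(len(x)))   # analytical counterpart
```

For a statistic as simple as the mean the analytical formula is of course available; the computational appeal of the bootstrap lies in cases where no such closed form exists.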
15.3.1 Data summary and visualization
A starting point of almost any empirical microeconometric application involves
providing data summaries. The most common manifestation of this takes the form
of a table of sample moments, such as means, variances, skewness and kurtosis.
However, visual data summaries, such as histograms and kernel (marginal) density
plots, are often a more efficient way of providing information (see, e.g., Huynh
and Jacho-Chávez, 2007). The kernel density estimator is a generalization of the
histogram estimate using kernel weights, $k(\cdot)$, that integrate to 1. These weights depend on a smoothing parameter, $h$, called the bandwidth; $2h$ is the window width (see section 15.4.1 for definitions and examples). Given $k(\cdot)$ and $h$ the estimator is easy to implement, and if it is evaluated at $r$ distinct values, its computation requires at most $nr$ operations when the kernel has unbounded support. However, as is evident even in the most elementary computation of a histogram, constructing such visual displays involves the choice of bin size, and the results may be sensitive to the choice of $h$. This motivates treating $h$ as an unknown parameter to be chosen by the data, i.e., cross-validation. Such a method is inherently more computer-intensive, as will be shown in section 15.4. An extension of this concept, also considered in the next section, is the estimator of the conditional probability density function, which involves considerations similar to those in the estimation of marginal densities.
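To fix notation, the kernel density estimator takes the Rosenblatt–Parzen form $\hat{f}(x) = (nh)^{-1}\sum_{i=1}^{n} k\left((x - X_i)/h\right)$. The following is a minimal Python/NumPy sketch of this estimator with a Gaussian kernel; the simulated data, the evaluation grid, and the normal reference rule-of-thumb bandwidth are illustrative assumptions, not choices made in this chapter.

```python
import numpy as np

def kernel_density(x_eval, data, h):
    """Rosenblatt-Parzen estimator f_hat(x) = (1/(n*h)) * sum_i k((x - X_i)/h)
    with a Gaussian kernel k, evaluated at each point of `x_eval`.

    Vectorized, but the cost is still O(n*r): n data points
    times r evaluation points.
    """
    u = (x_eval[:, None] - data[None, :]) / h       # r-by-n scaled distances
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)    # Gaussian kernel weights
    return k.mean(axis=1) / h

# Illustrative use with a normal reference rule-of-thumb bandwidth
# (an assumption here; data-driven choices are treated in section 15.4).
rng = np.random.default_rng(0)
x = rng.normal(size=500)
h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)       # rule-of-thumb bandwidth
grid = np.linspace(x.min(), x.max(), 100)           # r = 100 evaluation points
f_hat = kernel_density(grid, x, h)
```

The $r \times n$ matrix of kernel evaluations makes the $nr$ operation count explicit; a bandwidth chosen by cross-validation, as taken up in section 15.4, would replace the rule-of-thumb line above.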
15.3.2 Numerical optimization
Microeconometrics frequently employs an estimator $\hat{\theta}$ that maximizes a stochastic objective function $Q_n(\theta)$, where usually $\hat{\theta}$ solves the first-order conditions $\partial Q_n(\theta)/\partial\theta = 0$, $n$ being the sample size. The objective function may be a likelihood for parametric models, a weighted sum of squares function for semiparametric models, or a linear function subject to inequality restrictions when the objective function has an $L_1$ (e.g., least absolute deviations) rather than an $L_2$ (e.g., sum of squared residuals) norm. For many nonlinear models there is no closed-form solution of the first-order conditions, only a nonlinear system of equations in the unknown $\theta$. Estimation algorithms therefore use iterative methods to solve the first-order conditions: an updating rule produces a new estimate, $\hat{\theta}_{s+1}$, from a current estimate, $\hat{\theta}_s$. Historically, iterative procedures constituted a computational challenge, but now they are standard.
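The canonical such updating rule is Newton–Raphson, $\hat{\theta}_{s+1} = \hat{\theta}_s - \left[\partial^2 Q_n(\theta)/\partial\theta\,\partial\theta'\right]^{-1} \partial Q_n(\theta)/\partial\theta$, with both derivatives evaluated at $\hat{\theta}_s$. The following is a minimal sketch of these iterations for a logit log-likelihood; the simulated data, starting value, and convergence tolerance are illustrative assumptions.

```python
import numpy as np

def logit_newton(X, y, tol=1e-8, max_iter=25):
    """Newton-Raphson iterations on the logit log-likelihood.

    Solves the first-order conditions X'(y - p) = 0 by repeatedly
    updating theta_{s+1} = theta_s + (X'WX)^{-1} X'(y - p),
    where p = Lambda(X theta) and W = diag(p * (1 - p)).
    """
    theta = np.zeros(X.shape[1])                  # starting value (an assumption)
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-X @ theta))      # fitted probabilities
        grad = X.T @ (y - p)                      # score vector dQ_n/dtheta
        hess = X.T @ (X * (p * (1 - p))[:, None]) # negative Hessian X'WX
        step = np.linalg.solve(hess, grad)        # Newton step
        theta += step
        if np.max(np.abs(step)) < tol:            # stop once the step is tiny
            break
    return theta

# Illustrative use on simulated data with true parameters (0.5, 1.0).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(1000), rng.normal(size=1000)])
p_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * X[:, 1])))
y = (rng.uniform(size=1000) < p_true).astype(float)
print(logit_newton(X, y))   # estimates should be near (0.5, 1.0)
```

Note that each iteration solves a linear system rather than inverting the Hessian, a standard numerical choice.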
When the objective function is in the $L_2$ norm, gradient methods are most common. Non-gradient