the data and the parameters. These conditions typically take the form of
equations where the expectation of given functions is equated to zero:
E[h_i(X, β)] = 0
In our illustrations, these conditions were the conditions that defined the
first three moments of a normal variable.
The GMM replaces these expectations with their sample averages, constructing the vector:
g(β) = (1/T) Σ_{t=1}^{T} h_i(X_t, β)
where T is the number of available samples. The GMM then determines β by minimizing the quadratic form Q(β) = g′Wg, where W is a positive definite weighting matrix.
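As a concrete sketch (an illustration added here, not taken from the text), the normal-variable example can be estimated by GMM in Python: the first three central-moment conditions are averaged over the sample, and the quadratic form is minimized with the identity matrix standing in for the weighting matrix W.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data (illustrative): draws from a normal with mu = 2.0, sigma = 1.5
rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.5, size=5000)

def g(theta):
    """Sample averages of the moment conditions h_i(X_t, beta).

    For a normal variable, the first three central-moment conditions are
    E[x - mu] = 0, E[(x - mu)^2 - sigma^2] = 0, E[(x - mu)^3] = 0.
    """
    mu, sigma = theta
    h = np.column_stack([x - mu,
                         (x - mu) ** 2 - sigma ** 2,
                         (x - mu) ** 3])
    return h.mean(axis=0)

def Q(theta, W=np.eye(3)):
    """Quadratic form g'Wg; here W is simply the identity matrix."""
    gv = g(theta)
    return gv @ W @ gv

res = minimize(Q, x0=[0.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], abs(res.x[1])  # sigma enters only as sigma^2
```

With 5,000 observations the minimizer lands close to the true parameters; an efficient GMM estimator would instead set W to the inverse of the covariance matrix of the moment conditions.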
The M-Estimation Method and M-Estimators
Both the LS and ML methods are based on minimizing or maximizing a function of the data. This approach has been generalized: M-estimators are estimators obtained by minimizing (or maximizing) given functions of the data and parameters. This generalization proved fruitful in the field of robust statistics, which is described in Appendix F and in Chapter 8 on robust regressions. By choosing appropriate functions to be minimized, estimation can give less weight to observations that fall very far from the mean, thereby making the estimators robust.
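A minimal sketch of this idea (an illustration added here, with hypothetical data): an M-estimator of location that replaces the squared error of least squares with the Huber function, which is quadratic near zero but linear in the tails, so extreme observations receive less weight.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative data: a tight cluster near 1.0 plus two gross outliers
x = np.array([1.0, 1.2, 0.9, 1.1, 1.05, 0.95, 50.0, -40.0])

def huber_objective(c, delta=1.345):
    """Sum of Huber rho functions of the residuals x - c.

    Residuals within delta contribute quadratically (as in least squares);
    larger residuals contribute only linearly, bounding their influence.
    """
    r = np.abs(x - c)
    return np.sum(np.where(r <= delta,
                           0.5 * r ** 2,
                           delta * (r - 0.5 * delta)))

m_est = minimize_scalar(huber_objective).x
sample_mean = x.mean()  # the LS estimate of location, pulled toward the outliers
```

The sample mean here is about 2.0, dragged away from the cluster by the outliers, while the Huber M-estimate stays near 1.0.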
Key Points
■ Inferential statistics infer the properties of a population from a sample.
■ Estimation is a set of methods to determine population parameters from a sample. An estimator is a function of sample data.
■ The estimation methods commonly used in financial econometrics are the least squares method, maximum likelihood method, method of moments, and Bayesian method.
■ The least squares estimation method estimates parameters by minimizing the sum of squared residuals.
■ Least squares estimators of standard regressions, called ordinary least squares (OLS) estimators, are linear functions of the sample data.
■ Least squares estimators of regressions are the best linear unbiased estimators under the standard Gauss-Markov assumptions.
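The OLS key points above can be sketched numerically (a hypothetical example added here): the estimator β̂ = (X′X)⁻¹X′y is computed in closed form and is a linear function of the observed y.

```python
import numpy as np

# Illustrative data generated from y = 1 + 2*x + noise
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=100)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, size=100)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# OLS estimator: solve the normal equations (X'X) beta = X'y.
# beta_hat is a linear function of the sample data y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

The estimates recover the intercept and slope up to sampling noise, which shrinks as the sample grows.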