The Mathematics of Financial Modeling and Investment Management

Financial Econometrics: Model Selection, Estimation, and Testing

Recently, the theory of learning has been given a firm theoretical
basis by Vladimir Vapnik and Alexey Chervonenkis.^8 The Vapnik-Chervonenkis
(VC) theory of learning is a complex theoretical framework
for learning that, when applicable, is able to give precise theoretical
bounds on the learning abilities of models. The VC theory has been
applied in the context of nonlinear models, giving rise to the so-called
Support Vector Machines. Though its theoretical foundation is solid,
the practical application of the VC theory is complex, and it has not
yet found a broad following in the world of econometrics.

MAXIMUM LIKELIHOOD ESTIMATE


Once the dimensionality of the model has been chosen, parameters need
to be estimated. This is the somewhat firmer ground of statistical esti-
mation. An estimator of a parameter is a statistic, that is, a function
computed on the sample data. For instance, the empirical average

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$$

of an n-sample is an estimator of the population mean. An estimator is
called unbiased if its expected value coincides with the theoretical
parameter. An estimator is called consistent if a sequence of estimators
computed on a sequence of samples whose size tends to infinity con-
verges to the true theoretical value of the parameter.
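The two properties just defined, unbiasedness and consistency, can be illustrated by simulation. The following is a minimal sketch in Python; the true mean, sample sizes, and number of replications are illustrative choices, not taken from the text.

```python
import random

random.seed(0)

def sample_mean(xs):
    # The empirical average: an estimator of the population mean
    return sum(xs) / len(xs)

true_mean = 5.0

# Unbiasedness: the expected value of the estimator coincides with the
# theoretical parameter. Averaging the estimator over many independent
# samples approximates this expectation.
estimates = [sample_mean([random.gauss(true_mean, 2.0) for _ in range(50)])
             for _ in range(2000)]
avg_estimate = sum(estimates) / len(estimates)

# Consistency: as the sample size tends to infinity, the estimate
# converges to the true parameter value.
large_sample_estimate = sample_mean(
    [random.gauss(true_mean, 2.0) for _ in range(100_000)])

print(avg_estimate, large_sample_estimate)
```

Both printed numbers should lie close to the true mean of 5.0, the first because the estimator is unbiased, the second because it is consistent.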
An estimator is a stochastic quantity when computed on a sample.
Given a model, the distribution of the estimator on samples of a given
size is determined and can be computed. Different estimators of the
same parameters will be characterized by different distributions when
computed on samples of the same size. The variance of the estimator’s
distribution is an indication of the quality of the approximation offered
by the estimator. An efficient estimator has the lowest possible variance.
A lower bound on the variance of an unbiased estimator is given by the
Cramér-Rao bound.
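The point that different estimators of the same parameter have different sampling distributions, and hence different variances, can be sketched by simulation. In the example below, assumed for illustration, both the sample mean and the sample median estimate the mean of a normal population; for normal data the sample mean is the efficient estimator, and the median's sampling variance is roughly π/2 times larger.

```python
import random
import statistics

random.seed(1)

def sampling_variance(estimator, n_samples=3000, n=50, mu=0.0, sigma=1.0):
    # Approximate the variance of an estimator's sampling distribution by
    # applying it to many independent samples of the same size n.
    draws = [estimator([random.gauss(mu, sigma) for _ in range(n)])
             for _ in range(n_samples)]
    return statistics.variance(draws)

var_mean = sampling_variance(statistics.mean)      # efficient for normal data
var_median = sampling_variance(statistics.median)  # unbiased, less efficient

print(var_mean, var_median, var_median / var_mean)
```

The ratio printed should be close to π/2 ≈ 1.57, showing that both estimators are unbiased for the same parameter yet differ in quality as measured by variance.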
The Cramér-Rao bound is a theoretical lower bound on the accuracy
of estimates. It can be formulated as follows. Suppose that a population
sample X has a joint density f(x; θ) that depends on a parameter θ and
that Y = g(X) is an unbiased estimator of θ. Y is a random variable that
depends on the sample. The Cramér-Rao bound prescribes a lower

^8 Vapnik, Statistical Learning Theory.
