278 The Basics of Financial Econometrics
It can be demonstrated that the estimator of the regression coefficients
becomes
$$
b = \left(X' V^{-1} X\right)^{-1} X' V^{-1} y \tag{13.10}
$$
and that this WLS estimator is unbiased and BLUE. If the residuals are
homoscedastic, the matrix V becomes the identity matrix and the estimator
b becomes the usual expression given by equation (13.5).
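The WLS estimator in equation (13.10) can be sketched numerically. The following is a minimal illustration using numpy on hypothetical simulated data; the regressors, coefficients, and the diagonal covariance matrix V are all made up for the example.

```python
import numpy as np

# Hypothetical data: n observations, intercept plus one regressor
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])

# Heteroscedastic residuals: the diagonal of V grows across observations
v = np.linspace(0.5, 2.0, n)
y = X @ beta_true + rng.normal(scale=np.sqrt(v))

# WLS estimator, equation (13.10): b = (X' V^{-1} X)^{-1} X' V^{-1} y
V_inv = np.diag(1.0 / v)
b_wls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)
```

With V set to the identity matrix, the same formula reproduces the ordinary least squares estimator, consistent with the remark above.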
Generalized Least Squares Method
The WLS estimator can be generalized to the generalized least squares (GLS)
estimator. The GLS estimator applies when residuals are both heteroscedas-
tic and autocorrelated. In this case, equation (13.7) becomes
$$
W = \sigma^2 V, \qquad
V = \begin{pmatrix}
v_{11} & \cdots & v_{1n} \\
\vdots & \ddots & \vdots \\
v_{n1} & \cdots & v_{nn}
\end{pmatrix} \tag{13.11}
$$
where V is now a full symmetric covariance matrix. The GLS estimator
of the regression parameters, conditional on the realization X, is given by
the same expression as in equation (13.10) but where now V is a full cova-
riance matrix. The GLS estimator is linear, unbiased, and has minimum
variance among the linear estimators.
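The GLS estimator with a full covariance matrix can be sketched in the same way. In this hypothetical example the residuals follow an AR(1) process, so V has entries proportional to rho raised to the distance between observation indices; the data, the coefficient values, and the value of rho are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
beta_true = np.array([0.5, 0.1])

# Full symmetric covariance matrix of the residuals (AR(1) pattern)
rho = 0.6
idx = np.arange(n)
V = rho ** np.abs(idx[:, None] - idx[None, :])

# Simulate autocorrelated residuals via the Cholesky factor of V
u = np.linalg.cholesky(V) @ rng.normal(size=n)
y = X @ beta_true + u

# GLS estimator: same expression as equation (13.10), now with a full V
V_inv = np.linalg.inv(V)
b_gls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)
```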
The theory of WLS and that of GLS assumes that the covariance matrix of
the residuals is perfectly known. This is the key limitation in applying
WLS and GLS in practice, where the covariance matrix must itself be
estimated. In an ad hoc iterative procedure, we first estimate the
regression and its residuals with OLS, then estimate the covariance matrix
of those residuals, then estimate a new regression and the corresponding
residuals with GLS, and proceed iteratively.
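The iterative procedure just described can be sketched in its simplest two-step (feasible GLS) form. The AR(1) structure imposed on the residuals and all numerical values below are assumptions made for the illustration, not part of the text.

```python
import numpy as np

# Hypothetical data with AR(1)-correlated residuals
rng = np.random.default_rng(2)
n = 80
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, -0.5])
rho_true = 0.5
e = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    e[t] = rho_true * e[t - 1] + eps[t]
y = X @ beta_true + e

# Step 1: estimate the regression and residuals with OLS
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b_ols

# Step 2: estimate the residual covariance matrix (here, via the AR(1)
# coefficient of the OLS residuals)
rho_hat = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])
idx = np.arange(n)
V_hat = rho_hat ** np.abs(idx[:, None] - idx[None, :])

# Step 3: re-estimate the regression with GLS using the estimated V;
# in the full iterative procedure, steps 2 and 3 are repeated
V_inv = np.linalg.inv(V_hat)
b_fgls = np.linalg.solve(X.T @ V_inv @ X, X.T @ V_inv @ y)
```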
The Maximum Likelihood Estimation Method
The maximum likelihood (ML) estimation method involves maximizing the
likelihood of the sample given an assumption of the underlying distribu-
tion (for example, that it is a normal distribution or a uniform distribu-
tion). The likelihood is the joint probability density of the observed
sample, viewed as a function of the model parameters. In order to
apply ML estimation methods we must know the functional form of the