Palgrave Handbook of Econometrics: Applied Econometrics


Computational Considerations in Microeconometrics


methods are used in the $L_1$ norm case, quantile regression (QR) being a leading
example.
Gradient methods use an iterative updating rule:

$\hat{\theta}_{s+1} = \hat{\theta}_s + A_s g_s, \qquad s = 1, \ldots, S,$  (15.1)


where $A_s = A(\hat{\theta}_s)$ is a $q \times q$ matrix, and $g_s = \partial Q_n(\theta)/\partial\theta\,\big|_{\hat{\theta}_s}$ is the $q \times 1$ gradient vector
evaluated at $\hat{\theta}_s$. The iterative process continues until it converges. Convergence is
usually defined in terms of an arbitrarily small value of either $|Q_n(\hat{\theta}_{s+1}) - Q_n(\hat{\theta}_s)|$
or $|\hat{\theta}_{s+1} - \hat{\theta}_s|$. Gradient methods differ in their choice of the matrix $A_s$, as summarized
in Table 15.1. Some methods, most notably Newton–Raphson and the method
of scoring, use second derivatives of the objective function, whereas others, most
notably BHHH (Berndt–Hall–Hall–Hausman), are based on the gradient function
only. The derivatives may be computed either from analytical expressions,
which are then programmed into the algorithm, or as numerical derivatives, which are
often the default option. The numerical derivatives are computed using:


$\dfrac{\partial Q_n(\hat{\theta}_s)}{\partial\theta_j} = \dfrac{Q_n(\hat{\theta}_s + \tau e_j) - Q_n(\hat{\theta}_s - \tau e_j)}{2\tau}, \qquad j = 1, \ldots, q,$  (15.2)
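The central-difference formula (15.2) is straightforward to implement. The sketch below is a minimal illustration, assuming a small perturbation $\tau$ and unit vectors $e_j$; the quadratic test objective is invented for the example and is not from the text.

```python
import numpy as np

def num_gradient(Q, theta, tau=1e-6):
    # Central-difference gradient of Q at theta, as in equation (15.2):
    # coordinate j is perturbed by +tau and -tau along the unit vector e_j.
    theta = np.asarray(theta, dtype=float)
    g = np.empty(theta.size)
    for j in range(theta.size):
        e_j = np.zeros(theta.size)
        e_j[j] = 1.0
        g[j] = (Q(theta + tau * e_j) - Q(theta - tau * e_j)) / (2.0 * tau)
    return g

# Illustrative objective (hypothetical): Q(theta) = -(theta_1 - 1)^2 - (theta_2 + 2)^2,
# whose exact gradient at (0, 0) is (2, -4).
Q = lambda t: -(t[0] - 1.0) ** 2 - (t[1] + 2.0) ** 2
g_hat = num_gradient(Q, [0.0, 0.0])
```

The central difference costs two objective evaluations per parameter but has approximation error of order $\tau^2$, versus order $\tau$ for a one-sided difference.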

Table 15.1 Some standard gradient-based iteration methods

Newton–Raphson:
$A(\hat{\theta}_s) = -\left[\left.\dfrac{\partial^2 Q_n(\theta)}{\partial\theta\,\partial\theta'}\right|_{\hat{\theta}_s}\right]^{-1}$

Method of scoring:
$A(\hat{\theta}_s) = -\left[\left.E\!\left(\dfrac{\partial^2 Q_n(\theta)}{\partial\theta\,\partial\theta'}\right)\right|_{\hat{\theta}_s}\right]^{-1}$

Berndt–Hall–Hall–Hausman (BHHH):
$A(\hat{\theta}_s) = \left[\left.\sum_{i=1}^{n}\dfrac{\partial q_i(\theta)}{\partial\theta}\dfrac{\partial q_i(\theta)}{\partial\theta'}\right|_{\hat{\theta}_s}\right]^{-1}$

Steepest descent:
$A(\hat{\theta}_s) = I_q$

Davidon–Fletcher–Powell (DFP):
$A_s = A_{s-1} + \dfrac{\delta_{s-1}\delta'_{s-1}}{\delta'_{s-1}\gamma_{s-1}} - \dfrac{A_{s-1}\gamma_{s-1}\gamma'_{s-1}A_{s-1}}{\gamma'_{s-1}A_{s-1}\gamma_{s-1}}$, where $\delta_{s-1} = A_{s-1}g_{s-1}$ and $\gamma_{s-1} = g_s - g_{s-1}$

Broyden–Fletcher–Goldfarb–Shanno (BFGS):
$A_s = A_{s-1} + \dfrac{\delta_{s-1}\delta'_{s-1}}{\delta'_{s-1}\gamma_{s-1}} - \dfrac{A_{s-1}\gamma_{s-1}\gamma'_{s-1}A_{s-1}}{\gamma'_{s-1}A_{s-1}\gamma_{s-1}} + (\gamma'_{s-1}A_{s-1}\gamma_{s-1})\,\eta_{s-1}\eta'_{s-1}$, where
$\eta_{s-1} = \dfrac{\delta_{s-1}}{\delta'_{s-1}\gamma_{s-1}} - \dfrac{A_{s-1}\gamma_{s-1}}{\gamma'_{s-1}A_{s-1}\gamma_{s-1}}$
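To show how the update rule, the choice of $A_s$, and the convergence criterion fit together, the following minimal sketch implements iteration (15.1) with the Newton–Raphson choice $A_s = -[\partial^2 Q_n/\partial\theta\,\partial\theta']^{-1}$ and the $|Q_n(\hat{\theta}_{s+1}) - Q_n(\hat{\theta}_s)|$ stopping rule. The Poisson-type objective is a hypothetical example chosen because its maximizer is known analytically; it is not taken from the text.

```python
import numpy as np

def newton_raphson(Q, grad, hess, theta0, tol=1e-8, max_iter=100):
    # Iterates theta_{s+1} = theta_s + A_s g_s (equation (15.1)) with the
    # Newton-Raphson choice A_s = -[Hessian]^{-1} from Table 15.1, stopping
    # when |Q(theta_{s+1}) - Q(theta_s)| falls below tol.
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        g = grad(theta)
        H = hess(theta)
        theta_new = theta - np.linalg.solve(H, g)  # A_s g_s = -H^{-1} g_s
        if abs(Q(theta_new) - Q(theta)) < tol:
            return theta_new
        theta = theta_new
    return theta

# Hypothetical example: Poisson log-likelihood with constant mean exp(theta),
# for which the analytical maximizer is theta = log(mean(y)).
y = np.array([2.0, 4.0, 3.0, 5.0, 1.0])
Q = lambda t: float(np.sum(y * t[0] - np.exp(t[0])))
grad = lambda t: np.array([y.sum() - y.size * np.exp(t[0])])
hess = lambda t: np.array([[-y.size * np.exp(t[0])]])
theta_hat = newton_raphson(Q, grad, hess, [0.0])
```

Because this objective is globally concave, the Hessian is negative definite everywhere and the Newton step is always an ascent direction; for less well-behaved objectives, quasi-Newton updates such as DFP or BFGS avoid computing the Hessian at each step.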