Let us remark that it is not necessary to consider $m$ consecutive moment equations as indicated by Equations (9.58); any convenient set of $m$ equations that leads to a solution for $\hat{\theta}_j$, $j = 1, \ldots, m$, is sufficient. Lower-order moment equations are preferred, however, since they require less manipulation of the observed data.
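As a concrete illustration of solving a lowest-order moment equation (the exponential distribution here is my choice of example, not one taken from the text): for an exponential distribution with rate $\lambda$, the first moment equation $\alpha_1(\lambda) = 1/\lambda = M_1$ yields the estimator $\hat{\lambda} = 1/M_1$.

```python
import random

# Method-of-moments sketch for an exponential distribution with rate lam
# (an illustrative choice): the first moment equation alpha_1(lam) = 1/lam = M_1
# is solved for lam, giving the moment estimator lam_hat = 1/M_1.
random.seed(0)
lam = 2.0
sample = [random.expovariate(lam) for _ in range(10_000)]

M1 = sum(sample) / len(sample)   # first sample moment
lam_hat = 1.0 / M1               # solve the moment equation for lam

print(f"true rate = {lam}, moment estimate = {lam_hat:.3f}")
```

Only the first sample moment is needed here, which reflects the remark above: a single low-order moment equation already determines the one-parameter estimator.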
An attractive feature of the method of moments is that the moment equations are straightforward to establish, and there is seldom any difficulty in solving them. However, a shortcoming is that such desirable properties as unbiasedness or efficiency are not generally guaranteed for estimators so obtained.
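The lack of guaranteed unbiasedness is easy to observe numerically. A standard instance (my illustration, not an example from the text) is the moment estimator of a normal variance, $\hat{\sigma}^2 = M_2 - M_1^2$, whose expectation is $(n-1)\sigma^2/n$ rather than $\sigma^2$:

```python
import random

# Monte Carlo check (illustrative) that the moment estimator of a normal
# variance, sigma2_hat = M_2 - M_1^2, is biased: E{sigma2_hat} = (n-1)/n * sigma^2.
random.seed(0)
n, sigma2, trials = 5, 4.0, 20_000

est = []
for _ in range(trials):
    x = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    m1 = sum(x) / n                  # first sample moment
    m2 = sum(v * v for v in x) / n   # second sample moment
    est.append(m2 - m1 * m1)         # moment estimator of the variance

mean_est = sum(est) / trials
print(f"E[sigma2_hat] ~ {mean_est:.3f}  (exact (n-1)/n * sigma^2 = {(n - 1) / n * sigma2})")
```

For $n = 5$ the average estimate settles near $3.2$ rather than the true value $4.0$, showing a bias of order $1/n$ that vanishes as the sample size grows.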
However, consistency of moment estimators can be established under general conditions. In order to show this, let us consider a single parameter $\theta$ whose moment estimator $\hat{\Theta}$ satisfies the moment equation

$$\alpha_i(\hat{\Theta}) = M_i, \qquad (9.59)$$

for some $i$. The solution of Equation (9.59) for $\hat{\Theta}$ can be represented by $\hat{\Theta} = \hat{h}(M_i)$, for which the Taylor's expansion about $\alpha_i$ gives

$$\hat{\Theta} = \hat{h}(\alpha_i) + \hat{h}^{(1)}(\alpha_i)(M_i - \alpha_i) + \frac{\hat{h}^{(2)}(\alpha_i)}{2!}(M_i - \alpha_i)^2 + \cdots, \qquad (9.60)$$

where superscript $(k)$ denotes the $k$th derivative with respect to $M_i$. Upon performing successive differentiations of Equation (9.59) with respect to $M_i$, Equation (9.60) becomes

$$\hat{\Theta} = \theta + (M_i - \alpha_i)\left(\frac{d\alpha_i}{d\theta}\right)^{-1} - \frac{1}{2}(M_i - \alpha_i)^2\,\frac{d^2\alpha_i}{d\theta^2}\left(\frac{d\alpha_i}{d\theta}\right)^{-3} + \cdots. \qquad (9.61)$$

The bias and variance of $\hat{\Theta}$ can be found by taking the expectation of Equation (9.61) and the expectation of the square of Equation (9.61), respectively. Up to the order of $1/n$, we find

$$\left.\begin{aligned}
E\{\hat{\Theta}\} &= \theta - \frac{1}{2n}\left(\alpha_{2i} - \alpha_i^2\right)\frac{d^2\alpha_i}{d\theta^2}\left(\frac{d\alpha_i}{d\theta}\right)^{-3},\\
\mathrm{var}\{\hat{\Theta}\} &= \frac{1}{n}\left(\alpha_{2i} - \alpha_i^2\right)\left(\frac{d\alpha_i}{d\theta}\right)^{-2}.
\end{aligned}\right\} \qquad (9.62)$$

Assuming that all the indicated moments and their derivatives exist, Equations (9.62) show that

$$\lim_{n \to \infty} E\{\hat{\Theta}\} = \theta,$$
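The order-$1/n$ expressions in Equations (9.62) can be verified numerically. Taking the exponential case again (my worked instance, not from the text), with $\alpha_1 = 1/\lambda$ one finds $d\alpha_1/d\lambda = -1/\lambda^2$, $d^2\alpha_1/d\lambda^2 = 2/\lambda^3$, and $\alpha_2 - \alpha_1^2 = 1/\lambda^2$, so Equations (9.62) predict $E\{\hat{\Theta}\} \approx \lambda + \lambda/n$ and $\mathrm{var}\{\hat{\Theta}\} \approx \lambda^2/n$:

```python
import random

# Monte Carlo check of Equations (9.62) for an exponential distribution with
# rate lam (an illustrative case): with alpha_1 = 1/lam, the equations predict
#   E{Theta_hat}   ~ lam + lam/n   (bias of order 1/n)
#   var{Theta_hat} ~ lam^2 / n
random.seed(1)
lam, n, trials = 2.0, 50, 40_000

est = []
for _ in range(trials):
    x = [random.expovariate(lam) for _ in range(n)]
    est.append(n / sum(x))   # Theta_hat = 1 / M_1

mean_est = sum(est) / trials
var_est = sum((e - mean_est) ** 2 for e in est) / trials

print(f"E{{Theta_hat}}   ~ {mean_est:.4f}  (predicted {lam + lam / n:.4f})")
print(f"var{{Theta_hat}} ~ {var_est:.4f}  (predicted {lam * lam / n:.4f})")
```

Both sample statistics land close to the predictions, and the residual discrepancy in the variance reflects the terms of order $1/n^2$ that Equations (9.62) neglect.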