However, since the sample values $x_i$ are assumed to be independent, we have
\[
E[x_i^r x_j^r] = E[x_i^r]\,E[x_j^r] = \mu_r^2. \qquad (31.52)
\]
The number of terms in the sum on the RHS of (31.51) is $N(N-1)$, and so we find
\[
V[m_r] = \frac{1}{N}\,\mu_{2r} - \mu_r^2 + \frac{N-1}{N}\,\mu_r^2 = \frac{\mu_{2r} - \mu_r^2}{N}. \qquad (31.53)
\]
Since $E[m_r]=\mu_r$ and $V[m_r]\to 0$ as $N\to\infty$, the $r$th sample moment $m_r$ is also
a consistent estimator of $\mu_r$.
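As a simple check (this special case is not written out above), setting $r=1$ in (31.53) recovers the familiar result for the variance of the sample mean, since $m_1=\bar{x}$ and $\mu_2-\mu_1^2=\sigma^2$:
\[
V[\bar{x}] = V[m_1] = \frac{\mu_2 - \mu_1^2}{N} = \frac{\sigma^2}{N}.
\]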
Find the covariance of the sample moments $m_r$ and $m_s$ for a sample of size $N$.
We obtain the covariance of the sample moments $m_r$ and $m_s$ in a similar manner to that
used above to obtain the variance of $m_r$. From the definition of covariance, we have
\[
\begin{aligned}
\mathrm{Cov}[m_r,m_s] &= E[(m_r-\mu_r)(m_s-\mu_s)] \\
&= \frac{1}{N^2}\,E\left[\left(\sum_i x_i^r - N\mu_r\right)\left(\sum_j x_j^s - N\mu_s\right)\right] \\
&= \frac{1}{N^2}\,E\left[\sum_i x_i^{r+s} + \sum_i\sum_{j\neq i} x_i^r x_j^s
   - N\mu_r\sum_j x_j^s - N\mu_s\sum_i x_i^r + N^2\mu_r\mu_s\right].
\end{aligned}
\]
Assuming the $x_i$ to be independent, we may again use result (31.52) to obtain
\[
\begin{aligned}
\mathrm{Cov}[m_r,m_s] &= \frac{1}{N^2}\left[N\mu_{r+s} + N(N-1)\mu_r\mu_s - N^2\mu_r\mu_s - N^2\mu_s\mu_r + N^2\mu_r\mu_s\right] \\
&= \frac{1}{N}\,\mu_{r+s} + \frac{N-1}{N}\,\mu_r\mu_s - \mu_r\mu_s \\
&= \frac{\mu_{r+s} - \mu_r\mu_s}{N}.
\end{aligned}
\]
We note that by setting $r=s$, we recover the expression (31.53) for $V[m_r]$.
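This result is also easy to verify numerically. The following is a minimal Monte Carlo sketch (not part of the original text): it assumes NumPy, and the uniform parent distribution, sample size and number of trials are illustrative choices only, the uniform distribution being convenient because its moments about the origin are simply $\mu_k = 1/(k+1)$.

import numpy as np

# Monte Carlo check of Cov[m_r, m_s] = (mu_{r+s} - mu_r mu_s)/N.
# Parent distribution U(0,1) and all parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
N, r, s, trials = 20, 2, 3, 200_000

def mu(k):
    # kth moment about the origin of U(0,1)
    return 1.0 / (k + 1)

samples = rng.random((trials, N))      # 'trials' independent samples, each of size N
m_r = (samples ** r).mean(axis=1)      # rth sample moment of each sample
m_s = (samples ** s).mean(axis=1)      # sth sample moment of each sample

empirical = np.cov(m_r, m_s)[0, 1]     # Monte Carlo estimate of Cov[m_r, m_s]
theoretical = (mu(r + s) - mu(r) * mu(s)) / N

print(f"Monte Carlo estimate     : {empirical:.6f}")
print(f"(mu_(r+s) - mu_r mu_s)/N : {theoretical:.6f}")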
31.4.5 Population central moments $\nu_r$
We may generalise the discussion of estimators for the second central moment $\nu_2$
(or equivalently $\sigma^2$) given in subsection 31.4.2 to the estimation of the $r$th central
moment $\nu_r$. In particular, we saw in that subsection that our choice of estimator
for $\nu_2$ depended on whether the population mean $\mu_1$ is known; the same is true
for the estimation of $\nu_r$.
Let us first consider the case in which $\mu_1$ is known. From (30.54), we may write
$\nu_r$ as
\[
\nu_r = \mu_r - {}^r C_1\,\mu_{r-1}\mu_1 + \cdots + (-1)^k\,{}^r C_k\,\mu_{r-k}\mu_1^k + \cdots + (-1)^{r-1}\left({}^r C_{r-1} - 1\right)\mu_1^r.
\]
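As a check (these special cases are not written out above), the cases $r=2$ and $r=3$ reproduce the familiar expressions for the low-order central moments:
\[
\nu_2 = \mu_2 - \mu_1^2, \qquad
\nu_3 = \mu_3 - 3\mu_2\mu_1 + 2\mu_1^3.
\]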
If $\mu_1$ is known, a suitable estimator is obviously
\[
\hat{\nu}_r = m_r - {}^r C_1\,m_{r-1}\mu_1 + \cdots + (-1)^k\,{}^r C_k\,m_{r-k}\mu_1^k + \cdots + (-1)^{r-1}\left({}^r C_{r-1} - 1\right)\mu_1^r,
\]
where $m_r$ is the $r$th sample moment. Since $\mu_1$ and the binomial coefficients are