31.3 ESTIMATORS AND SAMPLING DISTRIBUTIONS
several estimators, however, it is usual to quote their full covariance matrix. This
M × M matrix has elements
\[
V_{ij} = \mathrm{Cov}[\hat{a}_i, \hat{a}_j]
= \int (\hat{a}_i - E[\hat{a}_i])(\hat{a}_j - E[\hat{a}_j])\, P(\hat{\mathbf{a}}|\mathbf{a})\, d^M\hat{\mathbf{a}}
= \int (\hat{a}_i - E[\hat{a}_i])(\hat{a}_j - E[\hat{a}_j])\, P(\mathbf{x}|\mathbf{a})\, d^N\mathbf{x}.
\]
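The defining integral can be checked by simulation: draw many samples from some population, evaluate the estimators on each sample, and form the sample covariance matrix of the resulting estimates. A minimal Python sketch for the pair $(\bar{x}, s)$ from a Gaussian population, where the population parameters, sample size and number of trials are illustrative choices (and $s^2$ is defined with the $1/N$ convention used in this chapter):

```python
import numpy as np

# Monte Carlo estimate of V_ij = Cov[a_i, a_j] for the estimators (xbar, s)
# of a Gaussian population.  mu, sigma, N and the trial count are
# illustrative choices, not values taken from the text.
rng = np.random.default_rng(0)
mu, sigma, N, trials = 0.0, 1.0, 10, 200_000

x = rng.normal(mu, sigma, size=(trials, N))
xbar = x.mean(axis=1)
# s^2 = (1/N) * sum (x_i - xbar)^2, as in this chapter
s = np.sqrt(((x - xbar[:, None]) ** 2).mean(axis=1))

V = np.cov(np.vstack([xbar, s]))  # 2x2 sample covariance matrix
print(V)
# V[0, 0] should be close to sigma^2 / N = 0.1, and the off-diagonal
# elements close to 0, anticipating the worked example below in which
# xbar and s turn out to be independent.
```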
Fisher’s inequality can be generalised to the multi-dimensional case. Adapting
the proof given in subsection 31.3.2, one may show that, in the case where the
estimators are efficient and have zero bias, the elements of the inverse of the
covariance matrix are given by
\[
(V^{-1})_{ij} = E\left[ -\frac{\partial^2 \ln P}{\partial a_i\, \partial a_j} \right],
\tag{31.36}
\]
where $P$ denotes the population $P(\mathbf{x}|\mathbf{a})$ from which the sample is drawn. The
quantity on the RHS of (31.36) is the element $F_{ij}$ of the so-called Fisher matrix
$F$ of the estimators.
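The expectation in (31.36) can be evaluated numerically. For a Gaussian population with $\mathbf{a} = (\mu, \sigma)$, averaging the analytic second derivatives of $\ln P$ over simulated samples should reproduce the standard result $F = \mathrm{diag}(N/\sigma^2,\; 2N/\sigma^2)$. A minimal sketch, with illustrative parameter values:

```python
import numpy as np

# Numerical check of (31.36) for a Gaussian population, a = (mu, sigma).
# Here ln P = -N ln(sigma) - sum(x_i - mu)^2 / (2 sigma^2) + const, so the
# second derivatives below are exact; only the expectation is Monte Carlo.
rng = np.random.default_rng(1)
mu, sigma, N, trials = 0.0, 1.0, 10, 100_000  # illustrative values

x = rng.normal(mu, sigma, size=(trials, N))
r = x - mu
S1 = r.sum(axis=1)        # sum (x_i - mu)   per trial
S2 = (r ** 2).sum(axis=1) # sum (x_i - mu)^2 per trial

d2_mumu = -N / sigma**2 * np.ones(trials)
d2_musig = -2 * S1 / sigma**3
d2_sigsig = N / sigma**2 - 3 * S2 / sigma**4

F = -np.array([[d2_mumu.mean(), d2_musig.mean()],
               [d2_musig.mean(), d2_sigsig.mean()]])
print(F)  # approximately [[N/sigma^2, 0], [0, 2N/sigma^2]] = [[10, 0], [0, 20]]
```

The vanishing off-diagonal element reflects the factorisation of the joint sampling distribution exploited in the worked example below.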
Calculate the covariance matrix of the estimators $\bar{x}$ and $s$ in the previous example.

As shown in (31.35), the joint sampling distribution $P(\bar{x}, s|\mu, \sigma)$ factorises, and so the
estimators $\bar{x}$ and $s$ are independent. Thus, we conclude immediately that
\[
\mathrm{Cov}[\bar{x}, s] = 0.
\]
Since we have already shown in the worked example at the end of subsection 31.3.1 that
$V[\bar{x}] = \sigma^2/N$, it only remains to calculate $V[s]$. From (31.35), we find
\[
E[s^r] = C_2 \int_0^{\infty} s^{N-2+r} \exp\left( -\frac{N s^2}{2\sigma^2} \right) ds
= \left( \frac{2}{N} \right)^{r/2}
\frac{\Gamma\left( \tfrac{1}{2}(N-1+r) \right)}{\Gamma\left( \tfrac{1}{2}(N-1) \right)}\, \sigma^r,
\]
where we have evaluated the integral using the definition of the gamma function given in
the Appendix. Thus, the expectation value of the sample standard deviation is
\[
E[s] = \left( \frac{2}{N} \right)^{1/2}
\frac{\Gamma\left( \tfrac{1}{2}N \right)}{\Gamma\left( \tfrac{1}{2}(N-1) \right)}\, \sigma,
\tag{31.37}
\]
and its variance is given by
\[
V[s] = E[s^2] - (E[s])^2
= \frac{\sigma^2}{N} \left\{ N - 1 - 2\left[ \frac{\Gamma\left( \tfrac{1}{2}N \right)}{\Gamma\left( \tfrac{1}{2}(N-1) \right)} \right]^2 \right\}.
\]
We note, in passing, that (31.37) shows that $s$ is a biased estimator of $\sigma$.
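The bias is easy to quantify: the factor $E[s]/\sigma = (2/N)^{1/2}\,\Gamma(\tfrac{1}{2}N)/\Gamma(\tfrac{1}{2}(N-1))$ from (31.37) lies below 1 for every finite $N$ and tends to 1 as $N \to \infty$. A short numerical check (the values of $N$ tabulated are illustrative):

```python
import math

def bias_factor(N):
    """E[s]/sigma from (31.37): sqrt(2/N) * Gamma(N/2) / Gamma((N-1)/2)."""
    return math.sqrt(2.0 / N) * math.gamma(N / 2) / math.gamma((N - 1) / 2)

# The factor is < 1 for every finite N (s underestimates sigma on average)
# and increases towards 1 as the sample size grows.
for N in (2, 5, 20, 100):
    print(N, bias_factor(N))
```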
The idea of a confidence interval can also be extended to the case where several
quantities are estimated simultaneously, but then the practical construction of an
interval is considerably more complicated. The general approach is to construct
an $M$-dimensional confidence region $R$ in $\mathbf{a}$-space. By analogy with the
one-dimensional case, for a given confidence level of (say) $1 - \alpha$, one first constructs