and, on differentiating twice with respect to $\mu$, we find
\[
\frac{\partial^2 \ln P}{\partial \mu^2} = -\frac{N}{\sigma^2}.
\]
This is independent of the $x_i$ and so its expectation value is also equal to $-N/\sigma^2$. With $b$ set equal to zero in (31.17), Fisher's inequality thus states that, for any unbiased estimator $\hat{\mu}$ of the population mean,
\[
V[\hat{\mu}] \ge \frac{\sigma^2}{N}.
\]
Since $V[\bar{x}] = \sigma^2/N$, the sample mean $\bar{x}$ is a minimum-variance estimator of $\mu$.
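This conclusion can also be illustrated numerically. The following Monte Carlo sketch is not part of the text; the population parameters, sample size and number of trials are arbitrary illustrative choices. It draws repeated samples of size $N$ from a Gaussian population and compares the empirical variance of $\bar{x}$ with the Fisher bound $\sigma^2/N$.

```python
import numpy as np

# Illustrative parameters (arbitrary choices): population N(mu, sigma^2), sample size N
mu, sigma, N, n_trials = 2.0, 3.0, 25, 200_000
rng = np.random.default_rng(0)

# Draw n_trials independent samples of size N and form the sample mean of each
samples = rng.normal(mu, sigma, size=(n_trials, N))
xbar = samples.mean(axis=1)

print("empirical V[xbar]   :", xbar.var())      # should be close to sigma^2 / N
print("Fisher bound s^2/N  :", sigma**2 / N)    # minimum variance for an unbiased estimator
```

With these settings the two printed numbers should agree to within Monte Carlo error, consistent with $\bar{x}$ attaining the bound.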
31.3.2 Fisher’s inequality
As mentioned above, Fisher's inequality provides a lower limit on the variance of any estimator $\hat{a}$ of the quantity $a$; it reads
\[
V[\hat{a}] \;\ge\; \left(1 + \frac{\partial b}{\partial a}\right)^{2} \bigg/ \; E\!\left[-\frac{\partial^2 \ln P}{\partial a^2}\right], \qquad (31.18)
\]
where $P$ stands for the population $P(x|a)$ and $b$ is the bias of the estimator.
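When the estimator is unbiased, $b = 0$ and hence $\partial b/\partial a = 0$, and (31.18) reduces to the form used in the example above, the expectation value in the denominator being the quantity commonly called the Fisher information:
\[
V[\hat{a}] \;\ge\; 1 \bigg/ E\!\left[-\frac{\partial^2 \ln P}{\partial a^2}\right] \qquad (b = 0).
\]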
We now present a proof of this inequality. Since the derivation is somewhat
complicated, and many of the details are unimportant, this section can be omitted
on a first reading. Nevertheless, some aspects of the proof will be useful when
the efficiency of maximum-likelihood estimators is discussed in section 31.5.
Prove Fisher’s inequality (31.18).
The normalisation of $P(x|a)$ is given by
\[
\int P(x|a)\, d^N x = 1, \qquad (31.19)
\]
where $d^N x = dx_1\, dx_2 \cdots dx_N$ and the integral extends over all the allowed values of the sample items $x_i$. Differentiating (31.19) with respect to the parameter $a$, we obtain
\[
\int \frac{\partial P}{\partial a}\, d^N x = \int \frac{\partial \ln P}{\partial a}\, P\, d^N x = 0. \qquad (31.20)
\]
We note that the second integral is simply the expectation value of $\partial \ln P/\partial a$, where the average is taken over all possible samples $x_i$, $i = 1, 2, \ldots, N$. Further, by equating the two expressions for $\partial E[\hat{a}]/\partial a$ obtained by differentiating (31.15) and (31.14) with respect to $a$, we obtain, dropping the functional dependencies, a second relationship,
\[
1 + \frac{\partial b}{\partial a} = \int \hat{a}\, \frac{\partial P}{\partial a}\, d^N x = \int \hat{a}\, \frac{\partial \ln P}{\partial a}\, P\, d^N x. \qquad (31.21)
\]
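Both (31.20) and (31.21) are statements about averages over all possible samples, and they can be checked numerically for the Gaussian example considered above. The sketch below is illustrative only and not part of the proof; the parameter values are arbitrary, and it takes $\hat{a} = \bar{x}$, which is unbiased, so the right-hand side of (31.21) should equal $1 + \partial b/\partial a = 1$.

```python
import numpy as np

# Numerical check of (31.20) and (31.21) for the Gaussian case with a = mu
# (illustrative sketch; estimator a-hat = x-bar is unbiased, so db/da = 0)
mu, sigma, N, n_trials = 2.0, 3.0, 25, 500_000
rng = np.random.default_rng(1)

x = rng.normal(mu, sigma, size=(n_trials, N))
score = ((x - mu) / sigma**2).sum(axis=1)   # d(ln P)/d(mu) evaluated for each sample
ahat = x.mean(axis=1)                       # the estimator a-hat = sample mean

print("E[d lnP/da]         :", score.mean())           # ~ 0, as in (31.20)
print("E[ahat * d lnP/da]  :", (ahat * score).mean())  # ~ 1 = 1 + db/da, as in (31.21)
```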
Now, multiplying (31.20) by $\alpha(a)$, where $\alpha(a)$ is any function of $a$, and subtracting the result from (31.21), we obtain
\[
\int \left[\hat{a} - \alpha(a)\right] \frac{\partial \ln P}{\partial a}\, P\, d^N x = 1 + \frac{\partial b}{\partial a}.
\]
At this point we must invoke the Schwarz inequality proved in subsection 8.1.3. The proof