6.2. Rao–Cramér Lower Bound and Efficiency
(R4) The integral $\int f(x;\theta)\,dx$ can be differentiated twice under the integral sign as a function of $\theta$.
Note that conditions (R1)–(R4) mean that the parameter $\theta$ does not appear in the endpoints of the interval in which $f(x;\theta) > 0$ and that we can interchange integration and differentiation with respect to $\theta$. Our derivation is for the continuous case, but the discrete case can be handled in a similar manner. We begin with the identity
\[
1 = \int_{-\infty}^{\infty} f(x;\theta)\,dx.
\]
Taking the derivative with respect to $\theta$ results in
\[
0 = \int_{-\infty}^{\infty} \frac{\partial f(x;\theta)}{\partial \theta}\,dx.
\]
The latter expression can be rewritten as
\[
0 = \int_{-\infty}^{\infty} \frac{\partial f(x;\theta)/\partial \theta}{f(x;\theta)}\,f(x;\theta)\,dx,
\]
or, equivalently,
\[
0 = \int_{-\infty}^{\infty} \frac{\partial \log f(x;\theta)}{\partial \theta}\,f(x;\theta)\,dx. \tag{6.2.1}
\]
Writing this last equation as an expectation, we have established
\[
E\left[\frac{\partial \log f(X;\theta)}{\partial \theta}\right] = 0; \tag{6.2.2}
\]
that is, the mean of the random variable $\partial \log f(X;\theta)/\partial \theta$ is 0.
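For example, suppose $f(x;\theta)$ is the $N(\theta,1)$ density. Then
\[
\frac{\partial \log f(x;\theta)}{\partial \theta}
= \frac{\partial}{\partial \theta}\left[-\tfrac{1}{2}\log(2\pi) - \tfrac{1}{2}(x-\theta)^2\right]
= x - \theta,
\]
and $E[X - \theta] = \theta - \theta = 0$, in agreement with (6.2.2).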
If we differentiate (6.2.1) again, it follows that
\[
0 = \int_{-\infty}^{\infty} \frac{\partial^2 \log f(x;\theta)}{\partial \theta^2}\,f(x;\theta)\,dx
+ \int_{-\infty}^{\infty} \frac{\partial \log f(x;\theta)}{\partial \theta}\,\frac{\partial \log f(x;\theta)}{\partial \theta}\,f(x;\theta)\,dx. \tag{6.2.3}
\]
The second term on the right side of this equation can be written as an expectation, which we call Fisher information and denote by $I(\theta)$; that is,
\[
I(\theta) = \int_{-\infty}^{\infty} \frac{\partial \log f(x;\theta)}{\partial \theta}\,\frac{\partial \log f(x;\theta)}{\partial \theta}\,f(x;\theta)\,dx
= E\left[\left(\frac{\partial \log f(X;\theta)}{\partial \theta}\right)^2\right]. \tag{6.2.4}
\]
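Continuing the $N(\theta,1)$ example, the random variable in (6.2.4) is $X - \theta$, so
\[
I(\theta) = E[(X-\theta)^2] = \operatorname{Var}(X) = 1;
\]
the information in a single $N(\theta,1)$ observation does not depend on $\theta$.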
From equation (6.2.3), we see that $I(\theta)$ can be computed from
\[
I(\theta) = -\int_{-\infty}^{\infty} \frac{\partial^2 \log f(x;\theta)}{\partial \theta^2}\,f(x;\theta)\,dx
= -E\left[\frac{\partial^2 \log f(X;\theta)}{\partial \theta^2}\right]. \tag{6.2.5}
\]
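As an illustration of (6.2.5) in the discrete case (with the integral replaced by a sum), let $f(x;\theta) = \theta^x e^{-\theta}/x!$ be the Poisson pmf, so that $\log f(x;\theta) = x\log\theta - \theta - \log x!$. Then
\[
\frac{\partial^2 \log f(x;\theta)}{\partial \theta^2} = -\frac{x}{\theta^2}
\quad\text{and}\quad
I(\theta) = -E\left[-\frac{X}{\theta^2}\right] = \frac{E(X)}{\theta^2} = \frac{\theta}{\theta^2} = \frac{1}{\theta},
\]
since $E(X) = \theta$ for the Poisson distribution.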
By equation (6.2.2), the Fisher information is the variance of the random variable $\partial \log f(X;\theta)/\partial \theta$; i.e.,
\[
I(\theta) = \operatorname{Var}\left(\frac{\partial \log f(X;\theta)}{\partial \theta}\right). \tag{6.2.6}
\]
Usually, expression (6.2.5) is easier to compute than expression (6.2.4).
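As an independent check on the equality of (6.2.4) and (6.2.5), the following sketch uses Python's sympy library to compute both expressions symbolically for the Bernoulli pmf $f(x;\theta) = \theta^x(1-\theta)^{1-x}$, $x \in \{0,1\}$; the choice of distribution and the helper expect are for illustration only. Both expressions reduce to $1/(\theta(1-\theta))$.

import sympy as sp

theta, x = sp.symbols('theta x')

# Bernoulli pmf: f(x; theta) = theta^x (1 - theta)^(1 - x), x in {0, 1}
f = theta**x * (1 - theta)**(1 - x)

# Expectation of g(X) over the support {0, 1} (illustrative helper)
def expect(g):
    return sp.simplify(sum(g.subs(x, k) * f.subs(x, k) for k in (0, 1)))

score = sp.diff(sp.log(f), theta)                   # partial of log f with respect to theta
info_6_2_4 = expect(score**2)                       # expression (6.2.4)
info_6_2_5 = -expect(sp.diff(sp.log(f), theta, 2))  # expression (6.2.5)

print(info_6_2_4, info_6_2_5)  # both equal 1/(theta*(1 - theta))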