Mathematical Methods for Physics and Engineering: A Comprehensive Guide


STATISTICS


and describes the spread of values â about E[â] that would result from a large number of samples, each of size N. An estimator with a smaller variance is said to be more efficient than one with a larger variance. As we show in the next section, for any given quantity a of the population there exists a theoretical lower limit on the variance of any estimator â. This result is known as Fisher's inequality (or the Cramér–Rao inequality) and reads


\[
V[\hat a] \;\ge\; \left(1 + \frac{\partial b}{\partial a}\right)^{\!2} \Bigg/ E\!\left[-\frac{\partial^{2}\ln P}{\partial a^{2}}\right], \tag{31.17}
\]

where P stands for the population P(x|a) and b is the bias of the estimator.


Denoting the quantity on the RHS of (31.17) by V_min, the efficiency e of an estimator is defined as
\[
e = V_{\min}/V[\hat a].
\]

An estimator for which e = 1 is called a minimum-variance or efficient estimator. Otherwise, if e < 1, â is called an inefficient estimator.
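As a concrete, if informal, illustration of efficiency, the following Monte Carlo sketch in Python (assuming NumPy is available; the parameter values are arbitrary) estimates the mean μ of a Gaussian population with both the sample mean and the sample median. Both are unbiased, but the median has the larger variance; for Gaussian data its asymptotic efficiency is the known value 2/π ≈ 0.64.

import numpy as np

rng = np.random.default_rng(42)
mu, sigma, N, trials = 5.0, 2.0, 100, 50_000

# Draw many independent samples of size N and apply both estimators.
samples = rng.normal(mu, sigma, size=(trials, N))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# For unbiased estimators of mu, V[mean] attains the bound V_min = sigma^2/N,
# so the efficiency of the median is V[mean]/V[median].
print("V[mean]   =", means.var(), " (theory sigma^2/N =", sigma**2 / N, ")")
print("V[median] =", medians.var())
print("efficiency of median:", means.var() / medians.var())  # ~ 2/pi ~ 0.64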


It should be noted that, in general, there is no unique 'optimal' estimator â for a particular property a. To some extent, there is always a trade-off between bias and efficiency. One must often weigh the relative merits of an unbiased, inefficient estimator against another that is more efficient but slightly biased. Nevertheless, a common choice is the best unbiased estimator (BUE), which is simply the unbiased estimator â having the smallest variance V[â].
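This trade-off can be seen numerically in a standard example (a sketch, not taken from the text): for Gaussian samples, the variance estimator that divides by N is biased downwards, yet it has a smaller variance, and even a smaller mean squared error, than the unbiased estimator that divides by N − 1.

import numpy as np

rng = np.random.default_rng(0)
sigma, N, trials = 2.0, 10, 100_000
samples = rng.normal(0.0, sigma, size=(trials, N))

for name, ddof in [("unbiased, divide by N-1", 1), ("biased, divide by N", 0)]:
    est = samples.var(axis=1, ddof=ddof)      # estimate sigma^2 from each sample
    bias = est.mean() - sigma**2
    mse = ((est - sigma**2) ** 2).mean()
    print(f"{name}: bias = {bias:+.3f}, variance = {est.var():.3f}, MSE = {mse:.3f}")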


Finally, we note that some qualities of estimators are related. For example, suppose that â is an unbiased estimator, so that E[â] = a, and that in addition V[â] → 0 as N → ∞. Using the Bienaymé–Chebyshev inequality discussed in subsection 30.5.3, it follows immediately that â is also a consistent estimator. Nevertheless, it does not follow that a consistent estimator is unbiased.
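To spell out the step: the Bienaymé–Chebyshev inequality bounds the probability of a large deviation from the mean by the variance, so for any ε > 0,
\[
\Pr\bigl(|\hat a - a| \ge \epsilon\bigr) \le \frac{V[\hat a]}{\epsilon^{2}} \to 0 \quad \text{as } N \to \infty,
\]
which is precisely the statement that â is a consistent estimator of a.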


The sample values x_1, x_2, ..., x_N are drawn independently from a Gaussian distribution with mean μ and variance σ². Show that the sample mean x̄ is a consistent, unbiased, minimum-variance estimator of μ.

We found earlier that the sampling distribution of x̄ is given by


\[
P(\bar x \mid \mu, \sigma) = \frac{1}{\sqrt{2\pi\sigma^{2}/N}} \exp\!\left[-\frac{(\bar x - \mu)^{2}}{2\sigma^{2}/N}\right],
\]


from which we see immediately that E[x̄] = μ and V[x̄] = σ²/N. Thus x̄ is an unbiased estimator of μ. Moreover, since it is also true that V[x̄] → 0 as N → ∞, x̄ is a consistent estimator of μ.
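Both properties are easy to check numerically (a minimal sketch assuming NumPy, with μ and σ chosen arbitrarily):

import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 3.0, 1.5

# E[xbar] should equal mu, and V[xbar] = sigma^2/N should shrink as N grows.
for N in (10, 100, 1000):
    xbar = rng.normal(mu, sigma, size=(20_000, N)).mean(axis=1)
    print(f"N = {N:4d}: E[xbar] = {xbar.mean():.4f}, "
          f"V[xbar] = {xbar.var():.5f} (sigma^2/N = {sigma**2 / N:.5f})")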
In order to determine whether x̄ is a minimum-variance estimator of μ, we must use Fisher's inequality (31.17). Since the sample values x_i are independent and drawn from a Gaussian of mean μ and standard deviation σ, we have


\[
\ln P(x \mid \mu, \sigma) = -\frac{1}{2}\sum_{i=1}^{N}\left[\ln(2\pi\sigma^{2}) + \frac{(x_{i}-\mu)^{2}}{\sigma^{2}}\right],
\]
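so that
\[
\frac{\partial \ln P}{\partial \mu} = \frac{1}{\sigma^{2}}\sum_{i=1}^{N}(x_{i}-\mu)
\qquad \Rightarrow \qquad
\frac{\partial^{2}\ln P}{\partial \mu^{2}} = -\frac{N}{\sigma^{2}}.
\]
Since x̄ is unbiased, b = 0, and Fisher's inequality (31.17) then gives
\[
V_{\min} = 1 \Big/ E\!\left[\frac{N}{\sigma^{2}}\right] = \frac{\sigma^{2}}{N} = V[\bar x],
\]
so e = 1 and x̄ is indeed a minimum-variance estimator of μ.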
