*7.7 Evaluating a Point Estimator
Definition Let $d = d(\mathbf{X})$ be an estimator of the parameter $\theta$. Then

$$b_\theta(d) = E[d(\mathbf{X})] - \theta$$

is called the bias of $d$ as an estimator of $\theta$. If $b_\theta(d) = 0$ for all $\theta$, then we say that $d$ is an unbiased estimator of $\theta$. In other words, an estimator is unbiased if its expected value always equals the value of the parameter it is attempting to estimate.
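In practice, the bias of an estimator can be approximated by simulation: average the estimator over many independent samples and subtract $\theta$. The following is a minimal sketch in Python, assuming NumPy is available; the helper name estimate_bias and the normal-sample illustration are ours, not part of the text.

```python
import numpy as np

def estimate_bias(estimator, sampler, theta, n, reps=50_000, seed=0):
    """Monte Carlo approximation of b_theta(d) = E[d(X)] - theta.

    estimator: function mapping a sample of size n to the estimate d(X).
    sampler:   function (rng, n) -> one sample of size n from the distribution.
    """
    rng = np.random.default_rng(seed)
    estimates = np.array([estimator(sampler(rng, n)) for _ in range(reps)])
    return estimates.mean() - theta

# Illustration: the sample mean of a normal sample with mean theta = 2.0.
theta = 2.0
bias = estimate_bias(np.mean, lambda rng, n: rng.normal(theta, 1.0, n), theta, n=10)
print(bias)  # close to 0: the sample mean is unbiased for theta
```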
EXAMPLE 7.7a Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution having unknown mean $\theta$. Then

$$d_1(X_1, X_2, \ldots, X_n) = X_1$$

and

$$d_2(X_1, X_2, \ldots, X_n) = \frac{X_1 + X_2 + \cdots + X_n}{n}$$
are both unbiased estimators of $\theta$, since

$$E[X_1] = E\left[\frac{X_1 + X_2 + \cdots + X_n}{n}\right] = \theta$$

More generally, $d_3(X_1, X_2, \ldots, X_n) = \sum_{i=1}^n \lambda_i X_i$ is an unbiased estimator of $\theta$ whenever $\sum_{i=1}^n \lambda_i = 1$. This follows since
$$E\left[\sum_{i=1}^n \lambda_i X_i\right] = \sum_{i=1}^n E[\lambda_i X_i] = \sum_{i=1}^n \lambda_i E[X_i] = \theta \sum_{i=1}^n \lambda_i = \theta \qquad \blacksquare$$
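The conclusion of Example 7.7a is easy to check by simulation: $d_1$, $d_2$, and any fixed weights $\lambda_1, \ldots, \lambda_n$ summing to 1 all average out to $\theta$. A brief sketch, again assuming NumPy; the exponential distribution and the particular weights are arbitrary illustrative choices (any distribution with mean $\theta$ would do).

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta, reps = 5, 3.0, 200_000

# Arbitrary positive weights, normalized so that sum(lambda_i) = 1.
lam = rng.random(n)
lam /= lam.sum()

samples = rng.exponential(theta, size=(reps, n))  # each row: one sample of size n
d1 = samples[:, 0]            # d_1 = X_1
d2 = samples.mean(axis=1)     # d_2 = (X_1 + ... + X_n) / n
d3 = samples @ lam            # d_3 = sum_i lambda_i X_i

print(d1.mean(), d2.mean(), d3.mean())  # each approximately theta = 3.0
```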
If $d(X_1, \ldots, X_n)$ is an unbiased estimator, then its mean square error is given by

$$r(d, \theta) = E[(d(\mathbf{X}) - \theta)^2] = E\big[(d(\mathbf{X}) - E[d(\mathbf{X})])^2\big] \quad \text{since } d \text{ is unbiased}$$
$$= \operatorname{Var}(d(\mathbf{X}))$$

Thus the mean square error of an unbiased estimator is equal to its variance.
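This identity can be checked numerically as well: for an unbiased estimator such as the sample mean, the simulated mean square error and the simulated variance of the estimator should agree. A short sketch under the same assumptions (NumPy; the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, theta, reps = 20, 1.5, 200_000

# d = sample mean of a normal(theta, 1) sample, computed for many replications.
estimates = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)

mse = np.mean((estimates - theta) ** 2)  # r(d, theta) = E[(d(X) - theta)^2]
var = estimates.var()                    # Var(d(X))
print(mse, var)  # nearly identical, and both close to sigma^2 / n = 1 / 20
```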