*7.7 Evaluating a Point Estimator 271
Therefore,
\[
E[d_2] = \int_0^\theta x\,\frac{n x^{n-1}}{\theta^n}\,dx = \frac{n}{n+1}\,\theta \tag{7.7.1}
\]
Also,
\[
E[d_2^2] = \int_0^\theta x^2\,\frac{n x^{n-1}}{\theta^n}\,dx = \frac{n}{n+2}\,\theta^2
\]
and so
\[
\begin{aligned}
\operatorname{Var}(d_2) &= \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\,\theta\right)^2 \\
&= n\theta^2\left[\frac{1}{n+2} - \frac{n}{(n+1)^2}\right] \\
&= \frac{n\theta^2}{(n+2)(n+1)^2}
\end{aligned} \tag{7.7.2}
\]
Hence,
\[
\begin{aligned}
r(d_2,\theta) &= (E[d_2]-\theta)^2 + \operatorname{Var}(d_2) \\
&= \frac{\theta^2}{(n+1)^2} + \frac{n\theta^2}{(n+2)(n+1)^2} \\
&= \frac{\theta^2}{(n+1)^2}\left[1 + \frac{n}{n+2}\right] \\
&= \frac{2\theta^2}{(n+1)(n+2)}
\end{aligned} \tag{7.7.3}
\]
Since
\[
\frac{2\theta^2}{(n+1)(n+2)} \le \frac{\theta^2}{3n}, \qquad n = 1, 2, \ldots
\]
it follows that $d_2$ is a superior estimator of $\theta$ than is $d_1$.
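As a quick numerical check (a sketch, not part of the text's derivation), the following Python snippet estimates $r(d_2,\theta)$ by simulation and compares it with the closed form of Equation 7.7.3; the particular values of $n$ and $\theta$ are arbitrary, and the inequality against $\theta^2/3n$ is verified over a range of $n$.

```python
import random

def mse_d2_empirical(theta, n, trials=200_000, seed=1):
    """Monte Carlo estimate of r(d2, theta) for d2 = max_i X_i,
    with X_1, ..., X_n i.i.d. uniform on (0, theta)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        d2 = max(rng.uniform(0.0, theta) for _ in range(n))
        total += (d2 - theta) ** 2
    return total / trials

theta, n = 2.0, 5                                 # arbitrary illustrative values
empirical = mse_d2_empirical(theta, n)
exact = 2 * theta**2 / ((n + 1) * (n + 2))        # Equation 7.7.3
print(empirical, exact)                            # agree to Monte Carlo accuracy

# The comparison with d1: 2*theta^2/((n+1)(n+2)) <= theta^2/(3n) for n = 1, 2, ...
# (equality holds at n = 1 and n = 2, hence the small tolerance)
assert all(2 * theta**2 / ((m + 1) * (m + 2)) <= theta**2 / (3 * m) + 1e-12
           for m in range(1, 1000))
```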
Equation 7.7.1 suggests the use of yet another estimator, namely, the unbiased
estimator $(1 + 1/n)\,d_2(\mathbf{X}) = (1 + 1/n)\max_i X_i$. However, rather than considering this
estimator directly, let us consider all estimators of the form
\[
d_c(\mathbf{X}) = c \max_i X_i = c\,d_2(\mathbf{X})
\]
where $c$ is a given constant. The mean square error of this estimator is
\[
\begin{aligned}
r(d_c(\mathbf{X}),\theta) &= \operatorname{Var}(d_c(\mathbf{X})) + (E[d_c(\mathbf{X})]-\theta)^2 \\
&= c^2 \operatorname{Var}(d_2(\mathbf{X})) + (cE[d_2(\mathbf{X})]-\theta)^2 \\
&= \frac{c^2 n \theta^2}{(n+2)(n+1)^2} + \theta^2\left(\frac{cn}{n+1} - 1\right)^2
\qquad \text{by Equations 7.7.2 and 7.7.1}
\end{aligned} \tag{7.7.4}
\]