Introduction to Probability and Statistics for Engineers and Scientists


*7.7 Evaluating a Point Estimator 271


Therefore,

$$E[d_2] = \int_0^\theta x\,\frac{nx^{n-1}}{\theta^n}\,dx = \frac{n}{n+1}\,\theta \qquad (7.7.1)$$

Also,

$$E[d_2^2] = \int_0^\theta x^2\,\frac{nx^{n-1}}{\theta^n}\,dx = \frac{n}{n+2}\,\theta^2$$

and so

$$\begin{aligned}
\mathrm{Var}(d_2) &= \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\,\theta\right)^2 \qquad (7.7.2)\\
&= n\theta^2\left[\frac{1}{n+2} - \frac{n}{(n+1)^2}\right] = \frac{n\theta^2}{(n+2)(n+1)^2}
\end{aligned}$$
Hence

$$\begin{aligned}
r(d_2,\theta) &= (E[d_2]-\theta)^2 + \mathrm{Var}(d_2) \qquad (7.7.3)\\
&= \frac{\theta^2}{(n+1)^2} + \frac{n\theta^2}{(n+2)(n+1)^2}\\
&= \frac{\theta^2}{(n+1)^2}\left[1 + \frac{n}{n+2}\right]\\
&= \frac{2\theta^2}{(n+1)(n+2)}
\end{aligned}$$

Since

$$\frac{2\theta^2}{(n+1)(n+2)} \le \frac{\theta^2}{3n}, \qquad n = 1, 2, \ldots$$

it follows that $d_2$ is a better estimator of $\theta$ than is $d_1$.
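The comparison above can be spot-checked by simulation. The sketch below assumes, consistent with the mean square error $\theta^2/(3n)$ quoted above, that $d_1 = 2\bar{X}$ (the unbiased moment estimator) and $d_2 = \max_i X_i$ for a sample from the uniform $(0,\theta)$ distribution; it is a Monte Carlo sanity check, not part of the derivation.

```python
import random

# Monte Carlo estimate of the mean square errors of two estimators of
# theta for a Uniform(0, theta) sample: d1 = 2 * sample mean (assumed
# form of d1, matching the MSE theta^2/(3n)) and d2 = max_i X_i.
def mse_estimates(theta=1.0, n=5, trials=200_000, seed=42):
    rng = random.Random(seed)
    se1 = se2 = 0.0
    for _ in range(trials):
        xs = [rng.uniform(0, theta) for _ in range(n)]
        d1 = 2 * sum(xs) / n          # unbiased moment estimator
        d2 = max(xs)                  # maximum likelihood estimator
        se1 += (d1 - theta) ** 2
        se2 += (d2 - theta) ** 2
    return se1 / trials, se2 / trials

theta, n = 1.0, 5
mse1, mse2 = mse_estimates(theta, n)
print(mse1, theta**2 / (3 * n))                   # both ~ 0.0667
print(mse2, 2 * theta**2 / ((n + 1) * (n + 2)))   # both ~ 0.0476
```

For $n = 5$ the simulated values land close to the exact expressions $\theta^2/(3n)$ and $2\theta^2/((n+1)(n+2))$, with the maximum-based estimator clearly ahead.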
Equation 7.7.1 suggests the use of yet another estimator, namely, the unbiased estimator $(1+1/n)d_2(\mathbf{X}) = (1+1/n)\max_i X_i$. However, rather than considering this estimator directly, let us consider all estimators of the form


$$d_c(\mathbf{X}) = c\max_i X_i = c\,d_2(\mathbf{X})$$

where $c$ is a given constant. The mean square error of this estimator is


$$\begin{aligned}
r(d_c(\mathbf{X}),\theta) &= \mathrm{Var}(d_c(\mathbf{X})) + (E[d_c(\mathbf{X})]-\theta)^2\\
&= c^2\,\mathrm{Var}(d_2(\mathbf{X})) + (cE[d_2(\mathbf{X})]-\theta)^2\\
&= \frac{c^2 n\theta^2}{(n+2)(n+1)^2} + \theta^2\left(\frac{cn}{n+1} - 1\right)^2 \qquad (7.7.4)
\end{aligned}$$

by Equations 7.7.2 and 7.7.1.
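Equation 7.7.4 is a smooth function of $c$, so the best constant can be found by elementary calculus (setting the derivative in $c$ to zero yields $c = (n+2)/(n+1)$). As a quick numerical check, the grid-search sketch below minimizes the expression directly; $\theta$ factors out as $\theta^2$, so it is set to 1.

```python
# Minimize the mean square error of d_c in Equation 7.7.4 over c by a
# fine grid search; theta factors out as theta^2, so take theta = 1.
def mse_dc(c, n):
    return (c ** 2 * n / ((n + 2) * (n + 1) ** 2)
            + (c * n / (n + 1) - 1) ** 2)

n = 5
grid = [1 + k / 100_000 for k in range(50_000)]   # c in [1, 1.5)
best = min(grid, key=lambda c: mse_dc(c, n))
print(best, (n + 2) / (n + 1))   # both close to 1.1667
```

For $n = 5$ the numerical minimizer agrees with $(n+2)/(n+1) = 7/6$; note this optimal constant differs from both $c = 1$ (the maximum likelihood estimator) and $c = 1 + 1/n$ (the unbiased estimator mentioned above).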