EXAMPLE 7.7c Let $X_1, \ldots, X_n$ denote a sample from a uniform $(0, \theta)$ distribution, where $\theta$ is assumed unknown. Since
$$E[X_i] = \frac{\theta}{2}$$
a "natural" estimator to consider is the unbiased estimator
$$d_1 = d_1(\mathbf{X}) = \frac{2\sum_{i=1}^{n} X_i}{n}$$
Since $E[d_1] = \theta$, it follows that


$$
\begin{aligned}
r(d_1, \theta) = \operatorname{Var}(d_1) &= \frac{4}{n}\operatorname{Var}(X_i)\\
&= \frac{4}{n}\cdot\frac{\theta^2}{12} \qquad \text{since } \operatorname{Var}(X_i) = \frac{\theta^2}{12}\\
&= \frac{\theta^2}{3n}
\end{aligned}
$$
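Before turning to a second estimator, the identity $r(d_1, \theta) = \theta^2/(3n)$ can be checked numerically. The following is a minimal Monte Carlo sketch, assuming Python with NumPy (not part of the text); the values of $\theta$, $n$, and the number of trials are illustrative choices, not taken from the example.

    import numpy as np

    rng = np.random.default_rng(0)
    theta, n, trials = 2.0, 10, 100_000   # illustrative values, not from the text

    # Draw `trials` independent samples of size n from uniform(0, theta)
    samples = rng.uniform(0.0, theta, size=(trials, n))

    # d1 = 2 * (sample mean), the unbiased estimator discussed above
    d1 = 2.0 * samples.mean(axis=1)

    # Because d1 is unbiased, its mean square error equals its variance
    empirical_mse = np.mean((d1 - theta) ** 2)
    print(empirical_mse, theta**2 / (3 * n))   # both should be close to theta^2/(3n)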
A second possible estimator of $\theta$ is the maximum likelihood estimator, which, as shown in Example 7.2d, is given by
$$d_2 = d_2(\mathbf{X}) = \max_i X_i$$
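For convenience, the reasoning of Example 7.2d can be recalled in one line (a brief recap of that example, not a new result): for observed values $x_1, \ldots, x_n$ in $(0, \theta)$, the likelihood is
$$f(x_1, \ldots, x_n \mid \theta) = \prod_{i=1}^{n} \frac{1}{\theta} = \frac{1}{\theta^n}, \qquad \theta \ge \max_i x_i,$$
which is decreasing in $\theta$, so it is maximized by the smallest admissible value of $\theta$, namely $\max_i x_i$.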

To compute the mean square error of $d_2$ as an estimator of $\theta$, we need to first compute its mean (so as to determine its bias) and variance. To do so, note that the distribution function of $d_2$ is as follows:
$$
\begin{aligned}
F_2(x) \equiv P\{d_2(\mathbf{X}) \le x\} &= P\{\max_i X_i \le x\}\\
&= P\{X_i \le x \text{ for all } i = 1, \ldots, n\}\\
&= \prod_{i=1}^{n} P\{X_i \le x\} \qquad \text{by independence}\\
&= \left(\frac{x}{\theta}\right)^{n}, \qquad 0 \le x \le \theta
\end{aligned}
$$

Hence, upon differentiating, we obtain that the density function of $d_2$ is
$$f_2(x) = \frac{n x^{n-1}}{\theta^n}, \qquad 0 \le x \le \theta$$
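As a numerical check on this derivation, the sketch below (again assuming Python with NumPy; the chosen values of $\theta$, $n$, and the number of trials are illustrative, not from the text) compares the empirical distribution function of $d_2 = \max_i X_i$ with $(x/\theta)^n$ at a few points, and also reports the empirical mean square errors of $d_1$ and $d_2$ side by side.

    import numpy as np

    rng = np.random.default_rng(1)
    theta, n, trials = 2.0, 10, 100_000   # illustrative values, not from the text

    samples = rng.uniform(0.0, theta, size=(trials, n))
    d2 = samples.max(axis=1)               # maximum likelihood estimator max_i X_i

    # Empirical distribution function of d2 versus (x/theta)^n
    for x in (0.5 * theta, 0.8 * theta, 0.95 * theta):
        print(np.mean(d2 <= x), (x / theta) ** n)

    # Empirical mean square errors of d1 and d2 for comparison
    d1 = 2.0 * samples.mean(axis=1)
    print(np.mean((d1 - theta) ** 2), np.mean((d2 - theta) ** 2))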