
Figure 5.1.1: Realizations of the point estimator $\overline{X}$ for samples of size 10 to 2000, in steps of 10, drawn from a $N(0,1)$ distribution. (Plot of $\bar{x}$ against the sample size $n$.)
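This figure can be reproduced by simulation. The following is a minimal sketch (ours, not the authors' code; the seed and plotting details are arbitrary): for each $n = 10, 20, \ldots, 2000$ it draws a fresh $N(0,1)$ sample and plots its mean against $n$.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)               # arbitrary fixed seed for reproducibility
ns = np.arange(10, 2001, 10)                 # sample sizes 10 to 2000 in steps of 10
xbars = [rng.standard_normal(n).mean() for n in ns]  # one realization of the sample mean per n

plt.plot(ns, xbars, ".", markersize=3)
plt.axhline(0.0, linestyle="--")             # the true mean, mu = 0
plt.xlabel("n")
plt.ylabel("xbar")
plt.show()
```

As Theorem 5.1.1 predicts, the realizations cluster ever more tightly around 0 as $n$ grows.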


The sample variance is an unbiased estimator of $\sigma^2$. We now show that it is a consistent estimator of $\sigma^2$. Recall Theorem 5.1.1, which shows that $\overline{X}_n \stackrel{P}{\to} \mu$. To show that the sample variance converges in probability to $\sigma^2$, assume further that $E[X_1^4] < \infty$, so that $\mathrm{Var}(S^2) < \infty$. Using the preceding results, we can show the following:


$$
S_n^2 \;=\; \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \overline{X}_n)^2
\;=\; \frac{n}{n-1}\left(\frac{1}{n}\sum_{i=1}^{n} X_i^2 - \overline{X}_n^{\,2}\right)
\;\stackrel{P}{\to}\; 1 \cdot \left[E(X_1^2) - \mu^2\right] \;=\; \sigma^2.
$$
Hence the sample variance is a consistent estimator of $\sigma^2$. From the discussion above, we have immediately that $S_n \stackrel{P}{\to} \sigma$; that is, the sample standard deviation is a consistent estimator of the population standard deviation.
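The consistency of $S_n^2$ can likewise be checked by Monte Carlo. A minimal sketch, under our own arbitrary choice of a $N(0,4)$ population (so $\sigma^2 = 4$):

```python
import numpy as np

rng = np.random.default_rng(1)          # arbitrary seed
# Population: N(0, 4), so sigma^2 = 4.
for n in (10, 100, 1000, 10000):
    x = 2.0 * rng.standard_normal(n)    # sample of size n from N(0, 4)
    s2 = x.var(ddof=1)                  # sample variance with divisor n - 1
    print(n, round(s2, 3))              # should settle near sigma^2 = 4 as n grows
```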
Unlike the last example, sometimes we can obtain the convergence by using the distribution function. We illustrate this with the following example.

Example 5.1.2 (Maximum of a Sample from a Uniform Distribution). Suppose $X_1, \ldots, X_n$ is a random sample from a uniform$(0,\theta)$ distribution, where $\theta$ is unknown. An intuitive estimate of $\theta$ is the maximum of the sample. Let $Y_n = \max\{X_1, \ldots, X_n\}$. Exercise 5.1.4 shows that the cdf of $Y_n$ is
$$
F_{Y_n}(t) \;=\;
\begin{cases}
1 & t > \theta \\
\left(\frac{t}{\theta}\right)^{n} & 0 < t \le \theta \\
0 & t \le 0.
\end{cases}
\tag{5.1.1}
$$
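Letting $n \to \infty$ in (5.1.1), for any $0 < \varepsilon < \theta$ we have $P(|Y_n - \theta| \geq \varepsilon) = P(Y_n \leq \theta - \varepsilon) = [(\theta - \varepsilon)/\theta]^n \to 0$; hence $Y_n \stackrel{P}{\to} \theta$.

A quick simulation makes this convergence visible. The sketch below is ours, not the text's (the seed and the choice $\theta = 5$ are arbitrary); it prints the sample maximum $Y_n$ for increasing $n$:

```python
import numpy as np

rng = np.random.default_rng(2)   # arbitrary seed
theta = 5.0                      # true parameter, treated as "unknown"; value chosen for illustration

for n in (10, 100, 1000, 10000):
    y_n = rng.uniform(0.0, theta, size=n).max()  # Y_n = max{X_1, ..., X_n}
    print(n, round(y_n, 4))                      # Y_n increases toward theta as n grows
```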
