
$n$                                         100      500      1000     10,000    100,000
$4\bar{x}$                                  3.24     3.072    3.132    3.138     3.13828
$1.96 \cdot 4\sqrt{\bar{x}(1-\bar{x})/n}$   0.308    0.148    0.102    0.032     0.010

We can use the large sample confidence interval derived in Section 4.2 to estimate
the error of estimation. The corresponding 95% confidence interval for $\pi$ is

$$\left( 4\bar{x} - 1.96 \cdot 4\sqrt{\bar{x}(1-\bar{x})/n},\;
4\bar{x} + 1.96 \cdot 4\sqrt{\bar{x}(1-\bar{x})/n} \right). \qquad (4.8.1)$$


The last row of the above table contains the error part of the confidence intervals.
Notice that all five confidence intervals trapped the true value of $\pi$.
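
To reproduce such a table, here is a minimal Python sketch (assuming NumPy). The
setup is the standard one suggested by the form of $4\bar{x}$ and the Bernoulli
error term in (4.8.1): $\bar{x}$ is taken to be the proportion of uniform points
in the unit square that fall inside the quarter circle, so that $4\bar{x}$
estimates $\pi$; the helper name estimate_pi is ours.

import numpy as np

rng = np.random.default_rng(4813)

def estimate_pi(n):
    # Assumption: xbar is the proportion of points (U1, U2), uniform on
    # the unit square, landing inside the quarter circle U1^2 + U2^2 < 1,
    # so E[xbar] = pi/4 and 4*xbar estimates pi.
    u1 = rng.uniform(size=n)
    u2 = rng.uniform(size=n)
    xbar = np.mean(u1**2 + u2**2 < 1.0)
    half_width = 1.96 * 4 * np.sqrt(xbar * (1.0 - xbar) / n)  # error term of (4.8.1)
    return 4 * xbar, half_width

for n in (100, 500, 1000, 10_000, 100_000):
    est, err = estimate_pi(n)
    print(f"n = {n:>7,}: 4*xbar = {est:.5f}, error term = {err:.3f}")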


What about continuous random variables? For these we have the following
theorem:


Theorem 4.8.1. Suppose the random variable $U$ has a uniform$(0,1)$ distribution.
Let $F$ be a continuous distribution function. Then the random variable
$X = F^{-1}(U)$ has distribution function $F$.


Proof: Recall from the definition of a uniform distribution that $U$ has the
distribution function $F_U(u) = u$ for $u \in (0,1)$. Using this, the
distribution-function technique, and the assumption that $F(x)$ is strictly
monotone (so that $F^{-1}(U) \le x$ if and only if $U \le F(x)$), the
distribution function of $X$ is

$$P[X \le x] = P[F^{-1}(U) \le x] = P[U \le F(x)] = F(x),$$

which proves the theorem.


In the proof, we assumed that $F(x)$ was strictly monotone. As Exercise 4.8.13
shows, this assumption can be weakened.
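
As a minimal sketch of how Theorem 4.8.1 is used in practice (the helper name
inverse_transform_sample is ours, and the standard Cauchy quantile in the
illustration is a standard fact, not something derived in this section):

import numpy as np

rng = np.random.default_rng(20)

def inverse_transform_sample(F_inv, n):
    # By Theorem 4.8.1, if F_inv is the inverse of a continuous
    # distribution function F, then F_inv(U) has distribution function F.
    u = rng.uniform(size=n)   # U ~ uniform(0, 1)
    return F_inv(u)

# Illustration: the standard Cauchy quantile is tan(pi*(u - 1/2)).
print(inverse_transform_sample(lambda u: np.tan(np.pi * (u - 0.5)), 5))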
We can use this theorem to generate realizations (observations) of many different
random variables. For example, suppose $X$ has the $\Gamma(1,\beta)$-distribution.
Suppose we have a uniform generator and we want to generate a realization of $X$.
The distribution function of $X$ is

$$F(x) = 1 - e^{-x/\beta}, \quad x > 0.$$

Hence the inverse of the distribution function is given by

$$F^{-1}(u) = -\beta \log(1-u), \quad 0 < u < 1. \qquad (4.8.2)$$

So if $U$ has the uniform$(0,1)$ distribution, then $X = -\beta\log(1-U)$ has the
$\Gamma(1,\beta)$-distribution. For instance, suppose $\beta = 1$ and our uniform
generator generated the following stream of uniform observations:

$$0.473, \ 0.858, \ 0.501, \ 0.676, \ 0.240.$$
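
Applying (4.8.2) with $\beta = 1$ to this stream is immediate; a short Python
check (assuming NumPy):

import numpy as np

u = np.array([0.473, 0.858, 0.501, 0.676, 0.240])  # the uniform stream above
beta = 1.0
x = -beta * np.log(1.0 - u)   # X = -beta*log(1 - U), equation (4.8.2)
print(np.round(x, 3))         # five realizations of Gamma(1,1), i.e., exponential(1)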