This is readily established by using the CLT and the same reasoning as in Example 5.3.1; see Exercise 5.3.13.
In Examples 4.2.3 and 4.5.2 of Chapter 4, we presented large sample confidence intervals and tests for $p$ using (5.3.4).
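For concreteness, here is a minimal Python sketch (not from the text) of the usual large sample (Wald) interval for $p$, $\hat{p} \pm z_{\alpha/2}\sqrt{\hat{p}(1-\hat{p})/n}$, which is the type of interval those examples develop; the counts used below are hypothetical illustrative values.

# A minimal sketch (not from the text) of the large sample (Wald) interval
# for p: p_hat +/- z_{alpha/2} * sqrt(p_hat * (1 - p_hat) / n).
# The values of n and y below are hypothetical.
import math
from statistics import NormalDist

n, y = 400, 92                              # hypothetical sample size and success count
p_hat = y / n                               # sample proportion Y_n / n
alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)     # upper alpha/2 standard normal quantile (about 1.96)
half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
print((p_hat - half_width, p_hat + half_width))   # approximate 95% confidence interval for p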

Example 5.3.6 (Large Sample Inference for $\chi^2$-Tests). Another extension of Example 5.3.3 that was used in Section 4.7 follows quickly from the Central Limit Theorem and Theorem 5.2.4. Using the notation of Example 5.3.3, suppose $Y_n$ has a binomial distribution with parameters $n$ and $p$. Then, as in Example 5.3.3, $(Y_n - np)/\sqrt{np(1-p)}$ converges in distribution to a random variable $Z$ with the $N(0,1)$ distribution. Hence, by Theorem 5.2.4,
\[
\left( \frac{Y_n - np}{\sqrt{np(1-p)}} \right)^2 \stackrel{D}{\to} \chi^2(1). \tag{5.3.5}
\]

This was the result referenced in Chapter 4; see expression (4.7.1).
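To make (5.3.5) concrete, here is a brief simulation sketch (not part of the text); numpy and scipy are assumed, and the values of $n$, $p$, and the number of replications are arbitrary choices.

# A small simulation sketch (not from the text) illustrating (5.3.5):
# the squared standardized binomial count is approximately chi-square(1).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, p, reps = 500, 0.3, 100_000            # arbitrary illustrative choices
y = rng.binomial(n, p, size=reps)         # reps independent copies of Y_n
w = (y - n * p) ** 2 / (n * p * (1 - p))  # the statistic in (5.3.5)

# Compare empirical quantiles of W with chi-square(1) quantiles.
qs = [0.5, 0.9, 0.95, 0.99]
print(np.quantile(w, qs))                 # empirical quantiles
print(stats.chi2.ppf(qs, df=1))           # chi^2(1) quantiles: 0.455, 2.706, 3.841, 6.635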

We know that $\bar{X}$ and $\sum_{i=1}^n X_i$ have approximately normal distributions, provided that $n$ is large enough. Later, we find that other statistics also have approximate
normal distributions, and this is the reason that the normal distribution is so impor-
tant to statisticians. That is, while not many underlying distributions are normal,
the distributions of statistics calculated from random samples arising from these
distributions are often very close to being normal.
Frequently, we are interested in functions of statistics that have approximately normal distributions. To illustrate, consider the sequence of random variables $Y_n$ of Example 5.3.3. As discussed there, $Y_n$ has an approximate $N[np,\, np(1-p)]$ distribution. So $np(1-p)$ is an important function of $p$, as it is the variance of $Y_n$. Thus, if $p$ is unknown, we might want to estimate the variance of $Y_n$. Since $E(Y_n/n) = p$, we might use $n(Y_n/n)(1 - Y_n/n)$ as such an estimator and would want to know something about the latter's distribution. In particular, does it also have an approximate normal distribution? If so, what are its mean and variance? To answer questions like these, we can apply the $\Delta$-method, Theorem 5.2.9.
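As a quick numerical illustration (not from the text), the following sketch computes this estimator for a single simulated binomial count and compares it with the true variance $np(1-p)$; the values of $n$ and $p$ are arbitrary, and the distributional questions just raised are taken up with the $\Delta$-method below.

# A brief sketch (not from the text) of the estimator mentioned above:
# n * (Y_n/n) * (1 - Y_n/n) as an estimate of the variance np(1-p) of Y_n.
import numpy as np

rng = np.random.default_rng(1)
n, p = 1000, 0.25                   # arbitrary values; p is "unknown" in practice
y = rng.binomial(n, p)              # one observed binomial count Y_n
p_hat = y / n
var_hat = n * p_hat * (1 - p_hat)   # the proposed estimator of Var(Y_n)
print(var_hat, n * p * (1 - p))     # estimate versus the true variance 187.5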
As an illustration of the $\Delta$-method, we consider a function of the sample mean. Assume that $X_1, \ldots, X_n$ is a random sample on $X$, which has finite mean $\mu$ and variance $\sigma^2$. Then, rewriting expression (5.3.2), we have by the Central Limit Theorem that
\[
\sqrt{n}\,(\bar{X} - \mu) \stackrel{D}{\to} N(0, \sigma^2).
\]


Hence, by the $\Delta$-method, Theorem 5.2.9, we have
\[
\sqrt{n}\,[\,g(\bar{X}) - g(\mu)\,] \stackrel{D}{\to} N\!\bigl(0,\, \sigma^2 (g'(\mu))^2\bigr), \tag{5.3.6}
\]
for a continuous transformation $g(x)$ such that $g'(\mu) \neq 0$.
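As a worked sketch (the calculation is not spelled out in this excerpt), expression (5.3.6) answers the question raised above about $n(Y_n/n)(1 - Y_n/n)$. Take $X_1, \ldots, X_n$ to be $b(1,p)$, so that $\bar{X} = Y_n/n$, $\mu = p$, and $\sigma^2 = p(1-p)$, and take $g(x) = x(1-x)$, so $g'(x) = 1 - 2x$. Then (5.3.6) gives
\[
\sqrt{n}\,\bigl[\,(Y_n/n)(1 - Y_n/n) - p(1-p)\,\bigr] \stackrel{D}{\to} N\!\bigl(0,\; p(1-p)(1-2p)^2\bigr), \qquad p \neq \tfrac{1}{2}.
\]
Hence, for large $n$, the estimator $n(Y_n/n)(1 - Y_n/n)$ is approximately $N\!\bigl(np(1-p),\; np(1-p)(1-2p)^2\bigr)$, provided $p \neq 1/2$; at $p = 1/2$ we have $g'(\mu) = 0$ and the limit above is degenerate.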

Example 5.3.7. Assume that we are sampling from a binomial $b(1,p)$ distribution. Then $\bar{X}$ is the sample proportion of successes. Here $\mu = p$ and $\sigma^2 = p(1-p)$. Suppose that we want a transformation $g(p)$ such that the transformed asymptotic
