
where the parameter space is $\omega = \{p : 0 < p < 1/2\}$. The likelihood under $\omega$ is
\[
L(p) = p^{t_1+t_2}(1-2p)^{\,n-t_1-t_2}. \tag{6.5.15}
\]

Differentiating $\log L(p)$ with respect to $p$ and setting the derivative to 0 results in
the following maximum likelihood estimate, under $\omega$:
\[
\hat{p}_0 = \frac{t_1+t_2}{2n} = \frac{\hat{p}_1+\hat{p}_2}{2}, \tag{6.5.16}
\]

where $\hat{p}_1$ and $\hat{p}_2$ are the mles under $\Omega$. The likelihood function evaluated at the mle
under $\omega$ simplifies to
\[
L(\hat{\omega}) = \left(\frac{\hat{p}_1+\hat{p}_2}{2}\right)^{\!n(\hat{p}_1+\hat{p}_2)}
(1-\hat{p}_1-\hat{p}_2)^{\,n(1-\hat{p}_1-\hat{p}_2)}. \tag{6.5.17}
\]

The reciprocal of the likelihood ratio test statistic then simplifies to
\[
\Lambda^{-1} = \left(\frac{2\hat{p}_1}{\hat{p}_1+\hat{p}_2}\right)^{\!n\hat{p}_1}
\left(\frac{2\hat{p}_2}{\hat{p}_1+\hat{p}_2}\right)^{\!n\hat{p}_2}. \tag{6.5.18}
\]
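In more detail, dividing the likelihood under $\Omega$ evaluated at its mles,
$L(\hat{\Omega}) = \hat{p}_1^{\,t_1}\hat{p}_2^{\,t_2}(1-\hat{p}_1-\hat{p}_2)^{\,n-t_1-t_2}$,
by (6.5.17), the factors $(1-\hat{p}_1-\hat{p}_2)^{\,n-t_1-t_2}$ cancel and, since $t_i = n\hat{p}_i$,
\[
\Lambda^{-1} = \frac{L(\hat{\Omega})}{L(\hat{\omega})}
= \left(\frac{\hat{p}_1}{(\hat{p}_1+\hat{p}_2)/2}\right)^{\!t_1}
\left(\frac{\hat{p}_2}{(\hat{p}_1+\hat{p}_2)/2}\right)^{\!t_2},
\]
which is (6.5.18).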


Based on Theorem 6.5.11, an asymptotic level $\alpha$ test rejects $H_0$ if $2\log\Lambda^{-1} > \chi^2_\alpha(1)$.
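As a quick numerical illustration, the following is a minimal sketch that computes $2\log\Lambda^{-1}$ from observed counts $t_1$, $t_2$ out of $n$ trials and compares it with the $\chi^2(1)$ critical value; the function name and sample counts are illustrative, not from the text.

```python
import numpy as np
from scipy.stats import chi2

def lrt_equal_probs(t1, t2, n, alpha=0.05):
    """Compute 2*log(Lambda^{-1}) from (6.5.18) and the level-alpha decision."""
    p1_hat, p2_hat = t1 / n, t2 / n
    stat = 0.0
    for t, p_hat in ((t1, p1_hat), (t2, p2_hat)):
        if t > 0:  # a zero count contributes nothing to the log-likelihood ratio
            stat += 2 * t * np.log(2 * p_hat / (p1_hat + p2_hat))
    reject = stat > chi2.ppf(1 - alpha, df=1)
    return stat, reject

# Illustrative data: counts 60 and 40 of the two types in n = 200 trials
print(lrt_equal_probs(60, 40, 200))
```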
This is an example where Wald's test can easily be formulated. The constraint under $H_0$ is $p_1 - p_2 = 0$. Hence, the Wald-type statistic is $W = \hat{p}_1 - \hat{p}_2$,
which can be expressed as $W = [1, -1]\,[\hat{p}_1,\ \hat{p}_2]'$. Recall that the information matrix
and its inverse were found for $k$ categories in Example 6.4.5. From Theorem 6.4.1,
we then have
\[
\begin{bmatrix} \hat{p}_1 \\ \hat{p}_2 \end{bmatrix}
\ \text{is approximately}\ N_2\!\left(
\begin{pmatrix} p_1 \\ p_2 \end{pmatrix},\
\frac{1}{n}
\begin{bmatrix} p_1(1-p_1) & -p_1p_2 \\ -p_1p_2 & p_2(1-p_2) \end{bmatrix}
\right). \tag{6.5.19}
\]


As shown in Example 6.4.5, the finite sample moments are the same as the asymptotic moments. Hence the variance of $W$ is
\[
\operatorname{Var}(W) = [1, -1]\,\frac{1}{n}
\begin{bmatrix} p_1(1-p_1) & -p_1p_2 \\ -p_1p_2 & p_2(1-p_2) \end{bmatrix}
\begin{bmatrix} 1 \\ -1 \end{bmatrix}
= \frac{p_1(1-p_1) + 2p_1p_2 + p_2(1-p_2)}{n}
= \frac{p_1+p_2-(p_1-p_2)^2}{n}.
\]

Because $W$ is asymptotically normal, an asymptotic level $\alpha$ test for the hypotheses
(6.5.13) is to reject $H_0$ if $\chi^2_W \geq \chi^2_\alpha(1)$, where
\[
\chi^2_W = \frac{(\hat{p}_1-\hat{p}_2)^2}{(\hat{p}_1+\hat{p}_2-(\hat{p}_1-\hat{p}_2)^2)/n}. \tag{6.5.20}
\]
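A matching sketch of the Wald statistic in (6.5.20), under the same illustrative setup; the variance of $W$ is estimated by plugging $\hat{p}_1$ and $\hat{p}_2$ into the expression above.

```python
from scipy.stats import chi2

def wald_equal_probs(t1, t2, n, alpha=0.05):
    """Wald statistic chi^2_W of (6.5.20) and its level-alpha decision."""
    p1_hat, p2_hat = t1 / n, t2 / n
    diff = p1_hat - p2_hat
    var_w = (p1_hat + p2_hat - diff**2) / n   # estimated Var(W)
    chi2_w = diff**2 / var_w
    return chi2_w, chi2_w >= chi2.ppf(1 - alpha, df=1)

# Same illustrative counts as before
print(wald_equal_probs(60, 40, 200))
```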


It also follows that an asymptotic (1−α)100% confidence interval for the difference
p 1 −p 2 is


p̂ 1 −p̂ 2 ±zα/ 2

(
p̂ 1 +p̂ 2 −(̂p 1 −p̂ 2 )^2
n

) 1 / 2

. (6.5.21)
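The interval (6.5.21) can be computed the same way; again the function name and inputs are illustrative.

```python
from scipy.stats import norm

def ci_diff_probs(t1, t2, n, alpha=0.05):
    """Asymptotic (1 - alpha)100% confidence interval (6.5.21) for p1 - p2."""
    p1_hat, p2_hat = t1 / n, t2 / n
    diff = p1_hat - p2_hat
    se = ((p1_hat + p2_hat - diff**2) / n) ** 0.5
    z = norm.ppf(1 - alpha / 2)
    return diff - z * se, diff + z * se

# 95% interval for the same illustrative counts
print(ci_diff_probs(60, 40, 200))
```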
