8.3. Likelihood Ratio Tests 493
Example 8.3.3 (Power of the Two-Sample t-Test). In Example 8.3.1 we had
\[
T = \frac{W_2}{\sqrt{V_2/(n+m-2)}},
\]
where
\[
W_2 = \sqrt{\frac{nm}{n+m}}\,\frac{\overline{X}-\overline{Y}}{\sigma}
\quad \text{and} \quad
V_2 = \frac{\sum_{i=1}^{n}(X_i-\overline{X})^2 + \sum_{i=1}^{m}(Y_i-\overline{Y})^2}{\sigma^2}.
\]
Here $W_2$ is $N[\sqrt{nm/(n+m)}\,(\theta_1-\theta_2)/\sigma,\,1]$, $V_2$ is $\chi^2(n+m-2)$, and $W_2$ and $V_2$ are independent. Accordingly, if $\theta_1 \neq \theta_2$, $T$ has a noncentral $t$-distribution with $n+m-2$ degrees of freedom and noncentrality parameter $\delta_2 = \sqrt{nm/(n+m)}\,(\theta_1-\theta_2)/\sigma$. It is interesting to note that $\delta_1 = \sqrt{n}\,\theta_1/\sigma$ measures the deviation of $\theta_1$ from $\theta_1 = 0$ in units of the standard deviation $\sigma/\sqrt{n}$ of $\overline{X}$. The noncentrality parameter $\delta_2 = \sqrt{nm/(n+m)}\,(\theta_1-\theta_2)/\sigma$ is equal to the deviation of $\theta_1-\theta_2$ from $\theta_1-\theta_2 = 0$ in units of the standard deviation $\sigma\sqrt{(n+m)/(nm)}$ of $\overline{X}-\overline{Y}$.
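The standard deviation quoted for $\overline{X}-\overline{Y}$ follows in one line from the independence of the two samples:
\[
\operatorname{Var}(\overline{X}-\overline{Y}) = \frac{\sigma^2}{n} + \frac{\sigma^2}{m} = \sigma^2\,\frac{n+m}{nm},
\qquad \text{so} \qquad
\operatorname{sd}(\overline{X}-\overline{Y}) = \sigma\sqrt{\frac{n+m}{nm}}.
\]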
As in the last example, it is easy to write R code that evaluates power for this test. For a numerical illustration, assume that the common variance is $\theta_3 = 100$, $n = 20$, and $m = 15$. Suppose $\alpha = 0.05$ and we want to determine the power of the test to detect $\Delta = 5$, where $\Delta = \theta_1 - \theta_2$. In this case the critical value is $t_{0.025,33} = \texttt{qt(.975,33)} = 2.0345$ and the noncentrality parameter is $\delta_2 = 1.4639$. The power is computed as
1 - pt(2.0345,33,ncp=1.4639) + pt(-2.0345,33,ncp=1.4639) = 0.2954
Hence, the test has a 29.5% chance of detecting a difference in means of 5.
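The power computation above can be cross-checked without noncentral $t$ routines. The sketch below, in Python rather than the text's R, draws directly from the representation $T = (Z+\delta)/\sqrt{\chi^2_{33}/33}$ of the noncentral $t$; the variable names and replication count are illustrative choices, not from the text.

```python
import math
import random

random.seed(8303)

n, m, df = 20, 15, 33                  # sample sizes; df = n + m - 2
sigma = math.sqrt(100.0)               # common standard deviation (theta_3 = 100)
delta = math.sqrt(n * m / (n + m)) * 5 / sigma   # noncentrality for Delta = 5
tcrit = 2.0345                         # critical value t_{.025,33} from the text

# Monte Carlo draws from the noncentral t: T = (Z + delta) / sqrt(chi2_df / df)
reps = 100_000
rejections = 0
for _ in range(reps):
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))
    t = (z + delta) / math.sqrt(chi2 / df)
    if abs(t) >= tcrit:
        rejections += 1

print(round(delta, 4))      # noncentrality parameter, about 1.4639
print(rejections / reps)    # estimated power, near 0.2954
```

The Monte Carlo estimate agrees with the exact R computation to within simulation error.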
Remark 8.3.1. The one- and two-sample tests for normal means, presented in Examples 6.5.1 and 8.3.1, are the tests for normal means presented in most elementary statistics books. They are based on the assumption of normality. What if the underlying distributions are not normal? In that case, with finite variances, the $t$-test statistics for these situations are asymptotically correct. For example, consider the one-sample $t$-test. Suppose $X_1,\ldots,X_n$ are iid with a common nonnormal pdf that has mean $\theta_1$ and finite variance $\sigma^2$. The hypotheses remain the same, i.e., $H_0\colon \theta_1 = \theta_1'$ versus $H_1\colon \theta_1 \neq \theta_1'$. The $t$-test statistic, $T_n$, is given by
\[
T_n = \frac{\sqrt{n}\,(\overline{X}-\theta_1')}{S_n}, \tag{8.3.6}
\]
where $S_n$ is the sample standard deviation. Our critical region is $C_1 = \{|T_n| \geq t_{\alpha/2,n-1}\}$. Recall that $S_n \to \sigma$ in probability. Hence, by the Central Limit Theorem, under $H_0$,
\[
T_n = \frac{\sigma}{S_n}\cdot\frac{\sqrt{n}\,(\overline{X}-\theta_1')}{\sigma} \xrightarrow{D} Z, \tag{8.3.7}
\]
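This asymptotic correctness can be seen in a small simulation. The Python sketch below is illustrative, not from the text: it uses Laplace-distributed data (one convenient nonnormal choice, simulated as a difference of two standard exponentials) with $n = 50$ and the approximate table value $t_{0.025,49} \approx 2.0096$, and estimates the empirical level of the two-sided $t$-test under $H_0$.

```python
import math
import random

random.seed(8306)

n = 50
tcrit = 2.0096       # approximate t_{.025,49} (an assumed table value)
theta1_prime = 0.0   # H0 mean; Laplace(0,1) data have mean 0

reps = 50_000
rejections = 0
for _ in range(reps):
    # Laplace(0,1) draws via a difference of two standard exponentials
    x = [random.expovariate(1.0) - random.expovariate(1.0) for _ in range(n)]
    xbar = sum(x) / n
    s = math.sqrt(sum((xi - xbar) ** 2 for xi in x) / (n - 1))
    t_stat = math.sqrt(n) * (xbar - theta1_prime) / s
    if abs(t_stat) >= tcrit:
        rejections += 1

print(rejections / reps)   # empirical level, close to the nominal 0.05
```

Even though the data are not normal, the empirical rejection rate under $H_0$ lands near the nominal $\alpha = 0.05$, as the limit in (8.3.7) predicts for moderate $n$.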