Logistic Regression: A Self-learning Text, Third Edition (Statistics in the Health Sciences)

How the LR test works:

If X3 makes a large contribution,
then L̂2 much greater than L̂1

If L̂2 much larger than L̂1, then

    L̂1 / L̂2 ≈ 0

[Note. ln_e(fraction) = negative]

⇒ ln(L̂1 / L̂2) ≈ ln(0) = −∞

⇒ LR = −2 ln(L̂1 / L̂2) ≈ +∞

Thus, X3 highly significant ⇒ LR large and positive.


Algebraically, this difference can also be written as −2 times the natural log of the ratio of L̂1 divided by L̂2, shown on the right-hand side of the equation here. This latter version of the test statistic is a ratio of maximized likelihood values; this explains why the test is called the likelihood ratio test.
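Written out, this equivalence uses only the log rule ln a − ln b = ln(a/b):

\[
-2 \ln \hat{L}_1 - \left(-2 \ln \hat{L}_2\right)
  = -2\left(\ln \hat{L}_1 - \ln \hat{L}_2\right)
  = -2 \ln\!\left(\frac{\hat{L}_1}{\hat{L}_2}\right)
\]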

The likelihood ratio statistic for this example
has approximately a chi-square distribution if
the study size is large. The degrees of freedom
for the test is one because, when comparing
Models 1 and 2, only one parameter, namely,
β3, is being set equal to zero under the null
hypothesis.
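As an illustration of how such a test could be carried out, here is a minimal sketch in Python. It is not from the text: the data are simulated, and it assumes the statsmodels and scipy libraries for fitting the two nested logistic models and for the chi-square tail probability.

import numpy as np
import scipy.stats as stats
import statsmodels.api as sm

# Simulated data: disease status D and predictors X1, X2, X3
rng = np.random.default_rng(0)
n = 500
X1, X2, X3 = rng.normal(size=(3, n))
logit_p = -0.5 + 0.4 * X1 + 0.3 * X2 + 1.0 * X3   # X3 truly contributes
D = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Model 1 (reduced): X1 and X2 only -- beta3 = 0 under H0
fit1 = sm.Logit(D, sm.add_constant(np.column_stack([X1, X2]))).fit(disp=0)

# Model 2 (full): X1, X2, and X3
fit2 = sm.Logit(D, sm.add_constant(np.column_stack([X1, X2, X3]))).fit(disp=0)

# LR = -2 ln L1 - (-2 ln L2) = -2 ln(L1 / L2)
LR = -2 * (fit1.llf - fit2.llf)

# Compare with a chi-square distribution with df = 1
# (one parameter, beta3, set to zero under H0)
p_value = stats.chi2.sf(LR, df=1)
print(f"LR = {LR:.2f}, p = {p_value:.4g}")

With the strong simulated effect of X3 here, LR should come out large and the p-value very small, matching the discussion below.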

We now describe how the likelihood ratio test works and why the test statistic is approximately chi square. We consider what the value of the test statistic would be if the additional variable X3 makes an extremely large contribution to the risk of disease over that already contributed by X1 and X2. Then, it follows that the maximized likelihood value L̂2 is much larger than the maximized likelihood value L̂1.

If L̂2 is much larger than L̂1, then the ratio L̂1 divided by L̂2 becomes a very small fraction; that is, this ratio approaches 0.

Now the natural log of any fraction between 0 and 1 is a negative number. As this fraction approaches 0, the log of the fraction, which is negative, approaches the log of 0, which is −∞.

If we multiply the log likelihood ratio by −2, we then get a number that approaches +∞. Thus, the likelihood ratio statistic for a highly significant X3 variable is large and positive and approaches +∞. This is exactly the type of result expected for a chi-square statistic.
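A quick numerical sketch of this limiting behavior (the ratio values below are arbitrary, chosen only to show the trend):

import numpy as np

# As the likelihood ratio L1/L2 shrinks toward 0,
# LR = -2 ln(L1/L2) grows without bound.
for ratio in [0.5, 0.1, 0.01, 1e-6, 1e-12]:
    print(f"L1/L2 = {ratio:<8g}  LR = {-2 * np.log(ratio):8.2f}")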

EXAMPLE (continued)

Ratio of likelihoods

−2 ln L̂1 − (−2 ln L̂2) = −2 ln (L̂1 / L̂2)

LR approximate χ² variable with df = 1 if n large

