Statistical Methods for Psychology

Looking at our original data in Table 17.1, we note that in the low fault condition
153 people were found guilty and 24 were found not guilty. Thus, the conditional odds
of being judged guilty given that the victim was seen as low on Fault are 153/24 =
6.3750. (This can be read to mean that in the low fault group the odds in favor of being
found guilty are 6.3750:1.) For every person in that group who is found not guilty, 6.375
are found guilty. These are equivalent ways of saying the same thing. The conditional odds
of being found guilty given that the victim is seen as having a high degree of fault are only
105/76 = 1.3816.
If there had been no interaction between Fault and Verdict, the odds of being found
guilty would have been the same in the two Fault conditions. Therefore, the ratio
of the two odds would have been approximately 1.00. Instead, the ratio of the two con-
ditional odds, the odds ratio (Ω), is 6.3750/1.3816 = 4.6142. The odds that a defen-
dant will be found guilty in the low fault condition are about 4.6 times greater than
in the high fault condition. The “blame the victim” strategy, whether fair or not, seems
to work.
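
These conditional odds and their ratio are easy to verify directly. The short Python sketch below is our own illustration (the variable names do not come from Pugh's study); it simply reproduces the arithmetic from the cell frequencies in Table 17.1.

# Cell frequencies from Table 17.1 (Pugh, 1983)
guilty_low,  not_guilty_low  = 153, 24     # victim seen as low on Fault
guilty_high, not_guilty_high = 105, 76     # victim seen as high on Fault

odds_low  = guilty_low / not_guilty_low    # conditional odds of guilty given low fault  = 6.3750
odds_high = guilty_high / not_guilty_high  # conditional odds of guilty given high fault, about 1.3816

odds_ratio = odds_low / odds_high          # about 4.61; would be near 1.00 if Fault and Verdict were independent
print(odds_low, odds_high, odds_ratio)
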
An important feature of the odds ratio is that it is independent of the size of the sam-
ple, whereas χ² is not. A second advantage is that within the context of a 2 × 2 table, a
test on the odds ratio would be equivalent to a likelihood ratio test of independence.
A third advantage of Ω is that its magnitude will not be artificially affected by the pres-
ence of unequal marginal distributions. In other words, if we doubled the number of
cases in the high fault condition (but still held other things constant), χ² (either Pear-
son's or likelihood ratio) and phi would change. The odds ratio (Ω), however, would
not be affected.
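
This invariance is easy to demonstrate numerically. In the sketch below (again our own illustration; pearson_chi2, phi, and odds_ratio are helper functions we have written for this example, not routines from any package), doubling both cells of the high fault row changes Pearson's χ² and phi, but leaves Ω untouched.

def pearson_chi2(table):
    """Pearson's chi-square for a 2 x 2 table given as [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (obs - expected) ** 2 / expected
    return chi2

def phi(table):
    """Phi coefficient: sqrt(chi-square / N) for a 2 x 2 table."""
    n = sum(sum(row) for row in table)
    return (pearson_chi2(table) / n) ** 0.5

def odds_ratio(table):
    """Odds ratio for a 2 x 2 table [[a, b], [c, d]]: (a/b) / (c/d)."""
    (a, b), (c, d) = table
    return (a / b) / (c / d)

original = [[153, 24], [105, 76]]    # rows: low fault, high fault
doubled  = [[153, 24], [210, 152]]   # high fault cells doubled, low fault unchanged

for table in (original, doubled):
    # chi-square and phi shift when the high fault row is doubled; the odds ratio does not
    print(round(pearson_chi2(table), 2), round(phi(table), 3), round(odds_ratio(table), 4))
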

17.5 Treatment Effects (Lambda)


As we have already seen, log-linear models have a nice parallel with the analysis of variance,
and that parallelism extends to the treatment effects. In the analysis of variance, treatment
effects are denoted by terms like μ, α_i, β_j, and αβ_ij, whereas in log-linear models we denote
these effects by λ, λ_i^V, λ_j^F, and λ_ij^VF. As you know, log-linear models work with the natural logs
of frequencies rather than with the frequencies themselves.
In Section 17.3 we saw that the independence model (the model without the inter-
action term) did not fit the data from Pugh's study (χ² = 37.35). To model the data ad-
equately, we are going to have to use a model that contains the interaction term:

ln(F_ij) = λ + λ_i^V + λ_j^F + λ_ij^VF

Remember that for the fully saturated model the observed
and expected frequencies are the same. Thus we will start with the logs of these frequen-
cies as the raw data, as shown in Table 17.7. Notice that the table also contains the row and
column marginal means and the grand mean.




Table 17.7  Natural logs of cell frequencies

                          Verdict
  Fault        Guilty      Not Guilty    Marginals
  Low          5.03043     3.17805       4.10424
  High         4.65396     4.33073       4.49235
  Marginals    4.84220     3.75439       4.29830
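
Because the saturated model reproduces the observed frequencies exactly, its parameter estimates can be obtained from Table 17.7 in the same way that effects are obtained in the analysis of variance: λ is the grand mean of the log frequencies, the λ^F and λ^V terms are deviations of the row and column marginal means from that grand mean, and λ^VF is whatever deviation remains in each cell. The following minimal Python sketch of that arithmetic is our own illustration; the variable names are not notation from the text.

import math

# Observed cell frequencies from Table 17.1; rows = Fault (low, high), columns = Verdict (guilty, not guilty)
freq = [[153, 24], [105, 76]]

# Natural logs of the cell frequencies (the body of Table 17.7)
logf = [[math.log(f) for f in row] for row in freq]

grand = sum(sum(row) for row in logf) / 4                        # lambda: grand mean, about 4.29830
row_means = [sum(row) / 2 for row in logf]                       # Fault marginal means
col_means = [(logf[0][j] + logf[1][j]) / 2 for j in range(2)]    # Verdict marginal means

lam_fault   = [m - grand for m in row_means]                     # lambda^F estimates
lam_verdict = [m - grand for m in col_means]                     # lambda^V estimates
lam_fv = [[logf[i][j] - (grand + lam_fault[i] + lam_verdict[j])  # lambda^VF estimates
           for j in range(2)] for i in range(2)]

print(round(grand, 5), [round(x, 5) for x in lam_fault], [round(x, 5) for x in lam_verdict])
print([[round(x, 5) for x in row] for row in lam_fv])

Notice that, because each estimate is a deviation from the grand mean, each set of estimates sums to zero across its subscripts, just as treatment effects do in the analysis of variance.
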
