progressed? The null hypothesis is that the average number seen per transect per day is independent of the day order.
Note that factor DAY contains three levels: the first day, the second day, and the third day. The last contains only six replicates, in contrast to the eight of the first two days. This makes the point that the arithmetic of one-factor ANOVA does not require that the design be balanced (i.e. that the number of replicates is the same for all levels). The analysis can be run without balance, although the result must then be interpreted more cautiously. Balance should always be sought, if not necessarily always attained.
The analysis of Box 16.1 leads to an F ratio (named for R.A. Fisher, who invented the analysis of variance) testing the null hypothesis. Appendix 1 gives its critical values. The probability of 20% is too high to call the null hypothesis into serious question. That value is the probability of drawing by chance three daily samples as disparate as, or more disparate than, those we did draw, when there is no difference in density or sightability between days. We would require a probability value of around 10% before we became suspicious of the null hypothesis, and one below 5% before we rejected the null hypothesis in favor of some alternative explanation.
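
The same F ratio and its probability can be checked numerically. The sketch below is a minimal illustration in Python using scipy; the counts are placeholders invented for the example (they are not the Box 16.1 data), arranged as three daily samples of eight, eight, and six transects to mirror the unbalanced design.

```python
# One-factor ANOVA on an unbalanced design: three days with 8, 8, and 6
# transect counts. The numbers are illustrative placeholders only.
from scipy import stats

day1 = [12, 15, 9, 11, 14, 10, 13, 12]   # 8 replicates
day2 = [11, 13, 10, 14, 12, 9, 15, 11]   # 8 replicates
day3 = [10, 12, 13, 9, 11, 14]           # 6 replicates (unbalanced)

# F ratio and the probability of drawing samples at least this disparate
# by chance when the null hypothesis (no difference between days) is true.
f_ratio, p_value = stats.f_oneway(day1, day2, day3)

# Degrees of freedom: (number of levels - 1) and (total replicates - levels).
df_between = 3 - 1
df_within = (8 + 8 + 6) - 3

# Critical value of F at the conventional 5% level (cf. Appendix 1).
f_crit = stats.f.ppf(0.95, df_between, df_within)

print(f"F = {f_ratio:.2f}, P = {p_value:.3f}, 5% critical F = {f_crit:.2f}")
```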

16.6.2 Two-factor ANOVA

A two-factor ANOVA tests simultaneously for an effect of two separate factors on a response variable and for an interaction between them. Even though the arithmetic is simply a generalization of the one-factor case, the two-factor ANOVA differs in kind from the one-factor because of the interaction term. There are also a number of other differences, but we will get to them after we have considered an example.
Data for a two-factor ANOVA are laid out as a two-dimensional matrix, with the rows representing the levels of one factor and the columns the levels of the other. These are interchangeable. Each cell of the matrix contains the replicate readings of the response variable, whatever it is. Table 16.1 outlines symbolically and formally the calculation of the sums of squares and degrees of freedom for the four components into which the total sum of squares is split: the effect on the response variable of the factor represented by the rows, the effect of the factor represented by the columns, the effect of the interaction between them (of which more soon), and the remaining or residual sum of squares, which represents the average intrinsic variation within each treatment cell and which therefore is not ascribable to either factor or their interaction.
Box 16.2 provides a set of data amenable to a two-factor ANOVA. As with the one-factor example, they are real data from an aerial survey whose purpose was to establish whether the counts obtained on a given day were influenced by the disturbance or habituation imparted by the survey flying of previous days. However, two species were counted this time, the red kangaroo and the eastern gray kangaroo, and since they might well react in differing ways to the sound of a low-flying aircraft, their counts

Table 16.1 Calculations of sums of squares for two-factor ANOVA.

Source of variation    Sum of squares                                          d.f.
ROW effect             (1/nc)∑Ti^2 − (1/nrc)T^2                                r − 1
COLUMN effect          (1/nr)∑Tj^2 − (1/nrc)T^2                                c − 1
ROW × COLUMN effect    (1/n)∑Tij^2 − (1/nc)∑Ti^2 − (1/nr)∑Tj^2 + (1/nrc)T^2    (r − 1)(c − 1)
Residual               ∑Xijk^2 − (1/n)∑Tij^2                                   rc(n − 1)
Total                  ∑Xijk^2 − (1/nrc)T^2                                    rcn − 1

Tij, total of replicates in the cell at the ith row and jth column; Ti, total of replicates in the ith row; Tj, total of replicates in the jth column; T, grand total; r, number of rows; c, number of columns; n, number of replicates per cell.
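
To make the arithmetic of Table 16.1 concrete, here is a minimal sketch in Python (numpy) that applies those formulas to an array of r row levels, c column levels, and n replicates per cell. The data and dimensions are arbitrary placeholders chosen for illustration; they are not the Box 16.2 counts.

```python
# Sums of squares for a balanced two-factor ANOVA, computed directly from
# the formulas of Table 16.1. `data` has shape (r, c, n): r row levels,
# c column levels, n replicates per cell. Values are random placeholders.
import numpy as np

rng = np.random.default_rng(1)
r, c, n = 3, 2, 4
data = rng.poisson(lam=10, size=(r, c, n)).astype(float)

T = data.sum()                 # grand total
Ti = data.sum(axis=(1, 2))     # row totals, one per row level
Tj = data.sum(axis=(0, 2))     # column totals, one per column level
Tij = data.sum(axis=2)         # cell totals

CF = T**2 / (n * r * c)        # correction factor, (1/nrc)T^2

ss_row = (Ti**2).sum() / (n * c) - CF
ss_col = (Tj**2).sum() / (n * r) - CF
ss_int = ((Tij**2).sum() / n - (Ti**2).sum() / (n * c)
          - (Tj**2).sum() / (n * r) + CF)
ss_res = (data**2).sum() - (Tij**2).sum() / n
ss_tot = (data**2).sum() - CF

# Degrees of freedom from Table 16.1.
df = {"row": r - 1, "col": c - 1, "interaction": (r - 1) * (c - 1),
      "residual": r * c * (n - 1), "total": r * c * n - 1}

# The four component sums of squares should add back to the total.
assert np.isclose(ss_row + ss_col + ss_int + ss_res, ss_tot)
print(ss_row, ss_col, ss_int, ss_res, ss_tot, df)
```

The final check, that the row, column, interaction, and residual sums of squares add up to the total, follows directly from the formulas and is a useful arithmetic safeguard when the calculation is done by hand.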
