In equation (A.11), for each observation, the deviation of the first component from its mean is multiplied by the deviation of the second component from its mean. The sample covariance is then the average of all joint deviations. Some tedious calculations lead to an equivalent representation of equation (A.11),


$$
s_{xy} = \operatorname{cov}(x,y) = \frac{1}{n}\sum_{i=1}^{n} x_i y_i - \bar{x}\,\bar{y},
$$
which is a transformation analogous to the one already presented for
variances.
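
To make the equivalence concrete, here is a minimal Python sketch (Python and the small data set are used purely for illustration; the book itself presents no code) that computes the sample covariance both ways, once from the joint deviations as in equation (A.11) and once from the shortcut form above:

```python
# A minimal sketch verifying that the definitional form of equation (A.11)
# and the shortcut form above yield the same sample covariance.
# The data values are made up for illustration.

def cov_definition(x, y):
    """Average of the joint deviations from the means, as in (A.11)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    return sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / n

def cov_shortcut(x, y):
    """Equivalent form: mean of the products minus product of the means."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    return sum(xi * yi for xi, yi in zip(x, y)) / n - x_bar * y_bar

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 1.0, 4.0, 3.0]
print(cov_definition(x, y))   # 0.75
print(cov_shortcut(x, y))     # 0.75, identical up to rounding
```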
The covariance of independent variables is equal to zero. The converse,
however, is not generally true; that is, one cannot automatically conclude
independence from zero covariance. This statement is one of the most
important results in statistics and probability theory. Technically, if the
covariance of x and y is zero, the two variables are said to be uncorrelated.
For any value of cov(x,y) different from zero, the variables are correlated.
Since two variables with zero covariance are uncorrelated but not automati-
cally independent, it is obvious that independence is a stricter criterion than
no correlation.^8
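
Before turning to the figure, a short numerical sketch (again Python, with hypothetical values, purely for illustration) shows how the cancellation described in footnote 8 can occur. With y defined as the square of x over values of x symmetric about zero, y is a deterministic function of x, so the two variables are clearly dependent, and yet their sample covariance is exactly zero:

```python
# A minimal sketch of footnote 8's point: y = x^2 is a deterministic
# function of x, so the variables are clearly dependent, yet over
# x-values symmetric about zero the joint deviations cancel and the
# sample covariance is exactly zero. The values are hypothetical.

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [xi ** 2 for xi in x]        # y is completely determined by x

n = len(x)
x_bar = sum(x) / n               # 0.0 by symmetry
y_bar = sum(y) / n               # 2.0
cov_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / n
print(cov_xy)                    # 0.0: uncorrelated, but not independent
```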
This concept is exhibited in Figure A.5. In the plot, the two sets representing correlated and uncorrelated variables are separated by the dashed oval: inside it we have the uncorrelated variables, while the correlated variables lie outside. As the dotted circle shows, the set of independent variables is completely contained within the dashed oval of uncorrelated variables. The complementary set outside the dotted circle (i.e., the dependent variables) contains all of the correlated variables as well as part of the uncorrelated ones. Since the dotted circle lies entirely inside the dashed oval, we see that independence is a stricter requirement than uncorrelatedness.
The concept behind Figure A.5, zero covariance despite dependence, can be demonstrated by a simple example. Consider two hypothetical securities, x and y, with the payoff pattern given in Table A.2. In the left column below y, we have the payoff values of security y, while the top row gives the payoff values of security x. Inside the table are the joint frequencies of the pairs (x,y). As we can see, each particular value of x occurs in combination with only one particular value of y; thus, the two variables (i.e., the payoffs of x and y) are dependent. We compute the means of the two variables to be


(^8) The reason is that the terms in the covariance sum can cancel each other out even though the variables are not independent.