This result leads immediately to an important generalization. Consider a
function of X and Y in the form g(X)h(Y) for which an expectation exists.
Then, if X and Y are independent,

\[
E\{g(X)h(Y)\} = E\{g(X)\}\,E\{h(Y)\}. \tag{4.28}
\]
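As a purely illustrative aside (not part of the original text), Equation (4.28) can be checked by simulation; the distributions and the functions g and h below are arbitrary choices.

import numpy as np

# Monte Carlo check of E{g(X)h(Y)} = E{g(X)}E{h(Y)} for independent X and Y.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(1.0, 2.0, size=n)      # X ~ N(1, 4), an arbitrary choice
y = rng.uniform(0.0, 3.0, size=n)     # Y ~ U(0, 3), generated independently of X

g = lambda t: t**2                    # g(X) = X^2
h = lambda t: np.cos(t)               # h(Y) = cos(Y)

lhs = np.mean(g(x) * h(y))            # estimate of E{g(X)h(Y)}
rhs = np.mean(g(x)) * np.mean(h(y))   # estimate of E{g(X)}E{h(Y)}
print(lhs, rhs)                       # the two estimates agree up to sampling error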
When the correlation coefficient of two random variables vanishes, we say
they are uncorrelated. It should be carefully pointed out that what we have
shown is that independence implies zero correlation. The converse, however, is
not true. This point is more fully discussed in what follows.
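As a preview, here is a minimal illustration of why the converse fails; it is a sketch with arbitrarily chosen variables, not the Example 4.10 referred to later in the text. Take X uniformly distributed on (−1, 1) and Y = X². By symmetry, E{X} = E{X³} = 0, so the covariance of X and Y (and hence their correlation coefficient) vanishes, yet Y is completely determined by X.

import numpy as np

# Uncorrelated but dependent: X ~ U(-1, 1), Y = X^2.
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
y = x**2

cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)   # sample covariance, close to 0
rho_xy = np.corrcoef(x, y)[0, 1]                    # sample correlation coefficient, close to 0
print(cov_xy, rho_xy)
# Yet Y = X^2 exactly, so X and Y are certainly not independent.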
The covariance or the correlation coefficient is of great importance in the
analysis of two random variables. It is a measure of their linear interdependence
in the sense that its value is a measure of the accuracy with which one random
variable can be approximated by a linear function of the other. In order to see
this, let us consider the problem of approximating a random variable X by a
linear function of a second random variable Y, aY + b, where a and b are
chosen so that the mean-square error e, defined by
\[
e = E\{[X - (aY + b)]^2\}, \tag{4.29}
\]
is minimized. Upon taking partial derivatives of e with respect to a and b and
setting them to zero, straightforward calculations show that this minimum is
attained when
\[
a = \rho\,\frac{\sigma_X}{\sigma_Y}
\]
and
\[
b = m_X - a\,m_Y.
\]
Substituting these values into Equation (4.29) then gives σ_X²(1 − ρ²) as the
minimum mean-square error. We thus see that an exact fit in the mean-square
sense is achieved when |ρ| = 1, and the linear approximation is the worst when
ρ = 0. More specifically, when ρ = +1, the random variables X and Y are said
to be positively perfectly correlated in the sense that the values they assume fall
on a straight line with positive slope; they are negatively perfectly correlated
when ρ = −1 and their values form a straight line with negative slope. These
two extreme cases are illustrated in Figure 4.3. The value of |ρ| decreases as
scatter about these lines increases.
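The 'straightforward calculations' referred to above are not spelled out in the text; one way to carry them out is sketched below, writing m_X, m_Y for the means, σ_X, σ_Y for the standard deviations, and ρ for the correlation coefficient of X and Y, as in the earlier definitions. Setting the partial derivative of e in Equation (4.29) with respect to b to zero gives

\[
\frac{\partial e}{\partial b} = -2\,E\{X - aY - b\} = 0
\quad\Longrightarrow\quad b = m_X - a\,m_Y,
\]

and doing the same with respect to a, using this value of b, gives

\[
\frac{\partial e}{\partial a} = -2\,E\{Y(X - aY - b)\} = 0
\quad\Longrightarrow\quad \operatorname{cov}(X, Y) = a\,\sigma_Y^2
\quad\Longrightarrow\quad a = \rho\,\frac{\sigma_X}{\sigma_Y}.
\]

With these values, X − (aY + b) = (X − m_X) − a(Y − m_Y), and hence the minimum mean-square error is

\[
e_{\min} = \sigma_X^2 - 2a\operatorname{cov}(X, Y) + a^2\sigma_Y^2 = \sigma_X^2(1 - \rho^2),
\]

in agreement with the value quoted above.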
Let us again stress the fact that the correlation coefficient measures only the
linear interdependence between two random variables. It is by no means a
general measure of interdependence between X and Y. Thus, ρ = 0 does not
imply independence of the random variables. In fact, as Example 4.10 shows, the