

30.12 PROPERTIES OF JOINT DISTRIBUTIONS


One particularly useful consequence of its definition is that the covariance of two independent variables, X and Y, is zero. It immediately follows from (30.134) that their correlation is also zero, and this justifies the use of the term ‘uncorrelated’ for two such variables. To show this extremely important property we first note that


\begin{align*}
\operatorname{Cov}[X,Y] &= E[(X-\mu_X)(Y-\mu_Y)] \\
&= E[XY - \mu_X Y - \mu_Y X + \mu_X\mu_Y] \\
&= E[XY] - \mu_X E[Y] - \mu_Y E[X] + \mu_X\mu_Y \\
&= E[XY] - \mu_X\mu_Y. \tag{30.135}
\end{align*}

Now, if X and Y are independent then E[XY] = E[X]E[Y] = μ_X μ_Y and so Cov[X, Y] = 0. It is important to note that the converse of this result is not necessarily true; two variables dependent on each other can still be uncorrelated. In other words, it is possible (and not uncommon) for two variables X and Y to be described by a joint distribution f(x, y) that cannot be factorised into a product of the form g(x)h(y), but for which Corr[X, Y] = 0. Indeed, from the definition (30.133), we see that for any joint distribution f(x, y) that is symmetric in x about μ_X (or similarly in y) we have Corr[X, Y] = 0.
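As a simple illustration, suppose that X is distributed symmetrically about zero, so that μ_X = 0 and E[X³] = 0, and let Y = X². Then Y is completely determined by X, yet from (30.135)
\[
\operatorname{Cov}[X,Y] = E[XY] - \mu_X\mu_Y = E[X^3] - 0 = 0,
\]
and hence Corr[X, Y] = 0: the two variables are uncorrelated despite being strongly dependent.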


We have already asserted that if the correlation of two random variables is positive (negative) they are said to be positively (negatively) correlated. We have also stated that the correlation lies between −1 and +1. The terminology suggests that if the two RVs are identical (i.e. X = Y) then they are completely correlated and that their correlation should be +1. Likewise, if X = −Y then the variables are completely anticorrelated and their correlation should be −1. Values of the correlation between these extremes show the existence of some degree of correlation. In fact, it is not necessary that X = Y for Corr[X, Y] = 1; it is sufficient that Y is a linear function of X, i.e. Y = aX + b (with a positive). If a is negative then Corr[X, Y] = −1. To show this we first note that μ_Y = aμ_X + b. Now


\[
Y = aX + b = aX + \mu_Y - a\mu_X \quad\Rightarrow\quad Y - \mu_Y = a(X - \mu_X),
\]
and so, using the definition of the covariance (30.133),
\[
\operatorname{Cov}[X,Y] = aE[(X-\mu_X)^2] = a\sigma_X^2.
\]

It follows from the properties of the variance (subsection 30.5.3) that σ_Y = |a|σ_X and so, using the definition (30.134) of the correlation,
\[
\operatorname{Corr}[X,Y] = \frac{a\sigma_X^2}{|a|\sigma_X^2} = \frac{a}{|a|},
\]
which is the stated result.
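For example, taking Y = 3X + 2 (a choice made purely for illustration) gives Cov[X, Y] = 3σ_X² and σ_Y = 3σ_X, so that
\[
\operatorname{Corr}[X,Y] = \frac{3\sigma_X^2}{3\sigma_X^2} = +1,
\]
whilst taking Y = −3X + 2 instead gives Corr[X, Y] = −1, whatever the distribution of X.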


It should be noted that, even if the possibilities of X and Y being non-zero are mutually exclusive, Corr[X, Y] need not have value ±1.
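As an illustrative example, suppose that X and Y each take only the values 0 and 1, with joint probabilities Pr(X = 1, Y = 0) = Pr(X = 0, Y = 1) = Pr(X = 0, Y = 0) = 1/3 and Pr(X = 1, Y = 1) = 0, so that X and Y are never non-zero simultaneously. Then μ_X = μ_Y = 1/3, E[XY] = 0 and σ_X² = σ_Y² = 1/3 − 1/9 = 2/9, and so from (30.134)
\[
\operatorname{Corr}[X,Y] = \frac{E[XY]-\mu_X\mu_Y}{\sigma_X\sigma_Y} = \frac{-\tfrac{1}{9}}{\tfrac{2}{9}} = -\tfrac{1}{2},
\]
which is negative, as might be expected, but is certainly not equal to −1.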
