126 Multivariate Distributions

It follows by the linearity of expectation, Theorem 2.1.1, that the covariance of
X and Y can also be expressed as

    cov(X, Y) = E(XY − μ₂X − μ₁Y + μ₁μ₂)
              = E(XY) − μ₂E(X) − μ₁E(Y) + μ₁μ₂
              = E(XY) − μ₁μ₂,                        (2.5.2)

which is often easier to compute than using the definition, (2.5.1).
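As a quick numerical sanity check (not part of the text), the shortcut form (2.5.2) and the definitional form of the covariance can be compared on a small joint pmf; the pmf below is an arbitrary illustrative choice.

```python
# Check that E[(X - mu1)(Y - mu2)] equals E(XY) - mu1*mu2
# for an arbitrary small joint pmf on {0, 1} x {0, 1}.
pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}

mu1 = sum(p * x for (x, y), p in pmf.items())   # E(X)
mu2 = sum(p * y for (x, y), p in pmf.items())   # E(Y)

# Definitional form (2.5.1)
cov_def = sum(p * (x - mu1) * (y - mu2) for (x, y), p in pmf.items())
# Shortcut form (2.5.2)
cov_short = sum(p * x * y for (x, y), p in pmf.items()) - mu1 * mu2

assert abs(cov_def - cov_short) < 1e-12
```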
The measure that we seek is a standardized (unitless) version of the covariance.

Definition 2.5.2. If each of σ₁ and σ₂ is positive, then the correlation coefficient
between X and Y is defined by

    ρ = E[(X − μ₁)(Y − μ₂)] / (σ₁σ₂) = cov(X, Y) / (σ₁σ₂).    (2.5.3)


It should be noted that the expected value of the product of two random variables
is equal to the product of their expectations plus their covariance; that is,
E(XY) = μ₁μ₂ + cov(X, Y) = μ₁μ₂ + ρσ₁σ₂.
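Because ρ is standardized, it is unitless: rescaling a variable by a positive affine map aX + b leaves ρ unchanged. A short simulation sketch can illustrate this; the sample size, seed, and distributions below are arbitrary illustrative choices, not from the text.

```python
import random

random.seed(0)
# Simulate correlated pairs: Y = X + noise.
xs = [random.gauss(0, 1) for _ in range(10000)]
ys = [x + random.gauss(0, 1) for x in xs]

def corr(us, vs):
    # Sample correlation coefficient: cov(U, V) / (s_U * s_V).
    n = len(us)
    mu, mv = sum(us) / n, sum(vs) / n
    cov = sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / n
    su = (sum((u - mu) ** 2 for u in us) / n) ** 0.5
    sv = (sum((v - mv) ** 2 for v in vs) / n) ** 0.5
    return cov / (su * sv)

r1 = corr(xs, ys)
r2 = corr([100 * x + 7 for x in xs], ys)  # change of units in X, a > 0
assert abs(r1 - r2) < 1e-9                # rho is unchanged
```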
As illustrations, we present two examples. The first is for a discrete model,
while the second concerns a continuous model.


Example 2.5.1. Reconsider the random vector (X₁, X₂) of Example 2.1.1, where a
fair coin is flipped three times and X₁ is the number of heads on the first two flips
while X₂ is the number of heads on all three flips. Recall that Table 2.1.1 contains
the marginal distributions of X₁ and X₂. By symmetry of these pmfs, we have
E(X₁) = 1 and E(X₂) = 3/2. To compute the correlation coefficient of (X₁, X₂),
we next sketch the computation of the required moments:


    E(X₁²) = 1 · (1/2) + 2² · (1/4) = 3/2   ⇒   σ₁² = 3/2 − 1² = 1/2;

    E(X₂²) = 1 · (3/8) + 4 · (3/8) + 9 · (1/8) = 3   ⇒   σ₂² = 3 − (3/2)² = 3/4;

    E(X₁X₂) = 1 · 1 · (2/8) + 1 · 2 · (2/8) + 2 · 2 · (1/8) + 2 · 3 · (1/8) = 2
            ⇒   cov(X₁, X₂) = 2 − 1 · (3/2) = 1/2,

from which it follows that ρ = (1/2)/(√(1/2) · √(3/4)) ≈ 0.816.
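The moments above can be recomputed exactly with rational arithmetic. The joint pmf used below is reconstructed from the coin-flip description of the example (the third flip independently adds 0 or 1 head to X₁), so this is a cross-check rather than the text's own computation.

```python
from fractions import Fraction as F

# Joint pmf of (X1, X2): X1 = heads on first two flips,
# X2 = heads on all three flips of a fair coin.
pmf = {(0, 0): F(1, 8), (0, 1): F(1, 8), (1, 1): F(2, 8),
       (1, 2): F(2, 8), (2, 2): F(1, 8), (2, 3): F(1, 8)}

def E(g):
    # Expectation of g(X1, X2) under the joint pmf.
    return sum(p * g(x1, x2) for (x1, x2), p in pmf.items())

mu1, mu2 = E(lambda a, b: a), E(lambda a, b: b)   # 1 and 3/2
var1 = E(lambda a, b: a * a) - mu1 ** 2           # 1/2
var2 = E(lambda a, b: b * b) - mu2 ** 2           # 3/4
cov = E(lambda a, b: a * b) - mu1 * mu2           # 1/2
rho = float(cov) / (float(var1) ** 0.5 * float(var2) ** 0.5)  # about 0.816
```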

Example 2.5.2. Let the random variables X and Y have the joint pdf

    f(x, y) = { x + y,  0 < x < 1, 0 < y < 1
              { 0,      elsewhere.

We next compute the correlation coefficient ρ of X and Y. Now

    μ₁ = E(X) = ∫₀¹ ∫₀¹ x(x + y) dx dy = 7/12
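As a numerical cross-check of this integral, a midpoint Riemann sum over the unit square converges to 7/12; the grid size below is an arbitrary illustrative choice.

```python
# Midpoint Riemann sum for E(X) = integral of x * (x + y)
# over the unit square, where f(x, y) = x + y is the joint pdf.
n = 400           # grid points per axis (illustrative choice)
h = 1.0 / n       # cell width

mu1 = 0.0
for i in range(n):
    x = (i + 0.5) * h      # midpoint in x
    for j in range(n):
        y = (j + 0.5) * h  # midpoint in y
        mu1 += x * (x + y) * h * h

assert abs(mu1 - 7 / 12) < 1e-4
```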