2.4. Independent Random Variables

Since $X + Y \le 4$, it would seem that $X$ and $Y$ are dependent. To see that this is
true by definition, we first find the marginal pmfs, which are:

\[
p_X(x) = \frac{\binom{10}{x}\binom{15}{4-x}}{\binom{25}{4}}, \quad 0 \le x \le 4;
\qquad
p_Y(y) = \frac{\binom{8}{y}\binom{17}{4-y}}{\binom{25}{4}}, \quad 0 \le y \le 4.
\]
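
Both marginals are hypergeometric pmfs, so they can be tabulated directly in R; the text does not show this step, but a minimal sketch using the built-in dhyper function is:

    # dhyper(x, m, n, k) = choose(m, x) * choose(n, k - x) / choose(m + n, k)
    x <- 0:4
    pX <- dhyper(x, m = 10, n = 15, k = 4)   # C(10,x) C(15,4-x) / C(25,4)
    pY <- dhyper(x, m = 8,  n = 17, k = 4)   # C(8,y)  C(17,4-y) / C(25,4)
    round(rbind(pX, pY), 4)                  # each row sums to 1, as a pmf must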

To show dependence, we need to find only one point in the support of $(X, Y)$ where
the joint pmf does not factor into the product of the marginal pmfs. Suppose we
select the point $x = 1$ and $y = 1$. Then, using R for calculation, we compute (to 4
places):


\begin{align*}
p(1,1) &= 10 \cdot 8 \cdot \binom{7}{2} \Big/ \binom{25}{4} = 0.1328 \\
p_X(1) &= 10 \binom{15}{3} \Big/ \binom{25}{4} = 0.3597 \\
p_Y(1) &= 8 \binom{17}{3} \Big/ \binom{25}{4} = 0.4300.
\end{align*}

Since $0.1328 \ne 0.1547 = 0.3597 \cdot 0.4300$, $X$ and $Y$ are dependent random variables.
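
The R calculation mentioned above can be reproduced with choose(); the exact call is not given in the text, but one way to carry it out is:

    # Joint and marginal probabilities at the point (x, y) = (1, 1)
    p11 <- choose(10, 1) * choose(8, 1) * choose(7, 2) / choose(25, 4)
    pX1 <- choose(10, 1) * choose(15, 3) / choose(25, 4)
    pY1 <- choose(8, 1)  * choose(17, 3) / choose(25, 4)
    round(c(p11, pX1, pY1, pX1 * pY1), 4)   # 0.1328 0.3597 0.4300 0.1547
    # p11 != pX1 * pY1, so the joint pmf does not factor at (1, 1)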

Example 2.4.2. Let the joint pdf of $X_1$ and $X_2$ be
\[
f(x_1, x_2) =
\begin{cases}
x_1 + x_2 & 0 < x_1 < 1,\; 0 < x_2 < 1 \\
0 & \text{elsewhere.}
\end{cases}
\]

We show that $X_1$ and $X_2$ are dependent. Here the marginal probability density
functions are


\[
f_1(x_1) =
\begin{cases}
\int_{-\infty}^{\infty} f(x_1, x_2)\,dx_2 = \int_0^1 (x_1 + x_2)\,dx_2 = x_1 + \frac{1}{2} & 0 < x_1 < 1 \\
0 & \text{elsewhere,}
\end{cases}
\]
and
\[
f_2(x_2) =
\begin{cases}
\int_{-\infty}^{\infty} f(x_1, x_2)\,dx_1 = \int_0^1 (x_1 + x_2)\,dx_1 = \frac{1}{2} + x_2 & 0 < x_2 < 1 \\
0 & \text{elsewhere.}
\end{cases}
\]

Since $f(x_1, x_2) \not\equiv f_1(x_1) f_2(x_2)$, the random variables $X_1$ and $X_2$ are dependent.
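
For a concrete check (not carried out in the text), compare the two sides at a single interior point, say $(x_1, x_2) = (\tfrac{1}{4}, \tfrac{1}{4})$:
\[
f\!\left(\tfrac{1}{4}, \tfrac{1}{4}\right) = \tfrac{1}{2},
\qquad
f_1\!\left(\tfrac{1}{4}\right) f_2\!\left(\tfrac{1}{4}\right) = \tfrac{3}{4} \cdot \tfrac{3}{4} = \tfrac{9}{16} \ne \tfrac{1}{2}.
\]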

The following theorem makes it possible to assert that the random variables $X_1$
and $X_2$ of Example 2.4.2 are dependent, without computing the marginal probability
density functions.


Theorem 2.4.1. Let the random variables $X_1$ and $X_2$ have supports $S_1$ and $S_2$,
respectively, and have the joint pdf $f(x_1, x_2)$. Then $X_1$ and $X_2$ are independent if
and only if $f(x_1, x_2)$ can be written as a product of a nonnegative function of $x_1$ alone
and a nonnegative function of $x_2$ alone.
