Robert_V._Hogg,_Joseph_W._McKean,_Allen_T._Craig

2.4. Independent Random Variables 121

Instead of working with pdfs (or pmfs), we could have presented independence in terms of cumulative distribution functions (cdfs). The following theorem shows the equivalence.


Theorem 2.4.2. Let $(X_1, X_2)$ have the joint cdf $F(x_1, x_2)$ and let $X_1$ and $X_2$ have the marginal cdfs $F_1(x_1)$ and $F_2(x_2)$, respectively. Then $X_1$ and $X_2$ are independent if and only if

$$F(x_1, x_2) = F_1(x_1)\,F_2(x_2) \quad \text{for all } (x_1, x_2) \in \mathbb{R}^2. \tag{2.4.1}$$

Proof: We give the proof for the continuous case. Suppose expression (2.4.1) holds. Then the mixed second partial is

$$\frac{\partial^2}{\partial x_1\,\partial x_2}\,F(x_1, x_2) = f_1(x_1)\,f_2(x_2).$$

Hence, $X_1$ and $X_2$ are independent. Conversely, suppose $X_1$ and $X_2$ are independent. Then by the definition of the joint cdf,

$$F(x_1, x_2) = \int_{-\infty}^{x_1}\!\int_{-\infty}^{x_2} f_1(w_1)\,f_2(w_2)\,dw_2\,dw_1
= \int_{-\infty}^{x_1} f_1(w_1)\,dw_1 \cdot \int_{-\infty}^{x_2} f_2(w_2)\,dw_2 = F_1(x_1)\,F_2(x_2).$$

Hence, condition (2.4.1) is true.
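As a quick numerical illustration (not from the text), we can check the factorization in the proof for a hypothetical pair of independent Exponential(1) random variables: a Riemann-sum approximation to the double integral of $f_1(w_1)f_2(w_2)$ agrees with the product of the marginal cdfs.

```python
import math

# Numerical check of the factorization in the proof of Theorem 2.4.2,
# using two independent Exponential(1) variables (a hypothetical example).

def f(w):
    # Common marginal pdf f1 = f2 for Exponential(1)
    return math.exp(-w) if w > 0 else 0.0

def F(x):
    # Common marginal cdf F1 = F2 for Exponential(1)
    return 1.0 - math.exp(-x) if x > 0 else 0.0

def joint_cdf(x1, x2, n=400):
    # Midpoint Riemann sum for the double integral of f(w1)*f(w2)
    # over (0, x1] x (0, x2]; the pdf vanishes for negative arguments.
    h1, h2 = x1 / n, x2 / n
    total = 0.0
    for i in range(n):
        w1 = (i + 0.5) * h1
        for j in range(n):
            w2 = (j + 0.5) * h2
            total += f(w1) * f(w2) * h1 * h2
    return total

x1, x2 = 1.0, 2.0
# The double integral and the product of marginal cdfs agree closely.
print(joint_cdf(x1, x2), F(x1) * F(x2))
```

The agreement (up to discretization error) reflects Fubini's theorem: the integrand separates into a function of $w_1$ times a function of $w_2$, so the double integral splits into a product of single integrals.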


We now give a theorem that frequently simplifies the calculation of probabilities of events that involve independent variables.


Theorem 2.4.3. The random variables $X_1$ and $X_2$ are independent random variables if and only if the following condition holds:

$$P(a < X_1 \le b,\; c < X_2 \le d) = P(a < X_1 \le b)\,P(c < X_2 \le d) \tag{2.4.2}$$

for every $a < b$ and $c < d$, where $a$, $b$, $c$, and $d$ are constants.

Proof: If $X_1$ and $X_2$ are independent, then an application of the last theorem and expression (2.1.2) shows that

$$
\begin{aligned}
P(a < X_1 \le b,\; c < X_2 \le d) &= F(b, d) - F(a, d) - F(b, c) + F(a, c) \\
&= F_1(b)F_2(d) - F_1(a)F_2(d) - F_1(b)F_2(c) + F_1(a)F_2(c) \\
&= [F_1(b) - F_1(a)][F_2(d) - F_2(c)],
\end{aligned}
$$

which is the right side of expression (2.4.2). Conversely, condition (2.4.2) implies that the joint cdf of $(X_1, X_2)$ factors into a product of the marginal cdfs (let $a \to -\infty$ and $c \to -\infty$ in (2.4.2)), which in turn, by Theorem 2.4.2, implies that $X_1$ and $X_2$ are independent.
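Condition (2.4.2) can also be seen empirically. The following sketch (a hypothetical example, not from the text) simulates two independent Uniform(0, 1) variables and compares the empirical joint probability of a rectangle with the product of the exact marginal probabilities, which for Uniform(0, 1) are simply the interval lengths.

```python
import random

# Monte Carlo illustration of condition (2.4.2) for two independent
# Uniform(0, 1) random variables (a hypothetical example).
random.seed(0)
N = 200_000
samples = [(random.random(), random.random()) for _ in range(N)]

a, b, c, d = 0.2, 0.7, 0.1, 0.6

# Empirical joint probability P(a < X1 <= b, c < X2 <= d).
joint = sum(1 for x1, x2 in samples if a < x1 <= b and c < x2 <= d) / N

# Exact marginal probabilities: for Uniform(0, 1), P(a < X <= b) = b - a.
p1, p2 = b - a, d - c

# The empirical joint frequency is close to the product (b - a)(d - c).
print(joint, p1 * p2)
```

With $N = 200{,}000$ samples the Monte Carlo standard error is about $0.001$, so the empirical frequency matches the product $(b-a)(d-c) = 0.25$ to roughly two decimal places.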
