Hence, in terms of the joint distribution function F of X and Y, we have that X and Y are independent if

$$F(a, b) = F_X(a)\,F_Y(b) \quad \text{for all } a, b$$

When X and Y are discrete random variables, the condition of independence, Equation 4.3.7, is equivalent to

$$p(x, y) = p_X(x)\,p_Y(y) \quad \text{for all } x, y \tag{4.3.8}$$

where p_X and p_Y are the probability mass functions of X and Y. The equivalence follows because, if Equation 4.3.7 is satisfied, then we obtain Equation 4.3.8 by letting A and B be, respectively, the one-point sets A = {x}, B = {y}. Furthermore, if Equation 4.3.8 is valid, then for any sets A, B


$$\begin{aligned}
P\{X \in A,\, Y \in B\} &= \sum_{y \in B} \sum_{x \in A} p(x, y) \\
&= \sum_{y \in B} \sum_{x \in A} p_X(x)\,p_Y(y) \\
&= \sum_{y \in B} p_Y(y) \sum_{x \in A} p_X(x) \\
&= P\{Y \in B\}\,P\{X \in A\}
\end{aligned}$$

and thus Equation 4.3.7 is established.
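This factorization is also easy to check mechanically when the joint mass function is given as a finite table. The following is a minimal sketch (the table values are illustrative, chosen so that they happen to factor):

```python
import numpy as np

# Joint pmf p(x, y) as a table: rows index the values of X, columns the values of Y.
p = np.array([[0.12, 0.18, 0.30],
              [0.08, 0.12, 0.20]])

p_X = p.sum(axis=1)   # marginal pmf of X: sum over y
p_Y = p.sum(axis=0)   # marginal pmf of Y: sum over x

# Equation 4.3.8: X and Y are independent iff p(x, y) = p_X(x) p_Y(y) at every point.
print(np.allclose(p, np.outer(p_X, p_Y)))   # True for this table
```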
In the jointly continuous case, the condition of independence is equivalent to

$$f(x, y) = f_X(x)\,f_Y(y) \quad \text{for all } x, y$$

Loosely speaking, X and Y are independent if knowing the value of one does not change the distribution of the other. Random variables that are not independent are said to be dependent.
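For instance, if X and Y are independent and each has the exponential density e^{-x} for x > 0 (the density used in Example 4.3d below), then their joint density must factor as

$$f(x, y) = f_X(x)\,f_Y(y) = e^{-x} e^{-y} = e^{-(x+y)}, \qquad x > 0,\ y > 0$$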


EXAMPLE 4.3d Suppose that X and Y are independent random variables having the common density function

$$f(x) = \begin{cases} e^{-x} & x > 0 \\ 0 & \text{otherwise} \end{cases}$$

Find the density function of the random variable X/Y.
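Before deriving the density analytically, the answer can be sanity-checked by simulation. Since P{X/Y ≤ t} = P{X ≤ tY} = ∫_0^∞ (1 − e^{−ty}) e^{−y} dy = t/(1 + t) for t > 0, the density of X/Y should be 1/(1 + t)^2. The following is a minimal Monte Carlo sketch (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)       # seed chosen arbitrarily
n = 1_000_000
x = rng.exponential(1.0, size=n)     # X with density e^{-x}, x > 0
y = rng.exponential(1.0, size=n)     # Y, independent of X, same density
ratio = x / y

# Compare the empirical distribution function of X/Y with F(t) = t / (1 + t).
for t in (0.5, 1.0, 2.0, 5.0):
    print(t, (ratio <= t).mean(), t / (1 + t))
```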
