we might be interested in the relationship between the average number of cigarettes smoked daily and the age at which an individual contracts cancer. Similarly, an engineer might be interested in the relationship between the shear strength and the diameter of a spot weld in a fabricated sheet steel specimen.

To specify the relationship between two random variables, we define the joint cumulative probability distribution function of X and Y by


F(x, y) = P{X ≤ x, Y ≤ y}

Knowledge of the joint probability distribution function enables one, at least in theory, to compute the probability of any statement concerning the values of X and Y. For instance, the distribution function of X, call it F_X, can be obtained from the joint distribution function F of X and Y as follows:


F_X(x) = P{X ≤ x}
       = P{X ≤ x, Y < ∞}
       = F(x, ∞)

Similarly, the cumulative distribution function of Y is given by


F_Y(y) = F(∞, y)
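
As a quick numerical check of the identity F_X(x) = F(x, ∞), the following sketch assumes, purely for illustration and not as an example from the text, that X and Y are independent exponential random variables with rate 1, so that F(x, y) = (1 - e^{-x})(1 - e^{-y}); evaluating F at a very large value of y recovers the marginal distribution function of X.

    # Minimal sketch: an assumed joint distribution, not one from the text.
    # X, Y independent exponential(1), so F(x, y) = (1 - e^{-x}) * (1 - e^{-y}).
    import math

    def joint_cdf(x, y):
        """Hypothetical joint distribution function F(x, y) = P{X <= x, Y <= y}."""
        if x <= 0 or y <= 0:
            return 0.0
        return (1 - math.exp(-x)) * (1 - math.exp(-y))

    x = 1.5
    approx_FX = joint_cdf(x, 1e6)    # a very large y stands in for y = infinity
    exact_FX = 1 - math.exp(-x)      # known marginal of an exponential(1) variable
    print(approx_FX, exact_FX)       # both print 0.7768698...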

In the case where X and Y are both discrete random variables whose possible values are, respectively, x_1, x_2, ... and y_1, y_2, ..., we define the joint probability mass function of X and Y, p(x_i, y_j), by


p(x_i, y_j) = P{X = x_i, Y = y_j}
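
For concreteness, here is a small hypothetical joint probability mass function (the numbers are made up for illustration and do not come from an example in the text), with X taking the values 1, 2 and Y taking the values 1, 2, 3:

    p(1, 1) = .10    p(1, 2) = .20    p(1, 3) = .10
    p(2, 1) = .25    p(2, 2) = .15    p(2, 3) = .20

The six probabilities sum to 1, as any joint probability mass function must.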

The individual probability mass functions of X and Y are easily obtained from the joint probability mass function by the following reasoning. Since Y must take on some value y_j, it follows that the event {X = x_i} can be written as the union, over all j, of the mutually exclusive events {X = x_i, Y = y_j}. That is,


{X = x_i} = ⋃_j {X = x_i, Y = y_j}

and so, using Axiom 3 of the probability function, we see that


P{X = x_i} = P(⋃_j {X = x_i, Y = y_j})        (4.3.1)
           = Σ_j P{X = x_i, Y = y_j}
           = Σ_j p(x_i, y_j)
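
The following sketch carries out Equation (4.3.1) on the hypothetical joint probability mass function introduced above: summing the joint mass function over all values of Y gives the marginal mass function of X, and summing over all values of X gives the marginal mass function of Y. (The table and the helper function marginal are illustrative assumptions, not part of the text.)

    # Minimal sketch of Equation (4.3.1): marginals obtained by summing the
    # joint pmf over the other variable. The joint pmf is the hypothetical
    # illustrative table used earlier, not data from the text.
    joint_pmf = {
        (1, 1): 0.10, (1, 2): 0.20, (1, 3): 0.10,
        (2, 1): 0.25, (2, 2): 0.15, (2, 3): 0.20,
    }

    def marginal(joint, axis):
        """Sum the joint pmf over the other coordinate: axis=0 gives p_X, axis=1 gives p_Y."""
        out = {}
        for pair, prob in joint.items():
            out[pair[axis]] = out.get(pair[axis], 0.0) + prob
        return out

    p_X = marginal(joint_pmf, 0)   # approximately {1: 0.40, 2: 0.60}
    p_Y = marginal(joint_pmf, 1)   # approximately {1: 0.35, 2: 0.35, 3: 0.30}
    print(p_X, p_Y)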