Applied Statistics and Probability for Engineers

CHAPTER 5 JOINT PROBABILITY DISTRIBUTIONS

CD MATERIAL


  1. Determine the distribution of a function of one or more random variables

  2. Calculate moment generating functions and use them to determine moments for random variables
    and use the uniqueness property to determine the distribution of a random variable

  3. Provide bounds on probabilities for arbitrary distributions based on Chebyshev’s inequality (a short numerical illustration follows this list)
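
Item 3 refers to Chebyshev's inequality, which guarantees P(|X − μ| ≥ kσ) ≤ 1/k² for any distribution with finite mean μ and standard deviation σ. The short Python sketch below compares this bound with an empirical tail probability; the choice of an exponential distribution and its scale parameter are illustrative assumptions only, not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from an arbitrary distribution (exponential chosen only for
# illustration); Chebyshev's inequality must hold for any distribution
# with finite mean and variance.
x = rng.exponential(scale=2.0, size=100_000)
mu, sigma = x.mean(), x.std()

for k in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    print(f"k={k}: empirical tail = {empirical:.4f}, Chebyshev bound = {1/k**2:.4f}")
```

In each case the empirical tail probability falls at or below the 1/k² bound, as the inequality requires.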


Answers for most odd-numbered exercises are at the end of the book. Answers to exercises whose
numbers are surrounded by a box can be accessed in the e-Text by clicking on the box. Complete
worked solutions to certain exercises are also available in the e-Text. These are indicated in the
Answers to Selected Exercises section by a box around the exercise number. Exercises are also
available for the text sections that appear on CD only. These exercises may be found within the
e-Text immediately following the section they accompany.

In Chapters 3 and 4 we studied probability distributions for a single random variable. However,
it is often useful to have more than one random variable defined in a random experiment. For
example, in the classification of transmitted and received signals, each signal can be classified as
high, medium, or low quality. We might define the random variable X to be the number of high-
quality signals received and the random variable Y to be the number of low-quality signals
received. In another example, the continuous random variable X can denote the length of one
dimension of an injection-molded part, and the continuous random variable Y might denote the
length of another dimension. We might be interested in probabilities that can be expressed in
terms of both X and Y. For example, if the specifications for X and Y are (2.95 to 3.05) and (7.60
to 7.80) millimeters, respectively, we might be interested in the probability that a part satisfies
both specifications; that is, P(2.95 ≤ X ≤ 3.05 and 7.60 ≤ Y ≤ 7.80).
In general, if X and Y are two random variables, the probability distribution that defines
their simultaneous behavior is called a joint probability distribution. In this chapter, we
investigate some important properties of these joint distributions.
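
The text does not specify the distributions of the two part dimensions, so the following is only a sketch under stated assumptions: X and Y are taken to be independent and normally distributed, with means and standard deviations chosen purely for illustration. Under independence the joint probability factors into a product of marginal probabilities; the general case, where this need not hold, is the subject of this chapter.

```python
from scipy.stats import norm

# Hypothetical marginal distributions for the two part dimensions; the
# means and standard deviations below are assumptions chosen to center
# each dimension in its specification window.
x_dist = norm(loc=3.00, scale=0.02)   # dimension X, spec 2.95 to 3.05 mm
y_dist = norm(loc=7.70, scale=0.04)   # dimension Y, spec 7.60 to 7.80 mm

# Assuming X and Y are independent, the joint probability is the product
# of the two marginal probabilities.
p_x = x_dist.cdf(3.05) - x_dist.cdf(2.95)
p_y = y_dist.cdf(7.80) - y_dist.cdf(7.60)
print("P(2.95 <= X <= 3.05 and 7.60 <= Y <= 7.80) =", p_x * p_y)
```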

5-1 TWO DISCRETE RANDOM VARIABLES

5-1.1 Joint Probability Distributions

For simplicity, we begin by considering random experiments in which only two random vari-
ables are studied. In later sections, we generalize the presentation to the joint probability
distribution of more than two random variables.

EXAMPLE 5-1 In the development of a new receiver for the transmission of digital information, each re-
ceived bit is rated as acceptable, suspect, or unacceptable, depending on the quality of the
received signal, with probabilities 0.9, 0.08, and 0.02, respectively. Assume that the ratings of
each bit are independent.
In the first four bits transmitted, let
X denote the number of acceptable bits
Y denote the number of suspect bits
Then, the distribution of X is binomial with n = 4 and p = 0.9, and the distribution of Y is
binomial with n = 4 and p = 0.08. However, because only four bits are being rated, the possible
values of X and Y are restricted to the points shown in the graph in Fig. 5-1. Although the possi-
ble values of X are 0, 1, 2, 3, or 4, if y = 3, then x = 0 or 1. By specifying the probability of each of
the points in Fig. 5-1, we specify the joint probability distribution of X and Y. Similarly to an in-
dividual random variable, we define the range of the random variables (X, Y) to be the set of
points (x, y) in two-dimensional space for which the probability that X = x and Y = y is positive.
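
Because each bit falls into exactly one of three categories and the four ratings are independent, a point probability P(X = x, Y = y) with x + y ≤ 4 can be obtained from the multinomial formula 4!/(x! y! (4 − x − y)!) (0.9)^x (0.08)^y (0.02)^(4−x−y). The Python function below is a minimal sketch of this calculation; the function name and the printed checks are illustrative, not part of the text.

```python
from math import factorial

# Joint pmf of (X, Y) in Example 5-1: n = 4 independent bits with
# P(acceptable) = 0.9, P(suspect) = 0.08, P(unacceptable) = 0.02.
def joint_pmf(x, y, n=4, pa=0.9, ps=0.08, pu=0.02):
    if x < 0 or y < 0 or x + y > n:
        return 0.0   # outside the range of (X, Y)
    coef = factorial(n) // (factorial(x) * factorial(y) * factorial(n - x - y))
    return coef * pa**x * ps**y * pu**(n - x - y)

# The probabilities over the range of (X, Y) sum to 1.
total = sum(joint_pmf(x, y) for x in range(5) for y in range(5))
print("sum of joint pmf =", total)        # approximately 1.0
print("P(X=4, Y=0) =", joint_pmf(4, 0))   # 0.9**4 = 0.6561
```

Summing this joint pmf over y for fixed x recovers the binomial distribution with n = 4 and p = 0.9 quoted above for X.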

