Discrete Mathematics for Computer Science


CHAPTER 8 Discrete Probability



  1. Continuation of Exercise 5. Record the average number of heads obtained for each run of 100 flips of a fair coin. Run the experiment many times. What proportion of these experiments produce an average number of heads that differs from the expected value for the average by 0.1 or more? Compare with the results of Exercise 5. (One possible way to set up the simulation is sketched in Python after these exercises.)

  2. Let a random variable X have probability density function


   x           1     2     6     8
   p(X = x)    0.4   0.1   0.3   0.2

Compute the variance and standard deviation of X with μ = 4.


  3. The probability density function for the random variable X, defined to be the number of cars owned by a randomly selected family in Millinocket, is given as


   x           0      1      2      3      4
   p(X = x)    0.08   0.15   0.45   0.27   0.05

Compute the variance and standard deviation of X.
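
For the coin-flipping exercise above, here is a minimal sketch of one way to organize the simulation in Python, assuming that "average number of heads" means the fraction of flips that come up heads (so its expected value is 0.5); the 10,000 repetitions and the use of the random module are our choices, not part of the exercise statement.

    import random

    def average_heads(num_flips=100):
        """Flip a fair coin num_flips times; return the average number of heads per flip."""
        heads = sum(random.random() < 0.5 for _ in range(num_flips))
        return heads / num_flips

    def proportion_far_from_expected(num_runs=10_000, threshold=0.1):
        """Proportion of runs whose average differs from the expected value 0.5 by at least threshold."""
        far = sum(abs(average_heads() - 0.5) >= threshold for _ in range(num_runs))
        return far / num_runs

    print(proportion_far_from_expected())  # for a fair coin, typically only a few percent of runs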
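
The variance exercises above follow the same pattern: compute the mean μ = E[X] = Σ x·p(x), then the variance Var(X) = Σ (x − μ)²·p(x), and take the square root to get the standard deviation. The sketch below applies that pattern to a small hypothetical density of our own (the numbers are not taken from either exercise).

    import math

    # Hypothetical density: each key is a value of X, each entry its probability (they must sum to 1).
    density = {0: 0.25, 1: 0.50, 2: 0.25}

    mu = sum(x * p for x, p in density.items())                    # E[X]
    variance = sum((x - mu) ** 2 * p for x, p in density.items())  # Var(X) = E[(X - mu)^2]
    std_dev = math.sqrt(variance)

    print(mu, variance, std_dev)  # 1.0 0.5 0.7071...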

Chapter Review


This chapter introduced the notion of a probability density function p defined on the outcomes ω of a sample space Ω. The challenge of elementary probability theory is to set up a suitable sample space and density function so that the situation of interest can be expressed in terms of events E ⊆ Ω. This takes practice.
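
As a concrete illustration of this setup (the die example is ours, not the text's), the probability of an event E ⊆ Ω is just the sum of p(ω) over the outcomes ω in E.

    from fractions import Fraction

    # Sample space Omega for one roll of a fair die, with density p.
    p = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

    # Event E: "the roll is even", a subset of Omega.
    E = {2, 4, 6}

    print(sum(p[omega] for omega in E))  # 1/2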
First, we showed that using set theory to express sets in terms of other sets and using counting techniques to determine the size of sets play a crucial role in calculating probabilities. As the Birthday Problem illustrated, it is sometimes extremely convenient to compute the probability of an event by determining the probability of its complement and then subtracting that probability from one.
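
As a sketch of the complement technique applied to the Birthday Problem (assuming the standard formulation with 365 equally likely birthdays): the probability that n people all have distinct birthdays is the product (365/365)(364/365)···((365 − n + 1)/365), so the probability that at least two share a birthday is one minus that product.

    def prob_shared_birthday(n, days=365):
        """P(at least two of n people share a birthday), computed via the complement."""
        prob_all_distinct = 1.0
        for i in range(n):
            prob_all_distinct *= (days - i) / days
        return 1 - prob_all_distinct

    print(prob_shared_birthday(23))  # about 0.507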
The discussion of cross product sample spaces explained how to set up a sample space and compute probabilities for situations that involve repeated trials of an experiment, such as flipping a coin over and over, or combining several unrelated experiments, such as checking the status of various communication links. We proved that when k events E_{i_1}, E_{i_2}, ..., E_{i_k} occur simultaneously in k different sample spaces Ω_{i_1}, Ω_{i_2}, ..., Ω_{i_k}, then the probability of such a combined simultaneous event is the product P(E_{i_1}) · P(E_{i_2}) ⋯ P(E_{i_k}).
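
A short sketch of this product rule for the communication-link scenario; the individual link probabilities are hypothetical, chosen only to illustrate that the probability of the combined event is the product of the separate probabilities.

    from math import prod

    # Hypothetical probabilities that each of three independent links is working.
    link_up_probs = [0.99, 0.95, 0.90]

    # Probability that all three links are working simultaneously: the product of the individual probabilities.
    print(prod(link_up_probs))  # about 0.846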
The material on conditional probability showed how to revise probabilities in light of new information. If events are independent, then no revision is necessary. Otherwise, Bayes' Rule, which often is used with the Theorem of Total Probability, provides a powerful tool for computing conditional probabilities.
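
A sketch of Bayes' Rule used together with the Theorem of Total Probability, with made-up numbers for a diagnostic test (the prior P(D) and the two conditional probabilities below are illustrative values only).

    # Hypothetical inputs.
    p_d = 0.01                # prior probability P(D)
    p_pos_given_d = 0.95      # P(positive | D)
    p_pos_given_not_d = 0.05  # P(positive | not D)

    # Theorem of Total Probability: P(positive).
    p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

    # Bayes' Rule: P(D | positive) = P(positive | D) * P(D) / P(positive).
    print(p_pos_given_d * p_d / p_pos)  # about 0.16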
Finally, we introduced the idea of a random variable and its expected value. Since a random variable may have a value that is very different from its expected value, we introduced the notion of the variance and standard deviation of a random variable. It was shown that expectation is linear. For example, the expectation of a sum of random variables is just the sum of the expectations of the individual random variables.
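
As a small check of linearity with two fair dice (our example, not the text's): E[X + Y] computed directly over the cross product sample space equals E[X] + E[Y] = 7/2 + 7/2 = 7.

    from itertools import product
    from fractions import Fraction

    die = range(1, 7)

    e_x = sum(Fraction(x, 6) for x in die)                          # E[X] = 7/2
    e_sum = sum(Fraction(x + y, 36) for x, y in product(die, die))  # E[X + Y]

    print(e_x + e_x == e_sum, e_sum)  # True 7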