by Theorem 1. Hence,

$$\mathrm{Var}(Y) = \frac{1}{n^2}\sum_{i=1}^{n}\mathrm{Var}(X_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.$$
Because of their importance, we highlight these results about the mean and the variance of i.i.d. random variables.


The Mean and Variance of the Average

The average of n i.i.d. random variables with mean μ and variance σ² is a random
variable that also has mean μ but variance σ²/n.
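
This result is easy to check numerically. The following Python sketch (an illustration added here, not from the text) draws many independent copies of the average of n i.i.d. exponential random variables; the exponential distribution, the value n = 50, and all variable names are assumptions made only for this example.

```python
import numpy as np

# Assumed setup for illustration: X_1, ..., X_n are i.i.d. exponential
# with mean mu = 2, hence variance sigma^2 = 4.
rng = np.random.default_rng(0)
mu, sigma2 = 2.0, 4.0
n = 50            # number of i.i.d. variables being averaged
trials = 100_000  # number of independent copies of the average Y

# Each row holds one realization of X_1, ..., X_n; averaging each row
# gives one sample of the average Y.
samples = rng.exponential(scale=mu, size=(trials, n))
Y = samples.mean(axis=1)

print("empirical mean of Y:    ", Y.mean(), " (theory:", mu, ")")
print("empirical variance of Y:", Y.var(), " (theory:", sigma2 / n, ")")
```

The empirical mean should land near μ = 2 and the empirical variance near σ²/n = 0.08, matching the boxed result.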

We pointed out that the actual value X(ω) of a random variable X can differ drastically
from the mean μ of X. Theorem 2 gave an upper bound for the probability that X differs
from μ by ε or more. If we choose a small value for ε (say, ε ≤ σ), then the bound just
tells us that this probability is at most some number greater than or equal to one, which we
know anyway. The next theorem makes a stronger statement in the case that the random
variable X happens to be the average of i.i.d. random variables X_1, X_2, ..., X_n.
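
A concrete instance of this remark (the values σ² = 1 and ε = 1/2 are chosen here only for illustration): Theorem 2 bounds the probability that X differs from μ by ε or more by σ²/ε², so

$$P\!\left(|X - \mu| \ge \tfrac{1}{2}\right) \le \frac{\sigma^2}{\varepsilon^2} = \frac{1}{(1/2)^2} = 4,$$

and since every probability is already at most 1, a bound of 4 tells us nothing.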
Theorem 5. (Law of Averages, or The Weak Law of Large Numbers) Let n ∈ ℕ.
Let X_1, X_2, ..., X_n be i.i.d. random variables on a sample space Ω, and let μ and σ²
denote their common mean E(X_i) and variance Var(X_i). Then, for ε > 0,

$$P\left(\left|\frac{X_1 + X_2 + \cdots + X_n}{n} - \mu\right| \ge \varepsilon\right) \le \frac{\sigma^2}{n\varepsilon^2}.$$

Proof. From the discussion following Theorem 4, we know that μ is also the expectation
of the random variable

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{n}$$

and that σ²/n is the variance of Y. Applying Theorem 2 to Y to get a bound on
P(|Y − E(Y)| ≥ ε) gives

$$P\left(\left|\frac{X_1 + X_2 + \cdots + X_n}{n} - \mu\right| \ge \varepsilon\right) \le \frac{\sigma^2/n}{\varepsilon^2} = \frac{\sigma^2}{n\varepsilon^2}.$$
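
To see the bound of Theorem 5 in action, here is a small Python simulation; the choice of fair coin flips (Bernoulli variables with μ = 1/2 and σ² = 1/4), n = 100, and ε = 0.1 are assumptions made only for this sketch.

```python
import numpy as np

# Assumed setup for illustration: X_i are fair coin flips (0 or 1),
# so mu = 0.5 and sigma^2 = 0.25.  n and eps are arbitrary choices.
rng = np.random.default_rng(1)
mu, sigma2 = 0.5, 0.25
n, eps, trials = 100, 0.1, 200_000

# Each row is one run of n flips; its row mean is one sample of the average.
flips = rng.integers(0, 2, size=(trials, n))
averages = flips.mean(axis=1)

empirical = np.mean(np.abs(averages - mu) >= eps)
bound = sigma2 / (n * eps**2)
print("empirical P(|average - mu| >= eps):", empirical)
print("Theorem 5 bound sigma^2/(n eps^2): ", bound)
```

The observed frequency stays well below the Chebyshev-style bound σ²/(nε²) = 0.25; the bound is loose, but as n grows it still forces the probability of a deviation of at least ε toward 0.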

The Law of Averages is particularly interesting if X_1, X_2, ..., X_n arise in a Bernoulli
trials process where each X_i is associated with the outcome of some experiment that is
repeated n times. The X_i are i.i.d., there is no restriction on their common distribution,
and we can choose how many times n to repeat the experiment. Furthermore, the expected
value μ of the average

$$Y = \frac{X_1 + X_2 + \cdots + X_n}{n}$$