
SECTION 6.1 Discrete Random Variables


\mu_X = E(X) = \sum_i x_i\,P(X = x_i),

where the sum is over all possible values x_i which the random variable X
can assume. As we'll see, the above is often an infinite series! This value
can be interpreted as the average value of X over many observations of
X. (We'll give a slightly more precise formulation of this in Section ??.)
For example, if X is the random variable associated with the above dice
game, then


E(X) = 2\times\tfrac{1}{36} + 3\times\tfrac{2}{36} + 4\times\tfrac{3}{36} + 5\times\tfrac{4}{36} + 6\times\tfrac{5}{36} + 7\times\tfrac{6}{36}
       + 8\times\tfrac{5}{36} + 9\times\tfrac{4}{36} + 10\times\tfrac{3}{36} + 11\times\tfrac{2}{36} + 12\times\tfrac{1}{36} = 7.
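
As a quick sanity check on this arithmetic (not part of the original text), here is a minimal Python sketch that enumerates the 36 equally likely outcomes of two fair dice, builds the probability mass function of the sum, and computes E(X) directly from the definition above; the variable names are my own, and exact rational arithmetic is used so no rounding enters the check.

```python
from fractions import Fraction
from collections import Counter

# Enumerate the 36 equally likely outcomes of rolling two fair dice
# and count how many of them give each possible sum.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))

# P(X = x) = (number of outcomes summing to x) / 36
pmf = {x: Fraction(n, 36) for x, n in counts.items()}

# E(X) = sum of x * P(X = x) over all possible values x
mean = sum(x * p for x, p in pmf.items())

print(pmf[5])   # 1/9, i.e. 4/36
print(mean)     # 7
```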

Let X and Y be two discrete random variables; we wish to consider
the mean E(X + Y) of the sum X + Y. While it's probably intuitively
plausible, if not downright obvious, that E(X + Y) = E(X) + E(Y),
this still deserves a proof.^2
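
Before turning to the argument, a small numerical illustration may help. The following sketch is my own, not part of the text: it takes X to be the value of the first die and Y the total of both dice, so X and Y are certainly not independent, and checks that E(X + Y) = E(X) + E(Y) nonetheless.

```python
from fractions import Fraction
from itertools import product

# Sample space: the 36 equally likely outcomes (a, b) of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)          # probability of each individual outcome

def expectation(f):
    """E[f] computed directly over the finite sample space."""
    return sum(f(a, b) * p for a, b in outcomes)

X = lambda a, b: a           # value of the first die
Y = lambda a, b: a + b       # total of both dice -- dependent on X

print(expectation(X) + expectation(Y))               # 21/2
print(expectation(lambda a, b: X(a, b) + Y(a, b)))   # 21/2, the same value
```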
So we assume that X and Y are discrete random variables having
means E(X) = μ_X and E(Y) = μ_Y, respectively. Of fundamental
importance to the ensuing analysis is that, for any value x, the
probabilities P(X = x) can be expressed in terms of conditional
probabilities^3 on Y:


P(X = x) = \sum_{j=1}^{\infty} P(X = x \mid Y = y_j)\,P(Y = y_j). \qquad (6.1)
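
To make (6.1) concrete, here is a short verification in the dice setting; this sketch is again my own, with hypothetical names: X is the sum of two fair dice, Y is the second die, x = 7, and each conditional probability is computed as P(X = x and Y = y_j)/P(Y = y_j), following the formula recalled in the footnote below.

```python
from fractions import Fraction
from itertools import product

# The 36 equally likely outcomes (a, b) of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) over the 36 equally likely outcomes."""
    return Fraction(sum(1 for a, b in outcomes if event(a, b)), 36)

X = lambda a, b: a + b      # X = the sum, as in the dice game above
Y = lambda a, b: b          # Y = the second die

x = 7
lhs = prob(lambda a, b: X(a, b) == x)

# Right-hand side of (6.1): sum over j of P(X = x | Y = y_j) * P(Y = y_j),
# with P(X = x | Y = y_j) = P(X = x and Y = y_j) / P(Y = y_j).
rhs = sum(
    prob(lambda a, b: X(a, b) == x and Y(a, b) == j)
    / prob(lambda a, b: Y(a, b) == j)
    * prob(lambda a, b: Y(a, b) == j)
    for j in range(1, 7)
)

print(lhs, rhs)   # 1/6 1/6 -- the two sides agree
```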

Likewise, the probabilities P(Y = y) can be similarly expressed in
terms of conditional probabilities on X:


^2 Elementary textbooks typically only prove this under the simplifying assumption that X and Y
are independent.
^3 Here, we have assumed that the students have already had some exposure to conditional proba-
bilities. Recall that for any two events A and B the probability of A conditioned on B is given
by
P(A \mid B) = \frac{P(A \text{ and } B)}{P(B)}.
