Robert V. Hogg, Joseph W. McKean, Allen T. Craig

68 Probability and Distributions

1.9 Some Special Expectations


Certain expectations, if they exist, have special names and symbols to represent them. First, let X be a random variable of the discrete type with pmf p(x). Then

\[
E(X) = \sum_x x\,p(x).
\]

If the support of X is {a_1, a_2, a_3, ...}, it follows that


\[
E(X) = a_1 p(a_1) + a_2 p(a_2) + a_3 p(a_3) + \cdots.
\]

This sum of products is seen to be a “weighted average” of the values a_1, a_2, a_3, ..., the “weight” associated with each a_i being p(a_i). This suggests that we call E(X) the arithmetic mean of the values of X, or, more simply, the mean value of X (or the mean value of the distribution).


Definition 1.9.1 (Mean). Let X be a random variable whose expectation exists. The mean value μ of X is defined to be μ = E(X).
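As a small numerical sketch of the weighted-average formula above (the support and probabilities below are made-up values, not taken from the text):

```python
# Hypothetical discrete distribution for illustration only:
# X takes the values 1, 2, 3 with probabilities 0.2, 0.5, 0.3.
support = [1, 2, 3]
pmf = [0.2, 0.5, 0.3]

# Mean: mu = E(X) = sum of x * p(x) over the support.
mu = sum(x * p for x, p in zip(support, pmf))
# mu is approximately 1(0.2) + 2(0.5) + 3(0.3) = 2.1
print(mu)
```

Each value of X contributes in proportion to its probability, which is exactly the “weighted average” reading of E(X).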


The mean is the first moment (about 0) of a random variable. Another special expectation involves the second moment. Let X be a discrete random variable with support {a_1, a_2, ...} and with pmf p(x); then


\[
E[(X-\mu)^2] = \sum_x (x-\mu)^2 p(x) = (a_1-\mu)^2 p(a_1) + (a_2-\mu)^2 p(a_2) + \cdots.
\]

This sum of products may be interpreted as a “weighted average” of the squares of the deviations of the numbers a_1, a_2, ... from the mean value μ of those numbers, where the “weight” associated with each (a_i − μ)^2 is p(a_i). It can also be thought of as the second moment of X about μ. This is an important expectation for all types of random variables, and we usually refer to it as the variance of X.


Definition 1.9.2 (Variance). Let X be a random variable with finite mean μ and such that E[(X − μ)^2] is finite. Then the variance of X is defined to be E[(X − μ)^2]. It is usually denoted by σ^2 or by Var(X).
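The defining sum for the variance can be computed the same way as the mean. Continuing with a hypothetical pmf (the values 1, 2, 3 with probabilities 0.2, 0.5, 0.3 are assumptions for illustration):

```python
# Hypothetical distribution (illustration only).
support = [1, 2, 3]
pmf = [0.2, 0.5, 0.3]

# First compute the mean mu = E(X).
mu = sum(x * p for x, p in zip(support, pmf))

# Variance by definition: E[(X - mu)^2], a weighted average
# of the squared deviations from the mean.
var = sum((x - mu) ** 2 * p for x, p in zip(support, pmf))
# var is approximately (1-2.1)^2(0.2) + (2-2.1)^2(0.5) + (3-2.1)^2(0.3) = 0.49
print(var)
```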


It is worthwhile to observe that Var(X) equals
\[
\sigma^2 = E[(X-\mu)^2] = E(X^2 - 2\mu X + \mu^2).
\]

Because E is a linear operator, it then follows that


\begin{align*}
\sigma^2 &= E(X^2) - 2\mu E(X) + \mu^2 \\
         &= E(X^2) - 2\mu^2 + \mu^2 \\
         &= E(X^2) - \mu^2.
\end{align*}

This frequently affords an easier way of computing the variance of X.
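The shortcut σ^2 = E(X^2) − μ^2 can be checked against the defining sum directly. Using the same hypothetical pmf as before (values 1, 2, 3 with probabilities 0.2, 0.5, 0.3, chosen only for illustration):

```python
# Hypothetical distribution (illustration only).
support = [1, 2, 3]
pmf = [0.2, 0.5, 0.3]

mu = sum(x * p for x, p in zip(support, pmf))            # E(X)
ex2 = sum(x ** 2 * p for x, p in zip(support, pmf))      # E(X^2)

var_shortcut = ex2 - mu ** 2                             # E(X^2) - mu^2
var_def = sum((x - mu) ** 2 * p for x, p in zip(support, pmf))

# The two computations agree up to floating-point rounding:
# both are approximately 0.49.
print(var_shortcut, var_def)
```

The shortcut requires only one pass to accumulate E(X) and E(X^2), whereas the definition needs μ before the squared deviations can be formed.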