2.3. Conditional Distributions and Expectations

which completes the proof.

Intuitively, this result has a useful interpretation. Both of the random variables $X_2$ and $E(X_2|X_1)$ have the same mean $\mu_2$. If we did not know $\mu_2$, we could use either of the two random variables to guess at the unknown $\mu_2$. Since, however, $\text{Var}(X_2) \geq \text{Var}[E(X_2|X_1)]$, we would put more reliance in $E(X_2|X_1)$ as a guess. That is, if we observe the pair $(X_1, X_2)$ to be $(x_1, x_2)$, we would prefer $E(X_2|x_1)$ to $x_2$ as a guess at the unknown $\mu_2$. When studying the use of sufficient statistics in estimation in Chapter 7, we make use of this famous result, attributed to C. R. Rao and David Blackwell.
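The variance comparison can also be seen numerically. Below is a minimal Monte Carlo sketch (my own illustration, not from the text), assuming a simple Gaussian pair for which the conditional mean is known in closed form: $X_1 \sim N(0,1)$ and $X_2 = X_1 + Z$ with $Z \sim N(0,1)$ independent, so that $E(X_2|X_1) = X_1$, $\text{Var}(X_2) = 2$, and $\text{Var}[E(X_2|X_1)] = 1$.

```python
import numpy as np

# Minimal Monte Carlo sketch (illustration only, not from the text):
# X1 ~ N(0,1), X2 = X1 + Z with Z ~ N(0,1) independent, so that
# E(X2|X1) = X1, Var(X2) = 2, and Var[E(X2|X1)] = 1.
rng = np.random.default_rng(0)
n = 100_000
x1 = rng.standard_normal(n)
x2 = x1 + rng.standard_normal(n)

cond_mean = x1                       # E(X2|X1) in closed form for this model
print(np.var(x2))                    # approximately 2
print(np.var(cond_mean))             # approximately 1, smaller than Var(X2)
print(x2.mean(), cond_mean.mean())   # both estimate mu2 = 0
```

Both sample means estimate $\mu_2 = 0$, but the conditioned guess has roughly half the variance.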
We finish this section with an example illustrating Theorem 2.3.1.


Example 2.3.3. Let $X_1$ and $X_2$ be discrete random variables. Suppose the conditional pmf of $X_1$ given $X_2$ and the marginal distribution of $X_2$ are given by
$$
p(x_1|x_2) = \binom{x_2}{x_1}\left(\frac{1}{2}\right)^{x_2}, \quad x_1 = 0, 1, \ldots, x_2,
$$
$$
p(x_2) = \frac{2}{3}\left(\frac{1}{3}\right)^{x_2 - 1}, \quad x_2 = 1, 2, 3, \ldots.
$$

Let us determine the mgf of $X_1$. For fixed $x_2$, by the binomial theorem,

$$
E\left(e^{tX_1} \mid x_2\right) = \sum_{x_1=0}^{x_2} \binom{x_2}{x_1} e^{tx_1} \left(\frac{1}{2}\right)^{x_2 - x_1} \left(\frac{1}{2}\right)^{x_1} = \left(\frac{1}{2} + \frac{1}{2}e^t\right)^{x_2}.
$$

Hence, by the geometric series and Theorem 2.3.1,


$$
\begin{aligned}
E\left(e^{tX_1}\right) &= E\left[E\left(e^{tX_1} \mid X_2\right)\right] \\
&= \sum_{x_2=1}^{\infty} \left(\frac{1}{2} + \frac{1}{2}e^t\right)^{x_2} \frac{2}{3}\left(\frac{1}{3}\right)^{x_2 - 1} \\
&= \frac{2}{3}\left(\frac{1}{2} + \frac{1}{2}e^t\right) \sum_{x_2=1}^{\infty} \left(\frac{1}{6} + \frac{1}{6}e^t\right)^{x_2 - 1} \\
&= \frac{2}{3}\left(\frac{1}{2} + \frac{1}{2}e^t\right) \frac{1}{1 - \left[(1/6) + (1/6)e^t\right]},
\end{aligned}
$$

provided $(1/6) + (1/6)e^t < 1$, or $t < \log 5$ (which includes $t = 0$).
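As a sanity check on this derivation, here is a short simulation sketch (my own, not the book's). Note that $p(x_2)$ above is the geometric pmf with success probability $2/3$ and $p(x_1|x_2)$ is binomial$(x_2, 1/2)$, so we can sample the pair hierarchically and compare the empirical mgf with the closed form at a few values of $t < \log 5$.

```python
import numpy as np

# Simulation check of Example 2.3.3 (my own sketch, not the book's):
# p(x2) = (2/3)(1/3)^(x2-1) is geometric with success probability 2/3,
# and p(x1|x2) is binomial(x2, 1/2); compare the empirical mgf of X1
# with the closed form derived above for a few t < log 5.
rng = np.random.default_rng(1)
n = 200_000
x2 = rng.geometric(2/3, size=n)   # support 1, 2, 3, ...
x1 = rng.binomial(x2, 0.5)        # X1 | X2 = x2 ~ binomial(x2, 1/2)

def mgf_closed(t):
    # (2/3)(1/2 + (1/2)e^t) / (1 - [(1/6) + (1/6)e^t])
    return (2/3) * (0.5 + 0.5*np.exp(t)) / (1 - (1/6 + np.exp(t)/6))

for t in [-1.0, 0.0, 0.5, 1.0]:   # all satisfy t < log 5 (about 1.609)
    print(t, np.exp(t*x1).mean(), mgf_closed(t))
```

The empirical and closed-form values should agree to two or three decimal places at this sample size; at $t = 0$ both equal 1, as any mgf must.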

EXERCISES

2.3.1. Let $X_1$ and $X_2$ have the joint pdf $f(x_1, x_2) = x_1 + x_2$, $0 < x_1 < 1$, $0 < x_2 < 1$, zero elsewhere. Find the conditional mean and variance of $X_2$, given $X_1 = x_1$, $0 < x_1 < 1$.
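One way to check an answer to this exercise is symbolically; the following is my own sketch using sympy, not the book's solution method. It builds the conditional pdf of $X_2$ given $X_1 = x_1$ from the joint pdf and integrates for the conditional mean and variance.

```python
import sympy as sp

# Symbolic check sketch for Exercise 2.3.1 (my own illustration, not the
# book's solution): build the conditional pdf of X2 given X1 = x1 from
# the joint pdf and integrate for the conditional mean and variance.
x1, x2 = sp.symbols('x1 x2', positive=True)
f = x1 + x2                                  # joint pdf on the unit square

f1 = sp.integrate(f, (x2, 0, 1))             # marginal pdf of X1
f_cond = f / f1                              # conditional pdf of X2 | x1
mean = sp.simplify(sp.integrate(x2*f_cond, (x2, 0, 1)))              # E(X2|x1)
var = sp.simplify(sp.integrate(x2**2*f_cond, (x2, 0, 1)) - mean**2)  # Var(X2|x1)
print(mean, var)
```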
