Robert V. Hogg, Joseph W. McKean, Allen T. Craig

2.4. Independent Random Variables

Proof. If $X_1$ and $X_2$ are independent, then
$$
\begin{aligned}
M(t_1,t_2) &= E\!\left(e^{t_1X_1+t_2X_2}\right) \\
&= E\!\left(e^{t_1X_1}e^{t_2X_2}\right) \\
&= E\!\left(e^{t_1X_1}\right)E\!\left(e^{t_2X_2}\right) \\
&= M(t_1,0)\,M(0,t_2).
\end{aligned}
$$

Thus the independence of $X_1$ and $X_2$ implies that the mgf of the joint distribution factors into the product of the moment-generating functions of the two marginal distributions.
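This factorization can be checked numerically. The following sketch (an illustration added here, not part of the text) uses two independent standard normal variables, for which $M(t)=e^{t^2/2}$ in closed form, and estimates both sides of the identity by Monte Carlo:

```python
import numpy as np

# Sketch: empirically check M(t1, t2) ≈ M(t1, 0) * M(0, t2) for two
# independent standard normal variables (an assumed example for
# illustration).  For N(0, 1), M(t) = exp(t^2 / 2) exactly.
rng = np.random.default_rng(0)
n = 1_000_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)  # drawn independently of x1

t1, t2 = 0.5, -0.3
joint_mgf = np.mean(np.exp(t1 * x1 + t2 * x2))           # estimates M(t1, t2)
product = np.mean(np.exp(t1 * x1)) * np.mean(np.exp(t2 * x2))
exact = np.exp((t1**2 + t2**2) / 2)                      # closed form for N(0, 1)

print(joint_mgf, product, exact)  # all three agree to a few decimals
```

With a million samples, the sample joint mgf and the product of the sample marginal mgfs both land close to the exact value, as the theorem requires for independent variables.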
Suppose next that the mgf of the joint distribution of $X_1$ and $X_2$ is given by $M(t_1,t_2)=M(t_1,0)\,M(0,t_2)$. Now $X_1$ has the unique mgf which, in the continuous case, is given by
$$
M(t_1,0)=\int_{-\infty}^{\infty} e^{t_1x_1}f_1(x_1)\,dx_1.
$$

Similarly, the unique mgf of $X_2$, in the continuous case, is given by
$$
M(0,t_2)=\int_{-\infty}^{\infty} e^{t_2x_2}f_2(x_2)\,dx_2.
$$

Thus we have
$$
\begin{aligned}
M(t_1,0)\,M(0,t_2) &= \left[\int_{-\infty}^{\infty} e^{t_1x_1}f_1(x_1)\,dx_1\right]\left[\int_{-\infty}^{\infty} e^{t_2x_2}f_2(x_2)\,dx_2\right] \\
&= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{t_1x_1+t_2x_2}f_1(x_1)f_2(x_2)\,dx_1\,dx_2.
\end{aligned}
$$

We are given that $M(t_1,t_2)=M(t_1,0)\,M(0,t_2)$; so
$$
M(t_1,t_2)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{t_1x_1+t_2x_2}f_1(x_1)f_2(x_2)\,dx_1\,dx_2.
$$

But $M(t_1,t_2)$ is the mgf of $X_1$ and $X_2$. Thus
$$
M(t_1,t_2)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{t_1x_1+t_2x_2}f(x_1,x_2)\,dx_1\,dx_2.
$$

The uniqueness of the mgf implies that the two distributions of probability described by $f_1(x_1)f_2(x_2)$ and $f(x_1,x_2)$ are the same. Thus
$$
f(x_1,x_2)\equiv f_1(x_1)f_2(x_2).
$$

That is, if $M(t_1,t_2)=M(t_1,0)\,M(0,t_2)$, then $X_1$ and $X_2$ are independent. This completes the proof when the random variables are of the continuous type. With random variables of the discrete type, the proof is made by using summation instead of integration.
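In the discrete case the double integral becomes a double sum over the support, and the factorization can be verified exactly. A minimal sketch, using two hypothetical independent Bernoulli variables (the success probabilities below are illustrative choices, not from the text):

```python
from math import exp, isclose

# Sketch of the discrete-type argument: for independent Bernoulli
# variables with assumed probabilities p1, p2 (illustrative values),
# the joint mgf computed by the double sum equals the product of the
# two marginal mgfs.
p1, p2 = 0.3, 0.7
pmf1 = {0: 1 - p1, 1: p1}
pmf2 = {0: 1 - p2, 1: p2}
t1, t2 = 0.4, -1.2

# M(t1, t2): double sum of e^{t1 x1 + t2 x2} f1(x1) f2(x2)
joint = sum(exp(t1 * x1 + t2 * x2) * pmf1[x1] * pmf2[x2]
            for x1 in pmf1 for x2 in pmf2)

# M(t1, 0) and M(0, t2): the marginal mgfs, each a single sum
m1 = sum(exp(t1 * x1) * pmf1[x1] for x1 in pmf1)
m2 = sum(exp(t2 * x2) * pmf2[x2] for x2 in pmf2)

print(isclose(joint, m1 * m2))  # True: the double sum factors
```

Here the equality is exact (up to floating point), since each summand $e^{t_1x_1+t_2x_2}f_1(x_1)f_2(x_2)$ splits into a product of the two marginal summands, which is precisely the summation analogue of the integral argument above.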
