for independent random variables $X_1$ and $X_2$ becomes, for mutually independent random variables $X_1, X_2, \ldots, X_n$,
$$E[u_1(X_1)u_2(X_2)\cdots u_n(X_n)] = E[u_1(X_1)]E[u_2(X_2)]\cdots E[u_n(X_n)],$$
or
$$E\left[\prod_{i=1}^{n} u_i(X_i)\right] = \prod_{i=1}^{n} E[u_i(X_i)].$$
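For instance, taking each $u_i(x_i) = x_i$ (provided the expectations exist), this specializes to
$$E[X_1 X_2 \cdots X_n] = E[X_1]\,E[X_2]\cdots E[X_n].$$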
The moment-generating function (mgf) of the joint distribution of $n$ random variables $X_1, X_2, \ldots, X_n$ is defined as follows. Suppose that
$$E[\exp(t_1 X_1 + t_2 X_2 + \cdots + t_n X_n)]$$
exists for $-h_i < t_i < h_i$, $i = 1, 2, \ldots, n$, where each $h_i$ is positive. This expectation is denoted by $M(t_1, t_2, \ldots, t_n)$ and it is called the mgf of the joint distribution of $X_1, \ldots, X_n$ (or simply the mgf of $X_1, \ldots, X_n$). As in the cases of one and two variables, this mgf is unique and uniquely determines the joint distribution of the $n$ variables (and hence all marginal distributions). For example, the mgf of the marginal distribution of $X_i$ is $M(0, \ldots, 0, t_i, 0, \ldots, 0)$, $i = 1, 2, \ldots, n$; that of the marginal distribution of $X_i$ and $X_j$ is $M(0, \ldots, 0, t_i, 0, \ldots, 0, t_j, 0, \ldots, 0)$; and so on.
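To illustrate the notation with $n = 3$, the mgf of the marginal distribution of $X_2$ is $M(0, t_2, 0)$, while that of the pair $(X_1, X_3)$ is $M(t_1, 0, t_3)$.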
Theorem 2.4.5 of this chapter can be generalized, and the factorization
$$M(t_1, t_2, \ldots, t_n) = \prod_{i=1}^{n} M(0, \ldots, 0, t_i, 0, \ldots, 0) \qquad (2.6.6)$$
is a necessary and sufficient condition for the mutual independence of $X_1, X_2, \ldots, X_n$.
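As a brief illustration, suppose the joint mgf of $X_1, \ldots, X_n$ is
$$M(t_1, \ldots, t_n) = \exp\left(\sum_{i=1}^{n}\mu_i t_i + \frac{1}{2}\sum_{i=1}^{n}\sigma_i^2 t_i^2\right),$$
for constants $\mu_i$ and $\sigma_i^2 > 0$. Then $M(0, \ldots, 0, t_i, 0, \ldots, 0) = \exp(\mu_i t_i + \frac{1}{2}\sigma_i^2 t_i^2)$, and the product of these marginal mgfs equals $M(t_1, \ldots, t_n)$; hence, by (2.6.6), the random variables $X_1, X_2, \ldots, X_n$ are mutually independent.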
Note that we can write the joint mgf in vector notation as
$$M(\mathbf{t}) = E[\exp(\mathbf{t}'\mathbf{X})], \quad \text{for } \mathbf{t} \in B \subset R^n,$$
where $B = \{\mathbf{t} : -h_i < t_i < h_i, \; i = 1, \ldots, n\}$.
The following is a theorem that proves useful in the sequel. It gives the mgf of
a linear combination of independent random variables.
Theorem 2.6.1. Suppose $X_1, X_2, \ldots, X_n$ are $n$ mutually independent random variables. Suppose, for all $i = 1, 2, \ldots, n$, $X_i$ has mgf $M_i(t)$, for $-h_i < t < h_i$, where $h_i > 0$. Let $T = \sum_{i=1}^{n} k_i X_i$, where $k_1, k_2, \ldots, k_n$ are constants. Then $T$ has the mgf given by
$$M_T(t) = \prod_{i=1}^{n} M_i(k_i t), \qquad -\min_i\{h_i\} < t < \min_i\{h_i\}. \qquad (2.6.7)$$
Proof. Assume $t$ is in the interval $(-\min_i\{h_i\}, \min_i\{h_i\})$. Then, by independence,
$$M_T(t) = E\left[e^{\sum_{i=1}^{n} t k_i X_i}\right] = E\left[\prod_{i=1}^{n} e^{(t k_i) X_i}\right] = \prod_{i=1}^{n} E\left[e^{t k_i X_i}\right] = \prod_{i=1}^{n} M_i(k_i t),$$
which completes the proof.
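To illustrate Theorem 2.6.1, suppose $X_1, X_2, \ldots, X_n$ are mutually independent and each has the common mgf $M(t)$, $-h < t < h$. Taking $k_i = 1/n$ for each $i$, the sample mean $\overline{X} = n^{-1}\sum_{i=1}^{n} X_i$ has mgf
$$M_{\overline{X}}(t) = \prod_{i=1}^{n} M(t/n) = [M(t/n)]^{n}, \qquad -h < t < h.$$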