Robert V. Hogg, Joseph W. McKean, Allen T. Craig

Multivariate Distributions

Obviously, if $i \neq j$, we have
$$p_{ij}(x_i, x_j) \equiv p_i(x_i)\,p_j(x_j),$$
and thus $X_i$ and $X_j$ are independent. However,
$$p(x_1, x_2, x_3) \not\equiv p_1(x_1)\,p_2(x_2)\,p_3(x_3).$$
Thus $X_1$, $X_2$, and $X_3$ are not mutually independent.
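A quick numerical sketch can make the distinction concrete. The check below uses the classic pmf that places mass $1/4$ on each of $(1,0,0)$, $(0,1,0)$, $(0,0,1)$, and $(1,1,1)$ — an assumed stand-in for illustration, not necessarily expression (2.6.8) — and verifies that every bivariate marginal factors while the trivariate pmf does not:

```python
from itertools import product

# Assumed illustrative pmf: mass 1/4 on each of four points in {0,1}^3.
support = {(1, 0, 0): 0.25, (0, 1, 0): 0.25, (0, 0, 1): 0.25, (1, 1, 1): 0.25}

def p(x):
    return support.get(x, 0.0)

def marginal(i, xi):
    # one-dimensional marginal pmf p_i(x_i)
    return sum(pr for x, pr in support.items() if x[i] == xi)

def pair_marginal(i, j, xi, xj):
    # two-dimensional marginal pmf p_ij(x_i, x_j)
    return sum(pr for x, pr in support.items() if x[i] == xi and x[j] == xj)

# Pairwise independence: p_ij(x_i, x_j) = p_i(x_i) p_j(x_j) for all i < j.
pairwise = all(
    abs(pair_marginal(i, j, a, b) - marginal(i, a) * marginal(j, b)) < 1e-12
    for i in range(3) for j in range(i + 1, 3)
    for a in (0, 1) for b in (0, 1)
)

# Mutual independence fails: p(x1, x2, x3) != p1(x1) p2(x2) p3(x3) somewhere
# (e.g. p(1,1,1) = 1/4, while the product of marginals is 1/8).
mutual = all(
    abs(p((a, b, c)) - marginal(0, a) * marginal(1, b) * marginal(2, c)) < 1e-12
    for a, b, c in product((0, 1), repeat=3)
)

print(pairwise, mutual)  # True False
```

Each marginal here is Bernoulli$(1/2)$, every pair factors exactly, yet the joint pmf does not, which is precisely the pairwise-but-not-mutual phenomenon.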
Unless there is a possible misunderstanding between mutual and pairwise independence, we usually drop the modifier mutual. Accordingly, using this practice in Example 2.6.2, we say that $X_1, X_2, X_3$ are independent random variables, meaning that they are mutually independent. Occasionally, for emphasis, we use mutually independent so that the reader is reminded that this is different from pairwise independence.

In addition, if several random variables are mutually independent and have the same distribution, we say that they are independent and identically distributed, which we abbreviate as iid. So the random variables in Example 2.6.2 are iid with the common pdf given in expression (2.6.8).


The following is a useful corollary to Theorem 2.6.1 for iid random variables. Its
proof is asked for in Exercise 2.6.7.


Corollary 2.6.1. Suppose $X_1, X_2, \ldots, X_n$ are iid random variables with the common mgf $M(t)$, for $-h < t < h$, where $h > 0$. Let $T = \sum_{i=1}^{n} X_i$. Then $T$ has the mgf given by
$$M_T(t) = [M(t)]^n, \quad -h < t < h. \tag{2.6.9}$$
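Equation (2.6.9) can be checked numerically. The sketch below takes $X_1, \ldots, X_n$ iid Bernoulli$(p)$, whose common mgf is $M(t) = 1 - p + pe^t$, and compares $[M(t)]^n$ against the mgf of $T \sim \text{binomial}(n, p)$ computed directly from its pmf (the particular values of $n$, $p$, and $t$ are arbitrary choices for illustration):

```python
import math

# Check M_T(t) = [M(t)]^n for X_1, ..., X_n iid Bernoulli(p), where
# M(t) = 1 - p + p e^t and T = sum of the X_i is binomial(n, p).
n, p, t = 5, 0.3, 0.7

m_single = 1 - p + p * math.exp(t)   # common mgf M(t)
lhs = m_single ** n                  # [M(t)]^n, as in (2.6.9)

# mgf of T computed directly from the binomial pmf: E[e^{tT}]
rhs = sum(
    math.comb(n, k) * p**k * (1 - p) ** (n - k) * math.exp(t * k)
    for k in range(n + 1)
)

print(abs(lhs - rhs) < 1e-12)  # True
```

The agreement is exact up to floating point, since the binomial theorem collapses the direct sum into $(1 - p + pe^t)^n$.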

2.6.1 ∗Multivariate Variance-Covariance Matrix

This section makes explicit use of matrix algebra and may be considered optional.
In Section 2.5 we discussed the covariance between two random variables. In this section we want to extend this discussion to the $n$-variate case. Let $\mathbf{X} = (X_1, \ldots, X_n)'$ be an $n$-dimensional random vector. Recall that we defined $E(\mathbf{X}) = (E(X_1), \ldots, E(X_n))'$; that is, the expectation of a random vector is just the vector of the expectations of its components. Now suppose $\mathbf{W}$ is an $m \times n$ matrix of random variables, say, $\mathbf{W} = [W_{ij}]$ for the random variables $W_{ij}$, $1 \leq i \leq m$ and $1 \leq j \leq n$. Note that we can always string out the matrix into an $mn \times 1$ random vector. Hence, we define the expectation of a random matrix by
$$E[\mathbf{W}] = [E(W_{ij})]. \tag{2.6.10}$$
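Since definition (2.6.10) is entrywise, a Monte Carlo average of sampled matrices should converge entry by entry to the matrix of means. A minimal sketch, using an arbitrary assumed distribution for the entries $W_{ij}$:

```python
import random

random.seed(0)

# Assumed illustrative distribution: the (i, j) entry of the 2x3 random
# matrix W is uniform on (0, i + j), so E(W_ij) = (i + j)/2 entrywise.
m, n, N = 2, 3, 100_000

sums = [[0.0] * n for _ in range(m)]
for _ in range(N):
    for i in range(m):
        for j in range(n):
            sums[i][j] += random.uniform(0, (i + 1) + (j + 1))

# E[W] estimated entrywise, as in (2.6.10)
EW_hat = [[s / N for s in row] for row in sums]
EW_exact = [[((i + 1) + (j + 1)) / 2 for j in range(n)] for i in range(m)]

max_err = max(abs(EW_hat[i][j] - EW_exact[i][j])
              for i in range(m) for j in range(n))
print(max_err < 0.05)  # True
```

Nothing here depends on the entries being independent of one another; the expectation of a random matrix is defined one entry at a time.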

As the following theorem shows, the linearity of the expectation operator easily
follows from this definition:


Theorem 2.6.2. Let $\mathbf{W}_1$ and $\mathbf{W}_2$ be $m \times n$ matrices of random variables, let $\mathbf{A}_1$ and $\mathbf{A}_2$ be $k \times m$ matrices of constants, and let $\mathbf{B}$ be an $n \times l$ matrix of constants.
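The statement of the theorem continues beyond this excerpt, but since it concerns linearity, one identity it presumably contains, $E(\mathbf{A}\mathbf{W}) = \mathbf{A}\,E(\mathbf{W})$, can be verified exactly on a small finite sample space (the particular matrices below are arbitrary assumed values):

```python
# The random matrix W takes each of two 2x2 values with probability 1/2.
outcomes = [
    ([[1.0, 2.0], [3.0, 4.0]], 0.5),
    ([[5.0, 0.0], [1.0, 2.0]], 0.5),
]

A = [[2.0, -1.0], [0.0, 3.0]]  # constant matrix (assumed values)

def matmul(A, B):
    return [[sum(A[i][r] * B[r][j] for r in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def expectation(dist):
    # E[W] = [E(W_ij)]: probability-weighted entrywise average, as in (2.6.10)
    m, n = len(dist[0][0]), len(dist[0][0][0])
    return [[sum(pr * W[i][j] for W, pr in dist) for j in range(n)]
            for i in range(m)]

# E(A W): multiply each outcome by A, then take the expectation
lhs = expectation([(matmul(A, W), pr) for W, pr in outcomes])
# A E(W): take the expectation first, then multiply by A
rhs = matmul(A, expectation(outcomes))

print(lhs == rhs)  # True
```

The equality holds outcome by outcome because matrix multiplication by a constant matrix is a linear operation on the entries, which is exactly what (2.6.10) commutes with.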
