the random column vector with components $X_1, \ldots, X_n$, and let the means of $X_1, \ldots, X_n$ be represented by the vector $m_X$. A convenient representation of their variances and covariances is the covariance matrix, $\Lambda$, defined by

$$\Lambda = E\{(X - m_X)(X - m_X)^T\}, \tag{4.34}$$
where the superscript $T$ denotes the matrix transpose. The $n \times n$ matrix $\Lambda$ has a structure in which the diagonal elements are the variances and in which the nondiagonal elements are covariances. Specifically, it is given by

$$\Lambda = \begin{bmatrix} \mathrm{var}(X_1) & \mathrm{cov}(X_1, X_2) & \cdots & \mathrm{cov}(X_1, X_n) \\ \mathrm{cov}(X_2, X_1) & \mathrm{var}(X_2) & \cdots & \mathrm{cov}(X_2, X_n) \\ \vdots & \vdots & \ddots & \vdots \\ \mathrm{cov}(X_n, X_1) & \mathrm{cov}(X_n, X_2) & \cdots & \mathrm{var}(X_n) \end{bmatrix}. \tag{4.35}$$
In the above, ‘var’ reads ‘variance of’ and ‘cov’ reads ‘covariance of’. Since $\mathrm{cov}(X_i, X_j) = \mathrm{cov}(X_j, X_i)$, the covariance matrix is always symmetrical.
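To make the definition concrete, here is a minimal numerical sketch (not from the text): it estimates $\Lambda$ from samples via Equation (4.34) and checks the structure described by Equation (4.35). The mixing matrix A, the seed, and the sample size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# n = 3 correlated components; each column of X is one draw of the
# random vector (X1, X2, X3). A is an arbitrary mixing matrix.
N = 100_000
A = np.array([[1.0, 0.0, 0.0],
              [0.5, 1.0, 0.0],
              [0.2, 0.3, 1.0]])
X = A @ rng.standard_normal((3, N))

m = X.mean(axis=1, keepdims=True)      # sample estimate of m_X
Lam = (X - m) @ (X - m).T / N          # E{(X - m_X)(X - m_X)^T}, Eq. (4.34)

# Diagonal entries are the variances, off-diagonal entries the
# covariances, and cov(Xi, Xj) = cov(Xj, Xi) makes the matrix symmetric.
assert np.allclose(Lam, Lam.T)
assert np.allclose(np.diag(Lam), X.var(axis=1))
print(Lam)
```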
In closing, let us state (in Theorem 4.2) without proof an important result
which is a direct extension of Equation (4.28).
Theorem 4.2: if $X_1, X_2, \ldots, X_n$ are mutually independent, then

$$E\{g_1(X_1) g_2(X_2) \cdots g_n(X_n)\} = E\{g_1(X_1)\} E\{g_2(X_2)\} \cdots E\{g_n(X_n)\}, \tag{4.36}$$

where $g_j(X_j)$ is an arbitrary function of $X_j$. It is assumed, of course, that all indicated expectations exist.
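A quick Monte Carlo check of Theorem 4.2 (an illustrative sketch, not part of the text): with mutually independent samples and arbitrarily chosen distributions and functions $g_j$, the expectation of the product should match the product of the expectations up to sampling error.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000

# Mutually independent X1, X2, X3 (distributions chosen arbitrarily)
x1 = rng.uniform(0.0, 1.0, N)
x2 = rng.exponential(2.0, N)
x3 = rng.normal(0.0, 1.0, N)

# Arbitrary functions g1 = sin, g2 = sqrt, g3 = square
lhs = np.mean(np.sin(x1) * np.sqrt(x2) * np.square(x3))  # E{g1 g2 g3}
rhs = np.sin(x1).mean() * np.sqrt(x2).mean() * np.square(x3).mean()
print(lhs, rhs)   # the two agree up to Monte Carlo sampling error
```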
4.4 Moments of Sums of Random Variables
Let $X_1, X_2, \ldots, X_n$ be $n$ random variables. Their sum is also a random variable. In this section, we are interested in the moments of this sum in terms of those associated with $X_j$, $j = 1, 2, \ldots, n$. These relations find applications in a large number of derivations to follow and in a variety of physical situations.
Consider

$$Y = X_1 + X_2 + \cdots + X_n. \tag{4.37}$$
Let $m_j$ and $\sigma_j^2$ denote the respective mean and variance of $X_j$. Results 4.1–4.3 are some of the important results concerning the mean and variance of $Y$.
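Ahead of those results, a brief simulation sketch (an illustration, not the book's derivation; the correlated construction and sample size are arbitrary, and the identities checked are the standard ones rather than the exact statements of Results 4.1–4.3): the mean of $Y$ equals the sum of the means regardless of dependence, while the variance of $Y$ also involves the covariances.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500_000

# Three random variables, deliberately correlated (X2 depends on X1)
z = rng.standard_normal((3, N))
x1 = z[0]
x2 = 0.5 * z[0] + z[1]
x3 = z[2]
y = x1 + x2 + x3                       # Y = X1 + X2 + X3, Eq. (4.37)

# Mean of the sum is the sum of the means, independent or not
print(y.mean(), x1.mean() + x2.mean() + x3.mean())

# Variance of the sum is the sum of all entries of the covariance matrix:
# var(Y) = sum_j var(Xj) + 2 * sum_{i<j} cov(Xi, Xj)
Lam = np.cov(np.vstack([x1, x2, x3]), ddof=0)
print(y.var(), Lam.sum())
```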