If random variables X and Y are independent, then we also have

$$\phi_{XY}(t,s) = \phi_X(t)\,\phi_Y(s). \tag{4.86}$$

To show the above, we simply substitute $f_X(x)f_Y(y)$ for $f_{XY}(x,y)$ in Equation (4.83). The double integral on the right-hand side separates, and we have

$$\phi_{XY}(t,s) = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\,dx \int_{-\infty}^{\infty} e^{jsy} f_Y(y)\,dy = \phi_X(t)\,\phi_Y(s),$$

and we have the desired result.
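This factorization of the joint characteristic function for independent X and Y is easy to check numerically. A minimal Monte Carlo sketch, with X ~ Exp(1) and Y ~ N(0, 1) chosen purely for illustration (these distributions are not from the text):

```python
import numpy as np

# Numerical check that the joint characteristic function of independent
# X and Y factors: E[e^{j(tX+sY)}] = E[e^{jtX}] * E[e^{jsY}].
# X ~ Exp(1) and Y ~ N(0,1) are illustrative assumptions.
rng = np.random.default_rng(0)
n = 200_000
x = rng.exponential(1.0, n)    # X ~ Exp(1)
y = rng.standard_normal(n)     # Y ~ N(0,1), independent of X

t, s = 0.7, -1.3
joint = np.mean(np.exp(1j * (t * x + s * y)))            # estimate of phi_XY(t,s)
product = np.mean(np.exp(1j * t * x)) * np.mean(np.exp(1j * s * y))

print(abs(joint - product))    # small: the joint cf separates
```

The two estimates agree to within Monte Carlo error; for dependent X and Y the same comparison would show a systematic gap.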
Analogous to the one-random-variable case, the joint characteristic function $\phi_{XY}(t,s)$ is often called on to determine the joint density function $f_{XY}(x,y)$ of X and Y and their joint moments. The density function $f_{XY}(x,y)$ is uniquely determined in terms of $\phi_{XY}(t,s)$ by the two-dimensional Fourier transform

$$f_{XY}(x,y) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} e^{-j(tx+sy)}\,\phi_{XY}(t,s)\,dt\,ds, \tag{4.87}$$

and the moments $E\{X^nY^m\} = \alpha_{nm}$, if they exist, are related to $\phi_{XY}(t,s)$ by

$$\left.\frac{\partial^{\,n+m}}{\partial t^n\,\partial s^m}\,\phi_{XY}(t,s)\right|_{t=s=0} = j^{\,n+m}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^m f_{XY}(x,y)\,dx\,dy = j^{\,n+m}\alpha_{nm}. \tag{4.88}$$

The MacLaurin series expansion of $\phi_{XY}(t,s)$ thus takes the form

$$\phi_{XY}(t,s) = \sum_{i=0}^{\infty}\sum_{k=0}^{\infty} \frac{\alpha_{ik}}{i!\,k!}\,(jt)^i (js)^k. \tag{4.89}$$

The above development can be generalized to the case of more than two random variables in an obvious manner.
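The moment relation (4.88) can be verified symbolically for any concrete characteristic function. A sketch using sympy, taking $\phi_{XY}$ to be that of a zero-mean bivariate normal with unit variances and correlation $\rho$ (an illustrative choice, not from the text), and recovering $\alpha_{11} = E\{XY\}$:

```python
import sympy as sp

t, s, rho = sp.symbols('t s rho')
j = sp.I
# Joint cf of a zero-mean bivariate normal, unit variances, correlation rho
# (illustrative assumption):
phi = sp.exp(-(t**2 + 2*rho*t*s + s**2) / 2)

# Equation (4.88) with n = m = 1:
#   d^2 phi / (dt ds) at (0,0) = j^(1+1) * alpha_11
d = sp.diff(phi, t, s).subs({t: 0, s: 0})
alpha_11 = sp.simplify(d / j**2)
print(alpha_11)   # rho, i.e. E{XY} for this density
```

Dividing the mixed partial derivative at the origin by $j^{n+m}$ returns the joint moment, exactly as (4.88) states.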
Expectations and Moments 109

Example 4.18. Let us consider again the Brownian motion problem discussed in Example 4.17, and form two random variables X′ and Y′ as

$$X' = X_1 + X_2 + \cdots + X_{2n}, \qquad Y' = X_{n+1} + X_{n+2} + \cdots + X_{3n}.$$
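Since X′ and Y′ share the n terms $X_{n+1}, \ldots, X_{2n}$, they are correlated even though the underlying increments are independent. A minimal simulation sketch, assuming iid standard normal increments as a stand-in for the Brownian increments of Example 4.17:

```python
import numpy as np

# X' = X_1 + ... + X_{2n} and Y' = X_{n+1} + ... + X_{3n} overlap in the
# n middle terms, so Cov(X',Y') = n*sigma^2 while Var(X') = Var(Y') = 2n*sigma^2.
# Increments are taken iid N(0,1) purely for illustration.
rng = np.random.default_rng(1)
n, trials = 5, 100_000
x = rng.standard_normal((trials, 3 * n))   # each row: X_1, ..., X_{3n}

xp = x[:, :2 * n].sum(axis=1)    # X' = X_1 + ... + X_{2n}
yp = x[:, n:3 * n].sum(axis=1)   # Y' = X_{n+1} + ... + X_{3n}

r = np.corrcoef(xp, yp)[0, 1]
print(r)   # close to n*sigma^2 / (2n*sigma^2) = 0.5
```

The sample correlation comes out near 1/2, the ratio of the n shared terms to the 2n terms in each sum.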