Fundamentals of Probability and Statistics for Engineers


If random variables X and Y are independent, then we also have the factorization in Equation (4.86). To show this, we simply substitute f_X(x)f_Y(y) for f_XY(x, y) in Equation (4.83). The double integral on the right-hand side then separates, and we have the desired result.
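As a quick numerical sanity check (a sketch of my own, not from the text; the two distributions chosen below are arbitrary assumptions), the empirical joint characteristic function of two independently drawn samples should agree with the product of their empirical marginal characteristic functions, up to sampling error:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(1.0, 2.0, n)      # X ~ N(1, 4), an arbitrary choice
y = rng.exponential(1.5, n)      # Y ~ exponential, drawn independently of X

def ecf_joint(t, s):
    """Empirical joint characteristic function E{exp[j(tX + sY)]}."""
    return np.mean(np.exp(1j * (t * x + s * y)))

def ecf_marginal(t, z):
    """Empirical marginal characteristic function E{exp(jtZ)}."""
    return np.mean(np.exp(1j * t * z))

# For independent X and Y, Equation (4.86) says the joint characteristic
# function factors into the product of the marginals.
t, s = 0.7, -0.4
joint = ecf_joint(t, s)
product = ecf_marginal(t, x) * ecf_marginal(s, y)
print(abs(joint - product))  # small, of the order of the sampling error
```

The agreement is only approximate for finite samples; the discrepancy shrinks as the sample size grows.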
Analogous to the one-random-variable case, the joint characteristic function φ_XY(t, s) is often called upon to determine the joint density function f_XY(x, y) of X and Y and their joint moments. The density function f_XY(x, y) is uniquely determined in terms of φ_XY(t, s) by the two-dimensional Fourier transform in Equation (4.87), and the moments, if they exist, are related to φ_XY(t, s) by Equation (4.88). The MacLaurin series expansion of φ_XY(t, s) thus takes the form of Equation (4.89). The above development can be generalized to the case of more than two random variables in an obvious manner.
Example 4.18. Let us consider again the Brownian motion problem discussed in Example 4.17, and form two random variables X′ and Y′ as given in Equation (4.90).
Expectations and Moments 109
$$
\phi_{XY}(t, s) = \phi_X(t)\,\phi_Y(s). \tag{4.86}
$$

$$
\phi_{XY}(t, s) = \int_{-\infty}^{\infty} e^{jtx} f_X(x)\,dx \int_{-\infty}^{\infty} e^{jsy} f_Y(y)\,dy = \phi_X(t)\,\phi_Y(s).
$$





$$
f_{XY}(x, y) = \frac{1}{4\pi^2} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-j(tx + sy)}\,\phi_{XY}(t, s)\,dt\,ds; \tag{4.87}
$$
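As a sketch of how the two-dimensional inversion formula (4.87) recovers a density (my own illustration, assuming X and Y are independent standard normals, so that φ_XY(t, s) = exp(−(t² + s²)/2)), a brute-force Riemann sum over a truncated (t, s) grid reproduces the bivariate normal density at a chosen point:

```python
import numpy as np

# Assumed for illustration: X, Y independent standard normals, whose joint
# characteristic function is phi_XY(t, s) = exp(-(t^2 + s^2)/2).
grid = np.linspace(-8.0, 8.0, 401)
dt = grid[1] - grid[0]
T, S = np.meshgrid(grid, grid)
phi = np.exp(-(T**2 + S**2) / 2)

# Equation (4.87): invert the characteristic function at a single point (x, y)
# by a plain Riemann sum over the truncated (t, s) grid.
x, y = 0.5, -1.0
integrand = np.exp(-1j * (T * x + S * y)) * phi
f_xy = (integrand.sum() * dt * dt).real / (4 * np.pi**2)

exact = np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)  # bivariate normal density
print(f_xy, exact)  # the two agree closely
```

Truncating the integration range is harmless here because the Gaussian characteristic function decays rapidly; heavier-tailed characteristic functions would need a wider grid.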

$$
E\{X^n Y^m\} = \alpha_{nm}, \qquad
\left.\frac{\partial^{n+m}}{\partial t^n\,\partial s^m}\,\phi_{XY}(t, s)\right|_{t,s=0}
= j^{n+m} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^n y^m f_{XY}(x, y)\,dx\,dy
= j^{n+m}\alpha_{nm}. \tag{4.88}
$$
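The derivative relation (4.88) can be spot-checked numerically (a sketch with an assumed pair of correlated normal samples, not an example from the text): a central finite difference of the empirical characteristic function at the origin recovers E{XY} up to the factor j^(1+1) = −1.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(0.0, 1.0, n)
y = 0.5 * x + rng.normal(0.0, 1.0, n)   # correlated with X, so E{XY} = 0.5

def phi(t, s):
    """Empirical joint characteristic function E{exp[j(tX + sY)]}."""
    return np.mean(np.exp(1j * (t * x + s * y)))

# Mixed partial of phi at the origin via a central finite difference;
# by Equation (4.88) with n = m = 1 it equals j^2 * alpha_11 = -E{X Y}.
h = 1e-3
d2 = (phi(h, h) - phi(h, -h) - phi(-h, h) + phi(-h, -h)) / (4 * h * h)
exy_from_cf = -d2.real
exy_direct = np.mean(x * y)
print(exy_from_cf, exy_direct)  # both near 0.5
```

The finite-difference step h trades truncation error (large h) against round-off (tiny h); values around 1e-3 work well for a characteristic function of unit scale.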



$$
\phi_{XY}(t, s) = \sum_{i=0}^{\infty} \sum_{k=0}^{\infty} \frac{\alpha_{ik}}{i!\,k!}\,(jt)^i (js)^k. \tag{4.89}
$$
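To illustrate the MacLaurin expansion (4.89) (a sketch under the assumption that X and Y are independent standard normals, so the joint moments α_ik factor into products of one-dimensional normal moments), a truncated double series reproduces the known characteristic function exp(−(t² + s²)/2) for small t and s:

```python
import numpy as np
from math import factorial

def normal_moment(i):
    """E{Z^i} for Z standard normal: 0 for odd i, (i - 1)!! for even i."""
    if i % 2 == 1:
        return 0.0
    return float(np.prod(np.arange(i - 1, 0, -2), initial=1.0))

def phi_series(t, s, order=8):
    """Truncated double MacLaurin series of Equation (4.89)."""
    total = 0j
    for i in range(order + 1):
        for k in range(order + 1):
            a_ik = normal_moment(i) * normal_moment(k)  # alpha_ik by independence
            total += a_ik / (factorial(i) * factorial(k)) * (1j * t)**i * (1j * s)**k
    return total

t, s = 0.4, -0.3
exact = np.exp(-(t**2 + s**2) / 2)  # joint CF of independent standard normals
approx = phi_series(t, s)
print(abs(approx - exact))  # tiny truncation error for small t, s
```

Because the normal moments grow factorially, the truncated series is only accurate near the origin, which is exactly where the moment information of Equation (4.88) lives.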
The random variables X′ and Y′ are given by

$$
\begin{aligned}
X' &= X_1 + X_2 + \cdots + X_{2n}, \\
Y' &= X_{n+1} + X_{n+2} + \cdots + X_{3n}.
\end{aligned} \tag{4.90}
$$
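Although the example is cut off at this point, the overlap structure in Equation (4.90) is easy to probe numerically (a sketch assuming, as in the Brownian-motion setting, that the X_i are independent zero-mean increments with unit variance; the simulation parameters are my own): the n shared increments X_{n+1}, ..., X_{2n} make X′ and Y′ dependent, with Cov(X′, Y′) = n Var(X_i).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10
trials = 200_000
# iid zero-mean, unit-variance increments (an assumption for illustration)
inc = rng.normal(0.0, 1.0, size=(trials, 3 * n))

x_prime = inc[:, :2 * n].sum(axis=1)   # X' = X_1 + ... + X_{2n}
y_prime = inc[:, n:].sum(axis=1)       # Y' = X_{n+1} + ... + X_{3n}

# The n shared increments X_{n+1}, ..., X_{2n} give Cov(X', Y') = n * Var(X_i).
cov = np.mean(x_prime * y_prime) - np.mean(x_prime) * np.mean(y_prime)
print(cov)  # close to n = 10
```

This dependence is what makes the joint characteristic function of X′ and Y′ interesting to compute in the example: it does not factor as in Equation (4.86).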