Robert V. Hogg, Joseph W. McKean, Allen T. Craig

2.7. Transformations for Several Random Variables

The value of the first Jacobian is
$$
J_1 = \begin{vmatrix} \frac{1}{2}\sqrt{y_2/y_1} & \frac{1}{2}\sqrt{y_1/y_2} \\[6pt] \frac{1}{2}\sqrt{(1-y_2)/y_1} & -\frac{1}{2}\sqrt{y_1/(1-y_2)} \end{vmatrix}
= -\frac{1}{4}\left\{\sqrt{\frac{1-y_2}{y_2}} + \sqrt{\frac{y_2}{1-y_2}}\right\}
= -\frac{1}{4}\,\frac{1}{\sqrt{y_2(1-y_2)}}.
$$
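As a numerical sanity check on this determinant, the sketch below (plain Python with central finite differences; the evaluation point $(y_1, y_2) = (0.3, 0.4)$ is an arbitrary choice) differentiates the inverse branch $x_1 = \sqrt{y_1 y_2}$, $x_2 = \sqrt{y_1(1-y_2)}$ whose partial derivatives make up the matrix above, and compares the resulting determinant with $-1/\bigl(4\sqrt{y_2(1-y_2)}\bigr)$:

```python
import math

def x1(y1, y2):
    return math.sqrt(y1 * y2)

def x2(y1, y2):
    return math.sqrt(y1 * (1 - y2))

def jacobian(y1, y2, h=1e-6):
    # Central finite differences for the 2x2 matrix of partials,
    # then its determinant.
    a = (x1(y1 + h, y2) - x1(y1 - h, y2)) / (2 * h)  # dx1/dy1
    b = (x1(y1, y2 + h) - x1(y1, y2 - h)) / (2 * h)  # dx1/dy2
    c = (x2(y1 + h, y2) - x2(y1 - h, y2)) / (2 * h)  # dx2/dy1
    d = (x2(y1, y2 + h) - x2(y1, y2 - h)) / (2 * h)  # dx2/dy2
    return a * d - b * c

y1, y2 = 0.3, 0.4
closed_form = -1 / (4 * math.sqrt(y2 * (1 - y2)))
print(jacobian(y1, y2), closed_form)  # the two values should agree closely
```

The numerical determinant matches the closed form to roughly the accuracy of the finite-difference step.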
It is easy to see that the absolute value of each of the four Jacobians equals $1/\bigl(4\sqrt{y_2(1-y_2)}\bigr)$. Hence, the joint pdf of $Y_1$ and $Y_2$ is the sum of four terms and can be written as
$$
g(y_1, y_2) = 4\,\frac{1}{\pi}\,\frac{1}{4\sqrt{y_2(1-y_2)}} = \frac{1}{\pi\sqrt{y_2(1-y_2)}}, \quad (y_1, y_2) \in T.
$$
ThusY 1 andY 2 are independent random variables by Theorem 2.4.1.
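This independence can be checked by simulation. The sketch below assumes, as in the example this excerpt concludes, that $(X_1, X_2)$ is uniform on the unit disc (pdf $1/\pi$) and that $Y_1 = X_1^2 + X_2^2$, $Y_2 = X_1^2/(X_1^2 + X_2^2)$; since $g(y_1, y_2)$ does not involve $y_1$, the sample means of both variables should be near $1/2$ and their covariance near zero:

```python
import math
import random

random.seed(0)

def sample_disc():
    # Rejection-sample a point uniformly from the unit disc (pdf 1/pi).
    while True:
        a, b = random.uniform(-1, 1), random.uniform(-1, 1)
        if a * a + b * b < 1:
            return a, b

n = 200_000
ys = []
for _ in range(n):
    a, b = sample_disc()
    y1 = a * a + b * b              # assumed Y1 = X1^2 + X2^2
    y2 = a * a / (a * a + b * b)    # assumed Y2 = X1^2 / (X1^2 + X2^2)
    ys.append((y1, y2))

m1 = sum(y1 for y1, _ in ys) / n
m2 = sum(y2 for _, y2 in ys) / n
cov = sum((y1 - m1) * (y2 - m2) for y1, y2 in ys) / n
print(m1, m2, cov)  # roughly 0.5, 0.5, 0
```

Zero covariance does not by itself prove independence, but it is consistent with the factored pdf derived above.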
Of course, as in the bivariate case, we can use the mgf technique by noting that if $Y = g(X_1, X_2, \ldots, X_n)$ is a function of the random variables, then the mgf of $Y$ is given by
$$
E\left(e^{tY}\right) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\cdots\int_{-\infty}^{\infty} e^{t\,g(x_1, x_2, \ldots, x_n)} f(x_1, x_2, \ldots, x_n)\,dx_1\,dx_2\cdots dx_n,
$$
in the continuous case, where $f(x_1, x_2, \ldots, x_n)$ is the joint pdf. In the discrete case, summations replace the integrals. This procedure is particularly useful in cases in which we are dealing with linear functions of independent random variables.
Example 2.7.4 (Extension of Example 2.2.6). Let $X_1, X_2, X_3$ be independent random variables with joint pmf
$$
p(x_1, x_2, x_3) = \begin{cases} \dfrac{\mu_1^{x_1}\mu_2^{x_2}\mu_3^{x_3}e^{-\mu_1-\mu_2-\mu_3}}{x_1!\,x_2!\,x_3!} & x_i = 0, 1, 2, \ldots,\ i = 1, 2, 3 \\[6pt] 0 & \text{elsewhere.} \end{cases}
$$
If $Y = X_1 + X_2 + X_3$, the mgf of $Y$ is
$$
E\left(e^{tY}\right) = E\left(e^{t(X_1+X_2+X_3)}\right) = E\left(e^{tX_1}e^{tX_2}e^{tX_3}\right) = E\left(e^{tX_1}\right)E\left(e^{tX_2}\right)E\left(e^{tX_3}\right),
$$
because of the independence of $X_1, X_2, X_3$. In Example 2.2.6, we found that
$$
E\left(e^{tX_i}\right) = \exp\{\mu_i(e^t - 1)\}, \quad i = 1, 2, 3.
$$
Hence,
$$
E\left(e^{tY}\right) = \exp\{(\mu_1 + \mu_2 + \mu_3)(e^t - 1)\}.
$$
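This is the mgf of a Poisson distribution with mean $\mu_1 + \mu_2 + \mu_3$, so $Y$ is itself Poisson. A small simulation can check this conclusion; the means $\mu = 1.0, 0.5, 2.0$ below are hypothetical values chosen for illustration, and the sampler is Knuth's product-of-uniforms method:

```python
import math
import random

random.seed(1)
mus = [1.0, 0.5, 2.0]   # hypothetical means, for illustration only
mu_total = sum(mus)     # 3.5

def poisson_sample(mu):
    # Knuth's method: multiply uniforms until the product falls
    # below e^{-mu}; the number of extra factors needed is Poisson(mu).
    limit, k, prod = math.exp(-mu), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

n = 100_000
counts = {}
for _ in range(n):
    y = sum(poisson_sample(mu) for mu in mus)
    counts[y] = counts.get(y, 0) + 1

# Compare the empirical pmf of Y with the Poisson(3.5) pmf.
for k in range(6):
    empirical = counts.get(k, 0) / n
    exact = math.exp(-mu_total) * mu_total**k / math.factorial(k)
    print(k, round(empirical, 4), round(exact, 4))
```

The empirical frequencies track $e^{-3.5}\,3.5^k/k!$ closely, as the mgf argument predicts.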
