150 Multivariate Distributions
This, however, is the mgf of the pmf
$$
p_Y(y)=\begin{cases} \dfrac{(\mu_1+\mu_2+\mu_3)^y e^{-(\mu_1+\mu_2+\mu_3)}}{y!} & y=0,1,2,\ldots \\ 0 & \text{elsewhere,}\end{cases}
$$
so $Y=X_1+X_2+X_3$ has this distribution.
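As a quick numerical sanity check (an illustration added here, not part of the text): the pmf above is the Poisson pmf with mean $\mu_1+\mu_2+\mu_3$, so the sample mean and sample variance of simulated values of $Y=X_1+X_2+X_3$ should both be close to $\mu_1+\mu_2+\mu_3$. The sketch below uses only the Python standard library, with a simple Knuth-style Poisson sampler; the means 1, 2, 3 are arbitrary illustrative choices.

```python
import math
import random

def poisson_sample(mu, rng):
    # Knuth's method: count how many uniform factors are needed
    # before the running product drops below e^{-mu}.
    limit = math.exp(-mu)
    k, prod = 0, rng.random()
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k

rng = random.Random(0)
mus = (1.0, 2.0, 3.0)   # illustrative choices of mu_1, mu_2, mu_3
n = 100_000
ys = [sum(poisson_sample(mu, rng) for mu in mus) for _ in range(n)]

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
# Both should be near mu_1 + mu_2 + mu_3 = 6 (a Poisson mean equals its variance).
print(mean, var)
```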
Example 2.7.5. Let $X_1, X_2, X_3, X_4$ be independent random variables with common pdf
$$
f(x)=\begin{cases} e^{-x} & x>0 \\ 0 & \text{elsewhere.}\end{cases}
$$
If $Y=X_1+X_2+X_3+X_4$, then, similar to the argument in the last example, the independence of $X_1, X_2, X_3, X_4$ implies that
$$
E\left(e^{tY}\right)=E\left(e^{tX_1}\right)E\left(e^{tX_2}\right)E\left(e^{tX_3}\right)E\left(e^{tX_4}\right).
$$
In Section 1.9, we saw that
$$
E\left(e^{tX_i}\right)=(1-t)^{-1},\quad t<1,\ i=1,2,3,4.
$$
Hence,
$$
E\left(e^{tY}\right)=(1-t)^{-4}.
$$
In Section 3.3, we find that this is the mgf of a distribution with pdf
$$
f_Y(y)=\begin{cases} \dfrac{1}{3!}\,y^3 e^{-y} & 0<y<\infty \\ 0 & \text{elsewhere.}\end{cases}
$$
Accordingly, $Y$ has this distribution.
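As an added illustrative check (not part of the text), Example 2.7.5 can be verified by simulation. Each $X_i$ with pdf $e^{-x}$, $x>0$, can be generated as $X_i=-\ln U$ for $U$ uniform on $(0,1)$ (the inverse-cdf method). The empirical mean and variance of $Y=X_1+X_2+X_3+X_4$ should then match those of the pdf above (both equal 4), and a Monte Carlo estimate of $E(e^{tY})$ at, say, $t=\tfrac{1}{4}$ should be close to $(1-t)^{-4}=(4/3)^4\approx 3.16$.

```python
import math
import random

rng = random.Random(1)
n = 200_000
t = 0.25

# X_i = -ln(U), U uniform on (0,1], has the pdf e^{-x}, x > 0.
ys = [sum(-math.log(1.0 - rng.random()) for _ in range(4)) for _ in range(n)]

mean = sum(ys) / n                            # pdf above: mean 4
var = sum((y - mean) ** 2 for y in ys) / n    # pdf above: variance 4
mgf = sum(math.exp(t * y) for y in ys) / n    # should be near (1 - t)^{-4}
print(mean, var, mgf, (1 - t) ** -4)
```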
EXERCISES
2.7.1. Let $X_1, X_2, X_3$ be iid, each with the distribution having pdf $f(x)=e^{-x}$, $0<x<\infty$, zero elsewhere. Show that
$$
Y_1=\frac{X_1}{X_1+X_2},\quad Y_2=\frac{X_1+X_2}{X_1+X_2+X_3},\quad Y_3=X_1+X_2+X_3
$$
are mutually independent.
2.7.2. If $f(x)=\frac{1}{2}$, $-1<x<1$, zero elsewhere, is the pdf of the random variable $X$, find the pdf of $Y=X^2$.
2.7.3. If $X$ has the pdf $f(x)=\frac{1}{4}$, $-1<x<3$, zero elsewhere, find the pdf of $Y=X^2$.
Hint: Here $\mathcal{T}=\{y: 0\le y<9\}$ and the event $Y\in B$ is the union of two mutually exclusive events if $B=\{y: 0<y<1\}$.
2.7.4. Let $X_1, X_2, X_3$ be iid with common pdf $f(x)=e^{-x}$, $x>0$, zero elsewhere. Find the joint pdf of $Y_1=X_1$, $Y_2=X_1+X_2$, and $Y_3=X_1+X_2+X_3$.