pdf of the particular group of $k$ variables, provided that the latter pdf is positive. We remark that there are many other conditional probability density functions; for instance, see Exercise 2.3.12.
Because a conditional pdf is the pdf of a certain number of random variables, the expectation of a function of these random variables has been defined. To emphasize the fact that a conditional pdf is under consideration, such expectations are called conditional expectations. For instance, the conditional expectation of $u(X_2,\ldots,X_n)$, given $X_1 = x_1$, is, for random variables of the continuous type, given by
\[
E[u(X_2,\ldots,X_n) \mid x_1] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} u(x_2,\ldots,x_n)\, f_{2,\ldots,n|1}(x_2,\ldots,x_n \mid x_1)\, dx_2 \cdots dx_n,
\]
provided $f_1(x_1) > 0$ and the integral converges (absolutely). A useful random variable is given by $h(X_1) = E[u(X_2,\ldots,X_n) \mid X_1]$.
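For a concrete illustration, consider the joint pdf $f(x_1,x_2) = 2$, $0 < x_1 < x_2 < 1$, zero elsewhere. Then $f_1(x_1) = 2(1-x_1)$ for $0 < x_1 < 1$, so the conditional pdf of $X_2$, given $X_1 = x_1$, is $f_{2|1}(x_2 \mid x_1) = 1/(1-x_1)$, $x_1 < x_2 < 1$, and
\[
E(X_2 \mid x_1) = \int_{x_1}^{1} x_2\, \frac{1}{1-x_1}\, dx_2 = \frac{1+x_1}{2}.
\]
Hence $h(X_1) = E(X_2 \mid X_1) = (1+X_1)/2$ is itself a random variable.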
The above discussion of marginal and conditional distributions generalizes to
random variables of the discrete type by using pmfs and summations instead of
integrals.
Let the random variables $X_1, X_2, \ldots, X_n$ have the joint pdf $f(x_1, x_2, \ldots, x_n)$ and the marginal probability density functions $f_1(x_1), f_2(x_2), \ldots, f_n(x_n)$, respectively. The definition of the independence of $X_1$ and $X_2$ is generalized to the mutual independence of $X_1, X_2, \ldots, X_n$ as follows: The random variables $X_1, X_2, \ldots, X_n$ are said to be \textit{mutually independent} if and only if
\[
f(x_1, x_2, \ldots, x_n) \equiv f_1(x_1) f_2(x_2) \cdots f_n(x_n)
\]
for the continuous case. In the discrete case, $X_1, X_2, \ldots, X_n$ are said to be \textit{mutually independent} if and only if
\[
p(x_1, x_2, \ldots, x_n) \equiv p_1(x_1) p_2(x_2) \cdots p_n(x_n).
\]
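For example, if $f(x_1,x_2,x_3) = e^{-(x_1+x_2+x_3)}$ for $0 < x_i < \infty$, $i = 1, 2, 3$, zero elsewhere, then each marginal pdf is $f_i(x_i) = e^{-x_i}$, $0 < x_i < \infty$, and $f(x_1,x_2,x_3) \equiv f_1(x_1) f_2(x_2) f_3(x_3)$; hence $X_1, X_2, X_3$ are mutually independent.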
Suppose $X_1, X_2, \ldots, X_n$ are mutually independent. Then
\begin{align*}
P(a_1 < X_1 < b_1,\, a_2 < X_2 < b_2,\, \ldots,\, a_n < X_n < b_n) &= P(a_1 < X_1 < b_1) P(a_2 < X_2 < b_2) \cdots P(a_n < X_n < b_n) \\
&= \prod_{i=1}^{n} P(a_i < X_i < b_i),
\end{align*}
where the symbol $\prod_{i=1}^{n} \varphi(i)$ is defined to be
\[
\prod_{i=1}^{n} \varphi(i) = \varphi(1)\varphi(2)\cdots\varphi(n).
\]
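As a quick numerical check of this factorization, the following sketch estimates both sides by simulation; it assumes NumPy, and the choice of independent $\mathrm{Exp}(1)$ variables with $a_i = 1$, $b_i = 2$ is illustrative rather than from the text.

```python
import numpy as np

# Monte Carlo check of the product rule for mutually independent
# random variables: X1, X2, X3 independent Exp(1) (illustrative choice).
rng = np.random.default_rng(0)
n_draws = 1_000_000
x = rng.exponential(scale=1.0, size=(n_draws, 3))  # columns are X1, X2, X3

a, b = 1.0, 2.0  # take a_i = 1, b_i = 2 for every i

# Left side: P(1 < X_i < 2 for i = 1, 2, 3), estimated directly.
joint = np.mean(np.all((x > a) & (x < b), axis=1))

# Right side: product of separately estimated marginal probabilities.
marginals = np.mean((x > a) & (x < b), axis=0)
product = np.prod(marginals)

# Exact value: P(1 < Exp(1) < 2)^3 = (e^{-1} - e^{-2})^3.
exact = (np.exp(-1) - np.exp(-2)) ** 3

print(joint, product, exact)  # all three should nearly agree
```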
The theorem that
\[
E[u(X_1)v(X_2)] = E[u(X_1)]\,E[v(X_2)]
\]