Robert V. Hogg, Joseph W. McKean, Allen T. Craig

142 Multivariate Distributions

Proof: Use Theorem 2.6.2 to derive (2.6.15); i.e.,

Cov(X) = E[(X − μ)(X − μ)′]
= E[XX′ − μX′ − Xμ′ + μμ′]
= E[XX′] − μE[X′] − E[X]μ′ + μμ′
= E[XX′] − μμ′ − μμ′ + μμ′ = E[XX′] − μμ′,

since E[X] = μ and E[X′] = μ′, which is the desired result. The proof of (2.6.16) is left as an exercise.
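The identity just derived, Cov(X) = E[XX′] − μμ′, can be sanity-checked numerically. The sketch below uses simulated data (the mean vector and covariance matrix are illustrative choices, not from the text) and compares the two sides computed from the same sample moments:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
# Simulate n draws of a 3-dimensional random vector X (illustrative parameters).
X = rng.multivariate_normal(mean=[1.0, -2.0, 0.5],
                            cov=[[2.0, 0.3, 0.0],
                                 [0.3, 1.0, -0.4],
                                 [0.0, -0.4, 1.5]],
                            size=n)

mu = X.mean(axis=0)                   # sample estimate of E[X]
lhs = (X - mu).T @ (X - mu) / n       # E[(X - mu)(X - mu)']
rhs = X.T @ X / n - np.outer(mu, mu)  # E[XX'] - mu mu'

# The two sides agree up to floating-point rounding.
print(np.max(np.abs(lhs - rhs)))
```

For sample moments built from the same mu, the agreement is exact up to rounding, mirroring the algebraic proof above.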


All variance-covariance matrices are positive semi-definite matrices; that is,
a′Cov(X)a ≥ 0, for all vectors a ∈ R^n. To see this, let X be a random vector and
let a be any n × 1 vector of constants. Then Y = a′X is a random variable and,
hence, has nonnegative variance; i.e.,


0 ≤ Var(Y) = Var(a′X) = a′Cov(X)a; (2.6.17)

hence, Cov(X) is positive semi-definite.
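The argument in (2.6.17) can be observed directly on data: for any constant vector a, the quadratic form a′Cov(X)a equals the sample variance of the projection a′X, and so is nonnegative. A minimal sketch (simulated data and the vectors a are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Build a sample with correlated columns (illustrative construction).
X = rng.standard_normal((50_000, 4)) @ rng.standard_normal((4, 4))
C = np.cov(X, rowvar=False)        # sample variance-covariance matrix (ddof=1)

for _ in range(5):
    a = rng.standard_normal(4)     # arbitrary constant vector a
    quad = a @ C @ a               # quadratic form a' Cov(X) a
    var_y = np.var(X @ a, ddof=1)  # Var(a'X) from the same sample
    # quad equals Var(a'X) exactly (same ddof), hence is nonnegative
    assert quad >= 0.0 and np.isclose(quad, var_y)
```

The equality quad = Var(a′X) holds exactly here because both sides use the same divisor n − 1.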


EXERCISES


2.6.1. Let X, Y, Z have joint pdf f(x, y, z) = 2(x + y + z)/3, 0 < x < 1, 0 < y < 1,
0 < z < 1, zero elsewhere.


(a) Find the marginal probability density functions of X, Y, and Z.

(b) Compute P(0 < X < 1/2, 0 < Y < 1/2, 0 < Z < 1/2) and P(0 < X < 1/2) =
P(0 < Y < 1/2) = P(0 < Z < 1/2).

(c) Are X, Y, and Z independent?

(d) Calculate E(X^2 YZ + 3XY^4 Z^2).

(e) Determine the cdf of X, Y, and Z.

(f) Find the conditional distribution of X and Y, given Z = z, and evaluate
E(X + Y | z).

(g) Determine the conditional distribution of X, given Y = y and Z = z, and
compute E(X | y, z).
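Answers to the probability parts of Exercise 2.6.1 can be checked by brute-force numerical integration of the stated pdf; the sketch below (grid size is an arbitrary choice) uses a midpoint rule over the unit cube to estimate the two probabilities in part (b):

```python
import numpy as np

# Midpoint-rule integration of f(x, y, z) = 2(x + y + z)/3 over sub-boxes of (0,1)^3.
m = 100
t = (np.arange(m) + 0.5) / m                  # midpoints of a uniform grid on (0, 1)
x, y, z = np.meshgrid(t, t, t, indexing="ij")
f = 2.0 * (x + y + z) / 3.0
cell = (1.0 / m) ** 3                         # volume of one grid cell

# P(0 < X < 1/2, 0 < Y < 1/2, 0 < Z < 1/2)
p_joint = f[(x < 0.5) & (y < 0.5) & (z < 0.5)].sum() * cell
# P(0 < X < 1/2)
p_x = f[x < 0.5].sum() * cell
print(p_joint, p_x)
```

Since the integrand is linear in each coordinate and 1/2 falls on a cell boundary, the midpoint rule here is exact up to rounding, so the printed values can be compared directly with the analytic answers.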

2.6.2. Let f(x1, x2, x3) = exp[−(x1 + x2 + x3)], 0 < x1 < ∞, 0 < x2 < ∞,
0 < x3 < ∞, zero elsewhere, be the joint pdf of X1, X2, X3.


(a) Compute P(X1 < X2 < X3) and P(X1 = X2 < X3).

(b) Determine the joint mgf of X1, X2, and X3. Are these random variables
independent?
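Part (a) of Exercise 2.6.2 can be checked by simulation. Because the given joint pdf factors as e^(−x1) e^(−x2) e^(−x3), the coordinates can be simulated as independent standard exponential variates; the sample size below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
# The joint pdf factors into three standard exponential densities,
# so draw the coordinates independently.
x1, x2, x3 = rng.exponential(size=(3, n))

p_order = np.mean((x1 < x2) & (x2 < x3))  # estimate of P(X1 < X2 < X3)
p_tie = np.mean((x1 == x2) & (x2 < x3))   # P(X1 = X2 < X3): ties have probability 0
print(p_order, p_tie)
```

By symmetry the six orderings of three continuous i.i.d. variables are equally likely, so the first estimate should be near 1/6, while exact ties occur with probability zero.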

2.6.3. Let X1, X2, X3, and X4 be four independent random variables, each with
pdf f(x) = 3(1 − x)^2, 0 < x < 1, zero elsewhere. If Y is the minimum of these four
variables, find the cdf and the pdf of Y.
Hint: P(Y > y) = P(Xi > y, i = 1, ..., 4).
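The hint in Exercise 2.6.3 can be verified empirically. The sketch below samples from f by inverse-transform sampling (the inversion formula is a standard step, not part of the exercise statement: F(x) = 1 − (1 − x)^3, so X = 1 − (1 − U)^(1/3) for uniform U) and compares the empirical survival function of the minimum with the hint's prediction P(Y > y) = [P(X > y)]^4:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
# Inverse-transform sampling: for f(x) = 3(1-x)^2 on (0,1), F(x) = 1 - (1-x)^3,
# so X = 1 - (1 - U)^(1/3) with U uniform on (0, 1).
u = rng.random((n, 4))
x = 1.0 - (1.0 - u) ** (1.0 / 3.0)
y = x.min(axis=1)                 # Y = min(X1, X2, X3, X4)

# Per the hint, P(Y > t) = [P(X > t)]^4 = [(1 - t)^3]^4 = (1 - t)^12.
for t in (0.05, 0.1, 0.2):
    print(t, np.mean(y > t), (1.0 - t) ** 12)
```

The empirical and predicted survival probabilities should agree to Monte Carlo accuracy, which in turn pins down the cdf of Y asked for in the exercise.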