Robert V. Hogg, Joseph W. McKean, Allen T. Craig

Some Special Distributions

Consider the random vector $\mathbf{Z} = (Z_1, \ldots, Z_n)'$, where $Z_1, \ldots, Z_n$ are iid $N(0,1)$ random variables. Then the density of $\mathbf{Z}$ is

\[
f_{\mathbf{Z}}(\mathbf{z}) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}} \exp\left\{ -\frac{1}{2} z_i^2 \right\}
= \left( \frac{1}{2\pi} \right)^{n/2} \exp\left\{ -\frac{1}{2} \sum_{i=1}^{n} z_i^2 \right\}
= \left( \frac{1}{2\pi} \right)^{n/2} \exp\left\{ -\frac{1}{2} \mathbf{z}'\mathbf{z} \right\}, \tag{3.5.5}
\]

for $\mathbf{z} \in \mathbb{R}^n$. Because the $Z_i$s have mean 0, have variance 1, and are uncorrelated, the mean and covariance matrix of $\mathbf{Z}$ are

\[
E[\mathbf{Z}] = \mathbf{0} \quad \text{and} \quad \operatorname{Cov}[\mathbf{Z}] = \mathbf{I}_n, \tag{3.5.6}
\]

where $\mathbf{I}_n$ denotes the identity matrix of order $n$. Recall that the mgf of $Z_i$ evaluated at $t_i$ is $\exp\{t_i^2/2\}$. Hence, because the $Z_i$s are independent, the mgf of $\mathbf{Z}$ is


\[
M_{\mathbf{Z}}(\mathbf{t}) = E[\exp\{\mathbf{t}'\mathbf{Z}\}]
= E\left[ \prod_{i=1}^{n} \exp\{t_i Z_i\} \right]
= \prod_{i=1}^{n} E[\exp\{t_i Z_i\}]
= \exp\left\{ \frac{1}{2} \sum_{i=1}^{n} t_i^2 \right\}
= \exp\left\{ \frac{1}{2} \mathbf{t}'\mathbf{t} \right\}, \tag{3.5.7}
\]

for all $\mathbf{t} \in \mathbb{R}^n$. We say that $\mathbf{Z}$ has a \textbf{multivariate normal distribution} with mean vector $\mathbf{0}$ and covariance matrix $\mathbf{I}_n$. We abbreviate this by saying that $\mathbf{Z}$ has an $N_n(\mathbf{0}, \mathbf{I}_n)$ distribution.
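As a quick numerical illustration (a sketch of ours, not part of the text), the following numpy snippet checks that the product and vector forms of the density in (3.5.5) agree, that simulated draws of $\mathbf{Z}$ have mean and covariance as in (3.5.6), and that the Monte Carlo estimate of the mgf matches $\exp\{\mathbf{t}'\mathbf{t}/2\}$ from (3.5.7). The sample size, seed, and test point $\mathbf{t}$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

def density_product(z):
    # Product form of (3.5.5): prod_i (1/sqrt(2*pi)) * exp(-z_i^2 / 2)
    return np.prod(np.exp(-z**2 / 2) / np.sqrt(2 * np.pi))

def density_vector(z):
    # Vector form of (3.5.5): (1/(2*pi))^(n/2) * exp(-z'z / 2)
    return (2 * np.pi) ** (-n / 2) * np.exp(-z @ z / 2)

z = rng.standard_normal(n)
assert np.isclose(density_product(z), density_vector(z))

# Each row of Z is one draw of the random vector Z = (Z_1, ..., Z_n)'
Z = rng.standard_normal((200_000, n))
assert np.allclose(Z.mean(axis=0), np.zeros(n), atol=0.05)          # E[Z] = 0
assert np.allclose(np.cov(Z, rowvar=False), np.eye(n), atol=0.05)   # Cov[Z] = I_n

# Monte Carlo check of (3.5.7): E[exp(t'Z)] ~= exp(t't / 2)
t = np.array([0.2, -0.1, 0.3])   # arbitrary test point
mgf_mc = np.exp(Z @ t).mean()
assert np.isclose(mgf_mc, np.exp(t @ t / 2), atol=0.02)
```

The tolerances are loose because the mean, covariance, and mgf checks are Monte Carlo estimates; the two density forms agree to machine precision.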
For the general case, suppose $\mathbf{\Sigma}$ is an $n \times n$, symmetric, and positive semi-definite matrix. Then from linear algebra, we can always decompose $\mathbf{\Sigma}$ as

\[
\mathbf{\Sigma} = \mathbf{\Gamma}' \mathbf{\Lambda} \mathbf{\Gamma}, \tag{3.5.8}
\]

where $\mathbf{\Lambda}$ is the diagonal matrix $\mathbf{\Lambda} = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$, $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$ are the eigenvalues of $\mathbf{\Sigma}$, and the columns of $\mathbf{\Gamma}'$, $\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_n$, are the corresponding eigenvectors. This decomposition is called the \textbf{spectral decomposition} of $\mathbf{\Sigma}$. The matrix $\mathbf{\Gamma}$ is orthogonal, i.e., $\mathbf{\Gamma}^{-1} = \mathbf{\Gamma}'$, and, hence, $\mathbf{\Gamma}\mathbf{\Gamma}' = \mathbf{I}$. As Exercise 3.5.19 shows, we can write the spectral decomposition in another way, as


\[
\mathbf{\Sigma} = \mathbf{\Gamma}' \mathbf{\Lambda} \mathbf{\Gamma} = \sum_{i=1}^{n} \lambda_i \mathbf{v}_i \mathbf{v}_i'. \tag{3.5.9}
\]
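The spectral decomposition can be verified numerically; the following numpy sketch is our own illustration (the example matrix is an arbitrary positive definite choice, not from the text). Note the convention mismatch: `np.linalg.eigh` returns a matrix `V` whose columns are eigenvectors with $\Sigma = V \Lambda V'$, so `V` plays the role of $\Gamma'$ and `V.T` the role of $\Gamma$.

```python
import numpy as np

# Arbitrary symmetric positive definite example matrix (our choice)
Sigma = np.array([[4.0, 2.0, 0.0],
                  [2.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])

lam, V = np.linalg.eigh(Sigma)   # eigenvalues ascending; columns of V are eigenvectors
Gamma = V.T                      # so that Sigma = Gamma' Lambda Gamma, as in (3.5.8)

# Sigma = Gamma' Lambda Gamma
assert np.allclose(Sigma, Gamma.T @ np.diag(lam) @ Gamma)

# Sigma = sum_i lambda_i v_i v_i', the outer-product form (3.5.9)
recon = sum(l * np.outer(v, v) for l, v in zip(lam, V.T))
assert np.allclose(Sigma, recon)

# Gamma is orthogonal: Gamma Gamma' = I
assert np.allclose(Gamma @ Gamma.T, np.eye(3))
```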

Because the $\lambda_i$s are nonnegative, we can define the diagonal matrix $\mathbf{\Lambda}^{1/2} = \operatorname{diag}\{\sqrt{\lambda_1}, \ldots, \sqrt{\lambda_n}\}$. Then the orthogonality of $\mathbf{\Gamma}$ implies

\[
\mathbf{\Sigma} = [\mathbf{\Gamma}' \mathbf{\Lambda}^{1/2} \mathbf{\Gamma}][\mathbf{\Gamma}' \mathbf{\Lambda}^{1/2} \mathbf{\Gamma}].
\]

We define the matrix product in brackets as the \textbf{square root} of the positive semi-definite matrix $\mathbf{\Sigma}$ and write it as

\[
\mathbf{\Sigma}^{1/2} = \mathbf{\Gamma}' \mathbf{\Lambda}^{1/2} \mathbf{\Gamma}. \tag{3.5.10}
\]
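The construction of $\mathbf{\Sigma}^{1/2}$ can be sketched in numpy as follows (our illustration; the $2 \times 2$ matrix below is an arbitrary positive definite choice, not from the text). The check confirms that the matrix built via (3.5.10) is symmetric and squares back to $\mathbf{\Sigma}$.

```python
import numpy as np

# Arbitrary symmetric positive definite example (our choice, det = 8 > 0)
Sigma = np.array([[4.0, 2.0],
                  [2.0, 3.0]])

lam, V = np.linalg.eigh(Sigma)   # Sigma = V diag(lam) V'; V.T plays the role of Gamma
Gamma = V.T

# Sigma^{1/2} = Gamma' Lambda^{1/2} Gamma, as in (3.5.10)
Sigma_half = Gamma.T @ np.diag(np.sqrt(lam)) @ Gamma

assert np.allclose(Sigma_half @ Sigma_half, Sigma)   # (Sigma^{1/2})^2 = Sigma
assert np.allclose(Sigma_half, Sigma_half.T)         # the square root is symmetric
```

This spectral construction works for any positive semi-definite $\mathbf{\Sigma}$, since the $\sqrt{\lambda_i}$ are real whenever the $\lambda_i$ are nonnegative.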