Robert V. Hogg, Joseph W. McKean, Allen T. Craig

9.9. The Independence of Certain Quadratic Forms

9.9.4. Let A be the real symmetric matrix of a quadratic form Q in the observations
of a random sample of size n from a distribution that is N(0, σ^2). Given that Q
and the mean X̄ of the sample are independent, what can be said of the elements
of each row (column) of A?
Hint: Are Q and X̄^2 independent?
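As a numerical sanity check of the hint (a simulation sketch, not a proof), one can compare two choices of A: the centering matrix A = I − (1/n)J, whose rows each sum to zero and for which Q = X′AX = (n−1)S² is independent of X̄, versus A = I, for which Q and X̄ are dependent. The sample size, replication count, and seed below are illustrative assumptions.

```python
# Simulation sketch for Exercise 9.9.4 (illustrative, not a proof).
# For A = I - (1/n)J (each row sums to 0), Q = X'AX is independent of Xbar,
# so corr(Q, Xbar^2) should be near 0; for A = I the correlation is visibly
# positive, signaling dependence.
import random, math

random.seed(12345)
n, reps = 5, 20000

def corr(u, v):
    # Sample Pearson correlation of two equal-length lists.
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

xbar2, q_cent, q_ident = [], [], []
for _ in range(reps):
    x = [random.gauss(0.0, 1.0) for _ in range(n)]
    m = sum(x) / n
    xbar2.append(m * m)
    q_cent.append(sum((xi - m) ** 2 for xi in x))   # X'AX with A = I-(1/n)J
    q_ident.append(sum(xi * xi for xi in x))        # X'AX with A = I

print(corr(xbar2, q_cent))    # near 0: consistent with independence
print(corr(xbar2, q_ident))   # clearly positive: dependence
```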


9.9.5. Let A1, A2, ..., Ak be the matrices of k > 2 quadratic forms Q1, Q2, ..., Qk
in the observations of a random sample of size n from a distribution that is N(0, σ^2).
Prove that the pairwise independence of these forms implies that they are mutually
independent.
Hint: Show that AiAj = 0, i ≠ j, permits E[exp(t1Q1 + t2Q2 + ··· + tkQk)] to
be written as a product of the mgfs of Q1, Q2, ..., Qk.
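The mechanism in the hint can be illustrated numerically (a sketch under assumed choices, not a proof): take the coordinate-projection matrices Ai = ei ei′, which satisfy AiAj = 0 for i ≠ j, so Qi = Xi^2, and check that a Monte Carlo estimate of the joint mgf E[exp(t1Q1 + t2Q2)] agrees with the product of the marginal mgf estimates. The values of t1, t2 and the simulation size are illustrative.

```python
# Illustration for Exercise 9.9.5 (a numerical sketch, not a proof).
# With A_i = e_i e_i' (so Q_i = X_i^2) we have A_i A_j = 0 for i != j, and
# the joint mgf estimate should factor into the product of marginal mgfs.
import random, math

random.seed(7)
reps, t1, t2 = 50000, 0.1, 0.1

def proj(i, n=3):
    # Coordinate-projection matrix e_i e_i' (n x n).
    return [[1.0 if (r == i and c == i) else 0.0 for c in range(n)]
            for r in range(n)]

def matmul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(len(b)))
             for c in range(len(b[0]))] for r in range(len(a))]

# The product of distinct projections is the zero matrix: A_i A_j = 0.
assert all(v == 0.0 for row in matmul(proj(0), proj(1)) for v in row)

joint = m1 = m2 = 0.0
for _ in range(reps):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    q1, q2 = x1 * x1, x2 * x2
    joint += math.exp(t1 * q1 + t2 * q2)
    m1 += math.exp(t1 * q1)
    m2 += math.exp(t2 * q2)
joint, m1, m2 = joint / reps, m1 / reps, m2 / reps

# For independent chi-square(1) forms, both should be near 1/(1-0.2) = 1.25.
print(joint, m1 * m2)
```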


9.9.6. Let X′ = [X1, X2, ..., Xn], where X1, X2, ..., Xn are observations of a ran-
dom sample from a distribution that is N(0, σ^2). Let b′ = [b1, b2, ..., bn] be a
real nonzero vector, and let A be a real symmetric matrix of order n. Prove that
the linear form b′X and the quadratic form X′AX are independent if and only if
b′A = 0. Use this fact to prove that b′X and X′AX are independent if and only
if the two quadratic forms (b′X)^2 = X′bb′X and X′AX are independent.
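The two algebraic facts the exercise rests on can be checked numerically for one assumed choice of b and A (a sketch, not a proof): with b the vector of ones and A = I − (1/n)J the centering matrix, b′A = 0, and the identity (b′x)^2 = x′(bb′)x holds for any x.

```python
# Illustration for Exercise 9.9.6 with assumed choices of b and A:
# b = (1,...,1)' and A = I - (1/n)J (the centering matrix) give b'A = 0,
# and (b'x)^2 = x'(bb')x is an identity in x.
n = 4
b = [1.0] * n
A = [[(1.0 if r == c else 0.0) - 1.0 / n for c in range(n)] for r in range(n)]

# b'A: entry c is sum_r b_r A[r][c], i.e. the c-th column sum of A, which is 0.
bA = [sum(b[r] * A[r][c] for r in range(n)) for c in range(n)]
assert all(abs(v) < 1e-12 for v in bA)

x = [0.3, -1.2, 2.5, 0.4]                                 # arbitrary test vector
lin_sq = sum(bi * xi for bi, xi in zip(b, x)) ** 2        # (b'x)^2
quad = sum(b[r] * x[r] * b[c] * x[c]                      # x'(bb')x
           for r in range(n) for c in range(n))
assert abs(lin_sq - quad) < 1e-12
print(bA, lin_sq, quad)
```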


9.9.7. Let Q1 and Q2 be two nonnegative quadratic forms in the observations of a
random sample from a distribution that is N(0, σ^2). Show that another quadratic
form Q is independent of Q1 + Q2 if and only if Q is independent of each of Q1 and
Q2.
Hint: Consider the orthogonal transformation that diagonalizes the matrix of
Q1 + Q2. After this transformation, what are the forms of the matrices of Q, Q1, and
Q2 if Q and Q1 + Q2 are independent?


9.9.8. Prove that Equation (9.9.12) of this section implies that the nonzero eigen-
values of the matrices D and D22 are the same.
Hint: Let λ = 1/(2t2), t2 ≠ 0, and show that Equation (9.9.12) is equivalent to
|D − λI| = (−λ)^r |D22 − λI_{n−r}|.


9.9.9. Here Q1 and Q2 are quadratic forms in observations of a random sample from
N(0, 1). If Q1 and Q2 are independent and if Q1 + Q2 has a chi-square distribution,
prove that Q1 and Q2 are chi-square variables.


9.9.10. Often in regression the mean of the random variable Y is a linear function
of p values x1, x2, ..., xp, say β1x1 + β2x2 + ··· + βpxp, where β′ = (β1, β2, ..., βp)
are the regression coefficients. Suppose that n values, Y′ = (Y1, Y2, ..., Yn), are
observed for the x-values in X = [xij], where X is an n × p design matrix and its
ith row is associated with Yi, i = 1, 2, ..., n. Assume that Y is multivariate normal
with mean Xβ and variance–covariance matrix σ^2 I, where I is the n × n identity
matrix.


(a) Note that Y1, Y2, ..., Yn are independent. Why?

(b) Since Y should approximately equal its mean Xβ, we estimate β by solving
the normal equations X′Y = X′Xβ for β. Assuming that X′X is non-
singular, solve the equations to get β̂ = (X′X)^{−1}X′Y. Show that β̂ has a
multivariate normal distribution with mean β and variance–covariance matrix
σ^2(X′X)^{−1}.
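The normal-equations computation in part (b) can be sketched numerically; the tiny design matrix and coefficient vector below are assumptions chosen for illustration. With Y = Xβ taken noiseless, the solution β̂ = (X′X)^{−1}X′Y recovers β exactly.

```python
# Sketch of the normal equations for Exercise 9.9.10(b).  The design matrix
# and beta are illustrative assumptions; Y = X beta is noiseless, so
# betahat = (X'X)^{-1} X'Y reproduces beta exactly.
X = [[1.0, 0.0],
     [1.0, 1.0],
     [1.0, 2.0],
     [1.0, 3.0]]          # n = 4 observations, p = 2 regressors
beta = [0.5, 2.0]
Y = [sum(Xij * bj for Xij, bj in zip(row, beta)) for row in X]

# Form X'X (2x2) and X'Y (2x1).
XtX = [[sum(X[i][r] * X[i][c] for i in range(4)) for c in range(2)]
       for r in range(2)]
XtY = [sum(X[i][r] * Y[i] for i in range(4)) for r in range(2)]

# Solve the 2x2 system (X'X) betahat = X'Y by Cramer's rule.
det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
betahat = [(XtY[0] * XtX[1][1] - XtX[0][1] * XtY[1]) / det,
           (XtX[0][0] * XtY[1] - XtY[0] * XtX[1][0]) / det]
print(betahat)   # recovers [0.5, 2.0]
```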