
for $(t_1, t_2)$ in an open neighborhood of $(0, 0)$.

The coefficient of $(-2t_1)^r$ on the right side of (9.9.11) is $\lambda_1 \cdots \lambda_r |I - 2t_2 D|$. It is not so easy to find the coefficient of $(-2t_1)^r$ on the left side of equation (9.9.11). Conceive of expanding this determinant in terms of minors of order $r$ formed from the first $r$ columns. One term in this expansion is the product of the minor of order $r$ in the upper left-hand corner, namely $|I_r - 2t_1\Lambda_{11} - 2t_2 D_{11}|$, and the minor of order $n - r$ in the lower right-hand corner, namely $|I_{n-r} - 2t_2 D_{22}|$. Moreover, this product is the only term in the expansion of the determinant that involves $(-2t_1)^r$, because every other product of complementary minors uses fewer than $r$ of the diagonal entries that contain $t_1$. Thus the coefficient of $(-2t_1)^r$ in the left-hand member of equation (9.9.11) is $\lambda_1 \cdots \lambda_r |I_{n-r} - 2t_2 D_{22}|$. If we equate these coefficients of $(-2t_1)^r$, we have


$$|I - 2t_2 D| = |I_{n-r} - 2t_2 D_{22}|, \qquad (9.9.12)$$

for $t_2$ in an open neighborhood of $0$. Equation (9.9.12) implies that the nonzero eigenvalues of the matrices $D$ and $D_{22}$ are the same (see Exercise 9.9.8). Recall that the sum of the squares of the eigenvalues of a symmetric matrix is equal to the sum of the squares of the elements of that matrix (see Exercise 9.8.8). Thus the sum of the squares of the elements of the matrix $D$ is equal to the sum of the squares of the elements of $D_{22}$. Since the elements of the matrix $D$ are real, it follows that each of the elements of $D_{11}$, $D_{12}$, and $D_{21}$ is zero. Hence we can write

$$0 = \Lambda_1 D = \Gamma_1 A \Gamma_1' \Gamma_1 B \Gamma_1' = \Gamma_1 A B \Gamma_1',$$

using $\Gamma_1' \Gamma_1 = I$. Because $\Gamma_1$ is an orthogonal matrix, $AB = 0$.
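
Both algebraic facts used in this argument can be checked numerically. The following is a minimal sketch, not from the text: the dimensions $n = 6$, $r = 3$, the eigenvalues, and the random symmetric $D$ are arbitrary illustrative choices. It extracts the coefficient of $(-2t_1)^r$ from the determinant on the left of (9.9.11) by a polynomial fit in $t_1$ at a fixed $t_2$ (the minor-expansion claim holds for any symmetric $D$) and also confirms the eigenvalue identity of Exercise 9.8.8.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 3                                              # illustrative sizes
lam = np.array([2.0, 1.5, 0.5])                          # lambda_1, ..., lambda_r
Lam1 = np.diag(np.concatenate([lam, np.zeros(n - r)]))   # Lambda_1, rank r
D = rng.standard_normal((n, n))
D = (D + D.T) / 2                                        # arbitrary symmetric D
D22 = D[r:, r:]

t2 = 0.3                      # fix t2; the determinant is then a polynomial in t1
t1 = np.linspace(-1.0, 1.0, 2 * n + 1)
vals = [np.linalg.det(np.eye(n) - 2 * s * Lam1 - 2 * t2 * D) for s in t1]
coef = np.polynomial.polynomial.polyfit(t1, vals, n)     # coefficients in t1
lhs = coef[r] / (-2.0) ** r                              # coefficient of (-2 t1)^r

rhs = lam.prod() * np.linalg.det(np.eye(n - r) - 2 * t2 * D22)
print(np.isclose(lhs, rhs))   # True: only the corner minors contribute t1^r

# Exercise 9.8.8: sum of squared eigenvalues = sum of squared elements
eigs = np.linalg.eigvalsh(D)
print(np.isclose((eigs ** 2).sum(), (D ** 2).sum()))     # True
```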

Remark 9.9.2. Theorem 9.9.1 remains valid if the random sample is from a distribution that is $N(\mu, \sigma^2)$, whatever the real value of $\mu$. Moreover, Theorem 9.9.1 may be extended to quadratic forms in random variables that have a joint multivariate normal distribution with a positive definite covariance matrix $\Sigma$. The necessary and sufficient condition for the independence of two such quadratic forms with symmetric matrices $A$ and $B$ then becomes $A\Sigma B = 0$. In our Theorem 9.9.1, we have $\Sigma = \sigma^2 I$, so that $A\Sigma B = A\sigma^2 I B = \sigma^2 AB = 0$.
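
As a quick sanity check of the sufficiency direction of Theorem 9.9.1, one can simulate two quadratic forms whose matrices satisfy $AB = 0$. The sketch below is an illustrative construction, not from the text: it uses projections onto two orthogonal unit vectors, so $AB = 0$ by design. A near-zero sample correlation is of course only a symptom consistent with independence, not a proof of it.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma = 4, 2.0

# Projections onto span{u} and span{v} with u perpendicular to v, so AB = 0
u = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
v = np.array([0.0, 0.0, 1.0, 1.0]) / np.sqrt(2)
A, B = np.outer(u, u), np.outer(v, v)
print(np.allclose(A @ B, 0))                   # True: the independence condition

X = sigma * rng.standard_normal((200_000, n))  # rows are N(0, sigma^2 I) samples
Q1 = np.einsum('ij,jk,ik->i', X, A, X)         # row-wise quadratic forms x'Ax
Q2 = np.einsum('ij,jk,ik->i', X, B, X)
print(np.corrcoef(Q1, Q2)[0, 1])               # near 0, consistent with independence
```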


The following theorem is from Hogg and Craig (1958).

Theorem 9.9.2 (Hogg and Craig). Define the sum $Q = Q_1 + \cdots + Q_{k-1} + Q_k$, where $Q, Q_1, \ldots, Q_{k-1}, Q_k$ are $k + 1$ random variables that are quadratic forms in the observations of a random sample of size $n$ from a distribution that is $N(0, \sigma^2)$. Let $Q/\sigma^2$ be $\chi^2(r)$, let $Q_i/\sigma^2$ be $\chi^2(r_i)$, $i = 1, 2, \ldots, k - 1$, and let $Q_k$ be nonnegative. Then the random variables $Q_1, Q_2, \ldots, Q_k$ are independent and, hence, $Q_k/\sigma^2$ is $\chi^2(r_k = r - r_1 - \cdots - r_{k-1})$.
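
A standard illustration of the theorem (my choice of example, not stated on this page) takes $k = 2$ with $Q = \sum X_i^2$, $Q_1 = n\overline{X}^2$, and $Q_2 = \sum (X_i - \overline{X})^2 \geq 0$. Here $Q/\sigma^2$ is $\chi^2(n)$ and $Q_1/\sigma^2$ is $\chi^2(1)$, so the theorem gives $Q_2/\sigma^2 \sim \chi^2(n-1)$, independent of $Q_1$. A simulation sketch checking the implied moments:

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma, reps = 5, 1.5, 200_000

X = sigma * rng.standard_normal((reps, n))   # reps samples of size n from N(0, sigma^2)
xbar = X.mean(axis=1)
Q  = (X ** 2).sum(axis=1)                    # Q  / sigma^2 ~ chi^2(n)
Q1 = n * xbar ** 2                           # Q1 / sigma^2 ~ chi^2(1)
Q2 = Q - Q1                                  # = sum (X_i - xbar)^2 >= 0

W = Q2 / sigma ** 2
print(W.mean(), W.var())                     # approx n-1 and 2(n-1): chi^2(n-1) moments
print(np.corrcoef(Q1, Q2)[0, 1])             # near 0, consistent with independence
```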


Proof: Take first the case of $k = 2$ and let the real symmetric matrices of $Q$, $Q_1$, and $Q_2$ be denoted, respectively, by $A$, $A_1$, and $A_2$. We are given that $Q = Q_1 + Q_2$ or, equivalently, that $A = A_1 + A_2$. We are also given that $Q/\sigma^2$ is $\chi^2(r)$ and that $Q_1/\sigma^2$ is $\chi^2(r_1)$. In accordance with Theorem 9.8.4, we have $A^2 = A$ and $A_1^2 = A_1$.
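
For a concrete instance of the idempotency just invoked, the matrix of $Q_1 = n\overline{X}^2$ from the simulation example above is $A_1 = \frac{1}{n}J$, where $J$ is the $n \times n$ matrix of ones. A brief, purely illustrative check:

```python
import numpy as np

n = 5
A1 = np.full((n, n), 1.0 / n)      # matrix of the quadratic form n * xbar^2
print(np.allclose(A1 @ A1, A1))    # idempotent, as Theorem 9.8.4 requires
print(np.linalg.matrix_rank(A1))   # rank 1, matching Q1 / sigma^2 ~ chi^2(1)
```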
