Robert V. Hogg, Joseph W. McKean, Allen T. Craig


(c) As depicted in Figure 9.6.3, show that the angle between the vectors θ̂ and ê
is a right angle.

(d) Show that the residuals sum to zero; i.e., 1′ê = 0.

9.6.14. Fit y = a + x to the data

    x   0  1  2
    y   1  3  4

by the method of least squares.
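Since the slope is fixed at 1, only the intercept a is free, and minimizing Σ(yᵢ − a − xᵢ)² over a gives â = mean(yᵢ − xᵢ). A minimal Python sketch of this calculation (not from the text):

```python
# Least-squares fit of y = a + x: the slope is fixed at 1, so the only
# free parameter is the intercept a.  Minimizing sum((y_i - a - x_i)^2)
# over a gives a_hat = mean(y_i - x_i).
x = [0, 1, 2]
y = [1, 3, 4]

a_hat = sum(yi - xi for xi, yi in zip(x, y)) / len(x)
print(a_hat)  # 5/3 ≈ 1.6667
```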

9.6.15. Fit by the method of least squares the plane z = a + bx + cy to the five
points (x, y, z): (−1, −2, 5), (0, −2, 4), (0, 0, 4), (1, 0, 2), (2, 1, 0).
Let the R vectors x, y, z contain the values for x, y, and z. Then the LS fit is
computed by lm(z ~ x + y).
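The R call above solves the normal equations (X′X)β = X′z internally. As a sketch of what that computation looks like, here is a pure-Python version using a small Gaussian elimination (the helper `solve3` is my own, not from the text):

```python
# Least-squares plane z = a + b*x + c*y via the normal equations
# (X'X) beta = X'z, solved by Gaussian elimination with partial
# pivoting.  This mirrors what the R call lm(z ~ x + y) computes.
pts = [(-1, -2, 5), (0, -2, 4), (0, 0, 4), (1, 0, 2), (2, 1, 0)]
X = [[1, x, y] for x, y, _ in pts]      # design matrix rows (1, x, y)
z = [p[2] for p in pts]

# Build X'X (3x3) and X'z (3x1).
XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
Xtz = [sum(r[i] * zi for r, zi in zip(X, z)) for i in range(3)]

def solve3(A, b):
    """Solve the 3x3 system A beta = b by Gaussian elimination."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    beta = [0.0] * 3
    for r in reversed(range(3)):
        beta[r] = (M[r][3] - sum(M[r][c] * beta[c]
                                 for c in range(r + 1, 3))) / M[r][r]
    return beta

a, b, c = solve3(XtX, Xtz)
print(round(a, 4), round(b, 4), round(c, 4))  # a = 48/13, b = -45/26, c = 0
```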

9.6.16. Let the 4 × 1 matrix Y be multivariate normal N(Xβ, σ²I), where the
4 × 3 matrix X equals

        ⎡ 1   1   2 ⎤
    X = ⎢ 1  −1   2 ⎥
        ⎢ 1   0  −3 ⎥
        ⎣ 1   0  −1 ⎦

and β is the 3 × 1 regression coefficient matrix.


(a) Find the mean matrix and the covariance matrix of β̂ = (X′X)⁻¹X′Y.

(b) If we observe Y′ to be equal to (6, 1, 11, 3), compute β̂.
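One way to check a hand computation for part (b): the columns of this particular X are mutually orthogonal, so X′X is diagonal and each component of β̂ reduces to a projection coefficient (colⱼ · Y)/(colⱼ · colⱼ). A short Python sketch (not from the text):

```python
# Numerical check for part (b).  The columns of X are mutually
# orthogonal, so X'X is diagonal and each beta_hat_j is simply
# (col_j . Y) / (col_j . col_j).
X = [[1, 1, 2], [1, -1, 2], [1, 0, -3], [1, 0, -1]]
Y = [6, 1, 11, 3]

cols = list(zip(*X))
beta_hat = [sum(c * y for c, y in zip(col, Y)) / sum(c * c for c in col)
            for col in cols]
print(beta_hat)  # [21/4, 5/2, -11/9]
```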

9.6.17. Suppose Y is an n × 1 random vector, X is an n × p matrix of known
constants of rank p, and β is a p × 1 vector of regression coefficients. Let Y have a
N(Xβ, σ²I) distribution. Obtain the pdf of β̂ = (X′X)⁻¹X′Y.
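As a hint toward the solution, the key step is that β̂ is a linear transformation of the multivariate normal vector Y, so its mean and covariance follow directly (and the pdf is then the multivariate normal density with those parameters):

```latex
\hat{\beta} = (X'X)^{-1}X'Y = AY, \qquad A = (X'X)^{-1}X',
\]
\[
E[\hat{\beta}] = AX\beta = \beta, \qquad
\mathrm{Cov}(\hat{\beta}) = A(\sigma^2 I)A' = \sigma^2 (X'X)^{-1},
\]
so that
\[
\hat{\beta} \sim N_p\!\left(\beta,\; \sigma^2 (X'X)^{-1}\right).
```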


9.6.18. Let the independent normal random variables Y₁, Y₂, …, Yₙ have, respec-
tively, the probability density functions N(μ, γ²xᵢ²), i = 1, 2, …, n, where the given
x₁, x₂, …, xₙ are not all equal and none of which is zero. Discuss the test of
the hypothesis H₀: γ = 1, μ unspecified, against all alternatives H₁: γ ≠ 1, μ
unspecified.


9.7 A Test of Independence

Let X and Y have a bivariate normal distribution with means μ₁ and μ₂, posi-
tive variances σ₁² and σ₂², and correlation coefficient ρ. We wish to test the hy-
pothesis that X and Y are independent. Because two jointly normally distributed
random variables are independent if and only if ρ = 0, we test the hypothesis
H₀: ρ = 0 against the hypothesis H₁: ρ ≠ 0. A likelihood ratio test is used.
Let (X₁, Y₁), (X₂, Y₂), …, (Xₙ, Yₙ) denote a random sample of size n > 2 from the
