Applied Statistics and Probability for Engineers

12-1 MULTIPLE LINEAR REGRESSION MODEL

12-1.3 Matrix Approach to Multiple Linear Regression

In fitting a multiple regression model, it is much more convenient to express the mathematical operations using matrix notation. Suppose that there are $k$ regressor variables and $n$ observations, $(x_{i1}, x_{i2}, \ldots, x_{ik}, y_i)$, $i = 1, 2, \ldots, n$, and that the model relating the regressors to the response is

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + \epsilon_i, \qquad i = 1, 2, \ldots, n$$

This model is a system of $n$ equations that can be expressed in matrix notation as

$$\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon} \qquad \text{(12-11)}$$
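The matrix form of Equation 12-11 can be illustrated with a small numeric sketch (hypothetical data, assuming NumPy): the design matrix $\mathbf{X}$ is built by prepending a column of ones, which carries the intercept $\beta_0$.

```python
import numpy as np

# Hypothetical data: n = 4 observations on k = 2 regressor variables.
x1 = np.array([2.0, 3.0, 5.0, 7.0])
x2 = np.array([1.0, 4.0, 6.0, 8.0])
y = np.array([5.1, 9.8, 14.2, 19.5])

# Design matrix X for the model y = X*beta + eps: the first column
# of ones corresponds to the intercept beta_0, so X is (n x p)
# with p = k + 1 columns.
X = np.column_stack([np.ones_like(x1), x1, x2])

print(X.shape)  # (4, 3)
```

Each row of `X` is $(1, x_{i1}, x_{i2})$, matching one scalar equation of the system above.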
where

$$\mathbf{y} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{bmatrix} \qquad \mathbf{X} = \begin{bmatrix} 1 & x_{11} & x_{12} & \cdots & x_{1k} \\ 1 & x_{21} & x_{22} & \cdots & x_{2k} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_{n1} & x_{n2} & \cdots & x_{nk} \end{bmatrix}$$

and

$$\boldsymbol{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{bmatrix} \qquad \boldsymbol{\epsilon} = \begin{bmatrix} \epsilon_1 \\ \epsilon_2 \\ \vdots \\ \epsilon_n \end{bmatrix}$$

In general, $\mathbf{y}$ is an $(n \times 1)$ vector of the observations, $\mathbf{X}$ is an $(n \times p)$ matrix of the levels of the independent variables, $\boldsymbol{\beta}$ is a $(p \times 1)$ vector of the regression coefficients, and $\boldsymbol{\epsilon}$ is an $(n \times 1)$ vector of random errors, where $p = k + 1$.

We wish to find the vector of least squares estimators, $\hat{\boldsymbol{\beta}}$, that minimizes

$$L = \sum_{i=1}^{n} \epsilon_i^2 = \boldsymbol{\epsilon}'\boldsymbol{\epsilon} = (\mathbf{y} - \mathbf{X}\boldsymbol{\beta})'(\mathbf{y} - \mathbf{X}\boldsymbol{\beta})$$

The least squares estimator $\hat{\boldsymbol{\beta}}$ is the solution for $\boldsymbol{\beta}$ in the equations

$$\frac{\partial L}{\partial \boldsymbol{\beta}} = \mathbf{0}$$

We will not give the details of taking the derivatives above; however, the resulting equations that must be solved are

$$\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}'\mathbf{y} \qquad \text{(12-12)}$$

Equations 12-12 are the least squares normal equations in matrix form. They are identical to the scalar form of the normal equations given earlier in Equations 12-10. To solve the normal equations, multiply both sides of Equations 12-12 by the inverse of $\mathbf{X}'\mathbf{X}$. Therefore, the least squares estimate of $\boldsymbol{\beta}$ is

$$\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y} \qquad \text{(12-13)}$$
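Equation 12-13 can be checked numerically; the following sketch (hypothetical data, assuming NumPy) solves the normal equations of Equations 12-12 and cross-checks the result against NumPy's own least squares routine. Note that in computation it is preferable to solve the linear system $\mathbf{X}'\mathbf{X}\hat{\boldsymbol{\beta}} = \mathbf{X}'\mathbf{y}$ directly rather than explicitly inverting $\mathbf{X}'\mathbf{X}$.

```python
import numpy as np

# Hypothetical data: n = 5 observations, k = 2 regressors, p = 3.
X = np.column_stack([
    np.ones(5),                               # intercept column
    np.array([1.0, 2.0, 3.0, 4.0, 5.0]),      # x1
    np.array([2.0, 1.0, 4.0, 3.0, 5.0]),      # x2
])
y = np.array([4.9, 6.1, 9.2, 9.8, 13.1])

# Normal equations (12-12): X'X beta_hat = X'y.  Solving this system
# computes the estimator of Equation 12-13 without forming the inverse.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against the library's least squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(beta_hat, beta_lstsq))  # True
```

Both computations minimize the same residual sum of squares $L$, so they agree whenever $\mathbf{X}'\mathbf{X}$ is nonsingular, i.e., whenever the columns of $\mathbf{X}$ are linearly independent.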

