9.10 Multiple Linear Regression


To determine the least squares estimators, we take partial derivatives of the preceding sum of squares, first with respect to $B_0$, then $B_1, \ldots$, and finally $B_k$. Each of these $k+1$ partial derivatives equals $-2$ times one of the sums below; setting the derivatives equal to 0 and dividing out the common factor $-2$ gives the following set of equations:


\[
\begin{aligned}
\sum_{i=1}^{n}\bigl(Y_i - B_0 - B_1 x_{i1} - B_2 x_{i2} - \cdots - B_k x_{ik}\bigr) &= 0 \\
\sum_{i=1}^{n} x_{i1}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0 \\
\sum_{i=1}^{n} x_{i2}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0 \\
&\;\;\vdots \\
\sum_{i=1}^{n} x_{ik}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0
\end{aligned}
\]

Rewriting these equations shows that the least squares estimators $B_0, B_1, \ldots, B_k$ satisfy the following set of linear equations, called the normal equations:


\[
\begin{aligned}
\sum_{i=1}^{n} Y_i &= n B_0 + B_1 \sum_{i=1}^{n} x_{i1} + B_2 \sum_{i=1}^{n} x_{i2} + \cdots + B_k \sum_{i=1}^{n} x_{ik} \\
\sum_{i=1}^{n} x_{i1} Y_i &= B_0 \sum_{i=1}^{n} x_{i1} + B_1 \sum_{i=1}^{n} x_{i1}^2 + B_2 \sum_{i=1}^{n} x_{i1} x_{i2} + \cdots + B_k \sum_{i=1}^{n} x_{i1} x_{ik} \\
&\;\;\vdots \\
\sum_{i=1}^{n} x_{ik} Y_i &= B_0 \sum_{i=1}^{n} x_{ik} + B_1 \sum_{i=1}^{n} x_{ik} x_{i1} + B_2 \sum_{i=1}^{n} x_{ik} x_{i2} + \cdots + B_k \sum_{i=1}^{n} x_{ik}^2
\end{aligned}
\tag{9.10.1}
\]
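
Because the normal equations are linear in $B_0, B_1, \ldots, B_k$, they can be assembled and solved numerically. The sketch below is illustrative only (the function name normal_equations and the choice of numpy are not from the text); it builds each sum appearing in (9.10.1) directly and solves the resulting $(k+1)\times(k+1)$ system.

```python
import numpy as np

def normal_equations(x, y):
    """Assemble and solve the normal equations (9.10.1).

    x : (n, k) array whose (i, j) entry is the input value x_{ij}
    y : length-n array of the responses Y_1, ..., Y_n
    Returns the least squares estimates (B0, B1, ..., Bk).
    """
    n, k = x.shape
    A = np.empty((k + 1, k + 1))
    b = np.empty(k + 1)
    # First equation: sum_i Y_i = n*B0 + B1*sum_i x_i1 + ... + Bk*sum_i x_ik
    A[0, 0] = n
    A[0, 1:] = x.sum(axis=0)
    b[0] = y.sum()
    # Equation for input j: sum_i x_ij*Y_i = B0*sum_i x_ij + sum_l Bl*sum_i x_ij*x_il
    for j in range(k):
        A[j + 1, 0] = x[:, j].sum()
        A[j + 1, 1:] = (x[:, j, None] * x).sum(axis=0)
        b[j + 1] = (x[:, j] * y).sum()
    return np.linalg.solve(A, b)
```

For $k = 1$ this system reduces to the two normal equations of simple linear regression encountered earlier in the chapter.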

Before solving the normal equations, it is convenient to introduce matrix notation. If
we let


\[
Y = \begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}, \qquad
X = \begin{pmatrix}
1 & x_{11} & x_{12} & \cdots & x_{1k} \\
1 & x_{21} & x_{22} & \cdots & x_{2k} \\
\vdots & \vdots & \vdots & & \vdots \\
1 & x_{n1} & x_{n2} & \cdots & x_{nk}
\end{pmatrix}
\]
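
In this notation the normal equations can be written compactly: with $B = (B_0, B_1, \ldots, B_k)^T$, the entire system (9.10.1) collapses to the single matrix equation $X^T X B = X^T Y$. A minimal numpy sketch of this matrix form, using made-up data chosen purely for illustration:

```python
import numpy as np

# Made-up data: n = 5 observations on k = 2 input variables.
x = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
Y = np.array([3.1, 3.9, 7.2, 8.1, 10.8])

X = np.column_stack([np.ones(len(Y)), x])  # row i is (1, x_i1, ..., x_ik)

# Matrix form of the normal equations: (X^T X) B = X^T Y
B = np.linalg.solve(X.T @ X, X.T @ Y)
print(B)  # estimates B0, B1, B2; agrees with normal_equations(x, Y) above
```

Solving the linear system directly with np.linalg.solve is preferable to forming the inverse of $X^T X$ explicitly, both for speed and for numerical stability.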