To determine the least squares estimators, we take partial derivatives of the preceding sum of squares, first with respect to $B_0$, then to $B_1,\ldots,$ then to $B_k$. On setting these $k+1$ partial derivatives equal to 0, we obtain the following set of equations:
$$\begin{aligned}
\sum_{i=1}^{n}\bigl(Y_i - B_0 - B_1 x_{i1} - B_2 x_{i2} - \cdots - B_k x_{ik}\bigr) &= 0\\
\sum_{i=1}^{n} x_{i1}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0\\
\sum_{i=1}^{n} x_{i2}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0\\
&\ \,\vdots\\
\sum_{i=1}^{n} x_{ik}\bigl(Y_i - B_0 - B_1 x_{i1} - \cdots - B_k x_{ik}\bigr) &= 0
\end{aligned}$$

Rewriting these equations shows that the least squares estimators $B_0, B_1, \ldots, B_k$ satisfy the following set of linear equations, called the *normal equations*:
$$\begin{aligned}
\sum_{i=1}^{n} Y_i &= nB_0 + B_1\sum_{i=1}^{n} x_{i1} + B_2\sum_{i=1}^{n} x_{i2} + \cdots + B_k\sum_{i=1}^{n} x_{ik}\\
\sum_{i=1}^{n} x_{i1} Y_i &= B_0\sum_{i=1}^{n} x_{i1} + B_1\sum_{i=1}^{n} x_{i1}^2 + B_2\sum_{i=1}^{n} x_{i1}x_{i2} + \cdots + B_k\sum_{i=1}^{n} x_{i1}x_{ik}\\
&\ \,\vdots\\
\sum_{i=1}^{n} x_{ik} Y_i &= B_0\sum_{i=1}^{n} x_{ik} + B_1\sum_{i=1}^{n} x_{ik}x_{i1} + B_2\sum_{i=1}^{n} x_{ik}x_{i2} + \cdots + B_k\sum_{i=1}^{n} x_{ik}^2
\end{aligned}\tag{9.10.1}$$
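To see the normal equations in action, here is a minimal numerical sketch (Python with NumPy; the data values are invented for illustration). It assembles the $k+1$ equations term by term from the sums above and solves the resulting linear system:

```python
import numpy as np

# Invented data: n = 5 responses, k = 2 input variables.
# x[i, j-1] holds the value x_{ij}.
x = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([3.1, 3.9, 7.2, 7.8, 10.1])
n, k = x.shape

# Prepend the constant column so that cols[:, 0] = 1 and cols[:, j] = x_{ij}.
cols = np.column_stack([np.ones(n), x])

# Build the (k+1) x (k+1) system exactly as the normal equations read:
# row r is  sum_i x_{ir} Y_i = B_0 sum_i x_{ir} + ... + B_k sum_i x_{ir} x_{ik}.
A = np.empty((k + 1, k + 1))
b = np.empty(k + 1)
for r in range(k + 1):
    b[r] = np.sum(cols[:, r] * y)
    for c in range(k + 1):
        A[r, c] = np.sum(cols[:, r] * cols[:, c])

B = np.linalg.solve(A, b)  # least squares estimators B_0, B_1, ..., B_k
print(B)
```

The nested loop mirrors the equations sum by sum; in practice one would call a dedicated least squares routine rather than form these sums explicitly.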
Before solving the normal equations, it is convenient to introduce matrix notation. If we let
$$\mathbf{Y} = \begin{pmatrix} Y_1\\ Y_2\\ \vdots\\ Y_n \end{pmatrix},\qquad
\mathbf{X} = \begin{pmatrix}
1 & x_{11} & x_{12} & \cdots & x_{1k}\\
1 & x_{21} & x_{22} & \cdots & x_{2k}\\
\vdots & \vdots & \vdots & & \vdots\\
1 & x_{n1} & x_{n2} & \cdots & x_{nk}
\end{pmatrix}$$
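In this notation the $k+1$ normal equations collapse to the single matrix equation $\mathbf{X}^T\mathbf{X}\,\mathbf{B} = \mathbf{X}^T\mathbf{Y}$, which is precisely the system solved componentwise in the earlier sketch. A minimal continuation, using the same invented data:

```python
import numpy as np

# Same invented data as the earlier sketch: n = 5, k = 2.
x = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([3.1, 3.9, 7.2, 7.8, 10.1])

# X as defined above: a leading column of 1s, then the input values.
X = np.column_stack([np.ones(len(y)), x])

# All k+1 normal equations at once: X^T X B = X^T Y.
B = np.linalg.solve(X.T @ X, X.T @ y)
print(B)  # the same B_0, B_1, B_2 as the term-by-term solution
```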