

REMARK


In matrix notation Equation 9.9.1 can be written as

$$
\begin{pmatrix} 1{,}291.1 \\ 9{,}549.3 \\ 77{,}758.9 \end{pmatrix}
=
\begin{pmatrix} 10 & 55 & 385 \\ 55 & 385 & 3{,}025 \\ 385 & 3{,}025 & 25{,}333 \end{pmatrix}
\begin{pmatrix} B_0 \\ B_1 \\ B_2 \end{pmatrix}
$$

which has the solution

$$
\begin{pmatrix} B_0 \\ B_1 \\ B_2 \end{pmatrix}
=
\begin{pmatrix} 10 & 55 & 385 \\ 55 & 385 & 3{,}025 \\ 385 & 3{,}025 & 25{,}333 \end{pmatrix}^{-1}
\begin{pmatrix} 1{,}291.1 \\ 9{,}549.3 \\ 77{,}758.9 \end{pmatrix}
$$

*9.10 Multiple Linear Regression


In the majority of applications, the response of an experiment can be predicted more adequately not on the basis of a single independent input variable but on a collection of such variables. Indeed, a typical situation is one in which there is a set of, say, $k$ input variables and the response $Y$ is related to them by the relation

$$
Y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + e
$$

where $x_j$, $j = 1, \ldots, k$, is the level of the $j$th input variable and $e$ is a random error that we shall assume is normally distributed with mean 0 and (constant) variance $\sigma^2$. The parameters $\beta_0, \beta_1, \ldots, \beta_k$ and $\sigma^2$ are assumed to be unknown and must be estimated from the data, which we shall suppose will consist of the values of $Y_1, \ldots, Y_n$, where $Y_i$ is the response level corresponding to the $k$ input levels $x_{i1}, x_{i2}, \ldots, x_{ik}$. That is, the $Y_i$ are related to these input levels through

$$
E[Y_i] = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik}
$$

If we let $B_0, B_1, \ldots, B_k$ denote estimators of $\beta_0, \ldots, \beta_k$, then the sum of the squared differences between the $Y_i$ and their estimated expected values is

$$
\sum_{i=1}^{n} \left( Y_i - B_0 - B_1 x_{i1} - B_2 x_{i2} - \cdots - B_k x_{ik} \right)^2
$$

The least squares estimators are those values of $B_0, B_1, \ldots, B_k$ that minimize the foregoing.
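
Since this minimization is an ordinary least squares problem, the estimators can be computed numerically. The following is a minimal sketch using NumPy's `lstsq` routine; the data values, and the use of NumPy, are illustrative assumptions rather than anything taken from the text.

```python
import numpy as np

# Hypothetical data (illustrative only, not from the text): n = 5
# responses with k = 2 input variables per response
x = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
Y = np.array([7.1, 6.9, 14.2, 13.8, 19.5])

# Design matrix: a leading column of ones carries the intercept B0
X = np.column_stack([np.ones(len(Y)), x])

# lstsq minimizes sum_i (Y_i - B0 - B1*x_i1 - B2*x_i2)^2 over B
B, res, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
print("least squares estimates B0, B1, B2:", B)
```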


* Optional section.