Introduction to Probability and Statistics for Engineers and Scientists




9.2 Least Squares Estimators of the Regression Parameters


Suppose that the responses Y_i corresponding to the input values x_i, i = 1, ..., n are to be
observed and used to estimate α and β in a simple linear regression model. To determine
estimators of α and β we reason as follows: If A is the estimator of α and B of β, then the
estimator of the response corresponding to the input variable x_i would be A + Bx_i. Since
the actual response is Y_i, the squared difference is (Y_i − A − Bx_i)^2, and so if A and B are
the estimators of α and β, then the sum of the squared differences between the estimated
responses and the actual response values, call it SS, is given by


SS = \sum_{i=1}^{n} (Y_i - A - Bx_i)^2
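Before minimizing SS, it can help to compute it directly for candidate values of A and B. The following sketch does exactly that; the function and argument names are illustrative, not from the text:

```python
def sum_of_squares(x, y, a, b):
    """Sum of squared residuals SS = sum_i (Y_i - A - B*x_i)^2
    for candidate estimates a of alpha and b of beta.
    (Hypothetical helper, names chosen for this illustration.)"""
    return sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
```

For data lying exactly on the line y = 1 + 2x, the choice a = 1, b = 2 drives SS to zero, while any other choice yields a strictly positive value.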

The method of least squares chooses as estimators of α and β the values of A and B that
minimize SS. To determine these estimators, we differentiate SS first with respect to A and
then with respect to B as follows:


\frac{\partial SS}{\partial A} = -2 \sum_{i=1}^{n} (Y_i - A - Bx_i)

\frac{\partial SS}{\partial B} = -2 \sum_{i=1}^{n} x_i (Y_i - A - Bx_i)

Setting these partial derivatives equal to zero yields the following equations for the
minimizing values A and B:


\sum_{i=1}^{n} Y_i = nA + B \sum_{i=1}^{n} x_i \qquad (9.2.1)

\sum_{i=1}^{n} x_i Y_i = A \sum_{i=1}^{n} x_i + B \sum_{i=1}^{n} x_i^2

The Equations 9.2.1 are known as the normal equations. If we let


\bar{Y} = \sum_i Y_i / n, \qquad \bar{x} = \sum_i x_i / n

then we can write the first normal equation as


A = \bar{Y} - B\bar{x} \qquad (9.2.2)
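The two normal equations (9.2.1) form a 2×2 linear system in A and B and can be solved in closed form. A minimal sketch of that solution follows; the function name and variable names are assumptions made for this illustration, and the formula for B comes from eliminating A between the two equations:

```python
def least_squares(x, y):
    """Solve the normal equations (9.2.1) for the least-squares
    estimates (A, B) of (alpha, beta) in Y = alpha + beta*x.
    (Illustrative sketch; names are not from the text.)"""
    n = len(x)
    sx = sum(x)                                    # sum of x_i
    sy = sum(y)                                    # sum of Y_i
    sxy = sum(xi * yi for xi, yi in zip(x, y))     # sum of x_i * Y_i
    sxx = sum(xi * xi for xi in x)                 # sum of x_i^2
    # Eliminating A from the two normal equations gives
    # B = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    # Equation (9.2.2): A = Ybar - B * xbar
    a = sy / n - b * sx / n
    return a, b
```

Applied to data generated exactly by the line Y = 1 + 2x (for example x = 0, 1, 2 with Y = 1, 3, 5), this returns A = 1 and B = 2, as it should since SS can be driven to zero there.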