9.2 Least Squares Estimators of the Regression Parameters
Suppose that the responses $Y_i$ corresponding to the input values $x_i$, $i = 1, \ldots, n$, are to be observed and used to estimate $\alpha$ and $\beta$ in a simple linear regression model. To determine estimators of $\alpha$ and $\beta$, we reason as follows: if $A$ is the estimator of $\alpha$ and $B$ of $\beta$, then the estimator of the response corresponding to the input variable $x_i$ would be $A + Bx_i$. Since the actual response is $Y_i$, the squared difference is $(Y_i - A - Bx_i)^2$, and so if $A$ and $B$ are the estimators of $\alpha$ and $\beta$, then the sum of the squared differences between the estimated responses and the actual response values, call it $SS$, is given by
$$SS = \sum_{i=1}^{n} (Y_i - A - Bx_i)^2$$

The method of least squares chooses as estimators of $\alpha$ and $\beta$ the values of $A$ and $B$ that minimize $SS$.
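To make the criterion concrete, here is a minimal Python sketch (the data and function name are hypothetical, not from the text) that evaluates $SS$ at candidate values of $A$ and $B$; a line that fits the data well yields a small $SS$:

```python
import numpy as np

def sum_of_squares(A, B, x, Y):
    """SS = sum_i (Y_i - A - B*x_i)^2 for candidate estimates A and B."""
    return np.sum((Y - A - B * x) ** 2)

# Hypothetical data lying near the line Y = 2x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

print(sum_of_squares(0.0, 2.0, x, Y))   # near-fitting line: SS = 0.11
print(sum_of_squares(5.0, 0.0, x, Y))   # poorly fitting line: SS = 44.91
```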
To determine these estimators, we differentiate $SS$ first with respect to $A$ and then with respect to $B$ as follows:
$$\frac{\partial SS}{\partial A} = -2\sum_{i=1}^{n} (Y_i - A - Bx_i)$$
$$\frac{\partial SS}{\partial B} = -2\sum_{i=1}^{n} x_i (Y_i - A - Bx_i)$$
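The differentiation can be spot-checked symbolically. The sketch below, assuming SymPy is available and fixing a small concrete $n$ (the algebra is identical for any $n$; all names are illustrative), confirms that differentiating $SS$ reproduces the two displayed expressions:

```python
import sympy as sp

n = 3
A, B = sp.symbols('A B')
x = sp.symbols(f'x1:{n + 1}')   # x1, x2, x3
Y = sp.symbols(f'Y1:{n + 1}')   # Y1, Y2, Y3

# The sum-of-squares criterion for n = 3 data points.
SS = sum((Y[i] - A - B * x[i]) ** 2 for i in range(n))

# The two partial derivatives as displayed above.
dSS_dA = -2 * sum(Y[i] - A - B * x[i] for i in range(n))
dSS_dB = -2 * sum(x[i] * (Y[i] - A - B * x[i]) for i in range(n))

assert sp.simplify(sp.diff(SS, A) - dSS_dA) == 0
assert sp.simplify(sp.diff(SS, B) - dSS_dB) == 0
print("partial derivatives match the displayed formulas")
```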
Setting these partial derivatives equal to zero yields the following equations for the minimizing values $A$ and $B$:

$$\sum_{i=1}^{n} Y_i = nA + B\sum_{i=1}^{n} x_i \qquad\qquad (9.2.1)$$
$$\sum_{i=1}^{n} x_i Y_i = A\sum_{i=1}^{n} x_i + B\sum_{i=1}^{n} x_i^2$$

Equations 9.2.1 are known as the normal equations. If we let
$$\overline{Y} = \sum_i Y_i / n, \qquad \overline{x} = \sum_i x_i / n$$

then we can write the first normal equation as

$$A = \overline{Y} - B\overline{x} \qquad\qquad (9.2.2)$$
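Putting the pieces together, the following sketch (again with hypothetical data) solves the normal Equations 9.2.1 as a $2 \times 2$ linear system and confirms both Equation 9.2.2 and agreement with NumPy's built-in least-squares fit:

```python
import numpy as np

# Hypothetical data; any paired (x_i, Y_i) sample works here.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# Normal equations (9.2.1) as a 2x2 linear system in (A, B):
#   n*A          + (sum x_i)*B   = sum Y_i
#   (sum x_i)*A  + (sum x_i^2)*B = sum x_i*Y_i
M = np.array([[n, x.sum()], [x.sum(), (x ** 2).sum()]])
rhs = np.array([Y.sum(), (x * Y).sum()])
A, B = np.linalg.solve(M, rhs)

# Equation (9.2.2): the intercept in terms of the sample means.
assert np.isclose(A, Y.mean() - B * x.mean())

# Cross-check against NumPy's least-squares polynomial fit.
B_ref, A_ref = np.polyfit(x, Y, 1)   # returns [slope, intercept]
assert np.isclose(A, A_ref) and np.isclose(B, B_ref)
print(f"A = {A:.4f}, B = {B:.4f}")
```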