The least-square estimates α̂ and β̂, respectively, of α and β are found by
minimizing

Q = \sum_{i=1}^{n} \hat{e}_i^2 = \sum_{i=1}^{n} \left[ y_i - (\hat{\alpha} + \hat{\beta} x_i) \right]^2 .    (11.6)
In the above, the sample-value pairs are (x_1, y_1), (x_2, y_2), ..., (x_n, y_n), and
e_i, i = 1, 2, ..., n, are called the residuals. Figure 11.1 gives a graphical presentation
of this procedure. We see that the residuals are the vertical distances
between the observed values of Y, y_i, and the least-square estimate, y = α̂ + β̂x, of the
true regression line y = α + βx.
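Although the intermediate step is not written out in this excerpt, minimizing Q in Equation (11.6) amounts to the standard calculus argument: set the partial derivatives of Q with respect to α̂ and β̂ equal to zero,

\frac{\partial Q}{\partial \hat{\alpha}} = -2 \sum_{i=1}^{n} \left[ y_i - (\hat{\alpha} + \hat{\beta} x_i) \right] = 0 , \qquad
\frac{\partial Q}{\partial \hat{\beta}} = -2 \sum_{i=1}^{n} x_i \left[ y_i - (\hat{\alpha} + \hat{\beta} x_i) \right] = 0 ,

and solve the resulting pair of normal equations simultaneously; their solution is the pair of expressions given in Theorem 11.1 below.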
The estimates α̂ and β̂ are easily found based on the least-square procedure.
The results are stated below as Theorem 11.1.
Theorem 11.1: consider the simple linear regression model defined by
Equation (11.4). Let (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) be observed sample values of Y
with associated values of x. Then the least-square estimates α̂ and β̂ of α and β are
\hat{\alpha} = \bar{y} - \hat{\beta} \bar{x} ,    (11.7)

\hat{\beta} = \left[ \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) \right] \left[ \sum_{i=1}^{n} (x_i - \bar{x})^2 \right]^{-1} .    (11.8)

Figure 11.1  The least squares method of estimation: observed points (x_i, y_i) with residuals e_i, the estimated regression line y = α̂ + β̂x, and the true regression line y = α + βx.
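As a brief numerical illustration (not part of the original text), the following Python sketch evaluates Equations (11.7) and (11.8) for a small set of hypothetical sample-value pairs; the function name least_square_estimates and the data are invented for this example.

def least_square_estimates(x, y):
    """Return (alpha_hat, beta_hat) per Equations (11.7) and (11.8)."""
    n = len(x)
    x_bar = sum(x) / n          # sample mean of the x_i
    y_bar = sum(y) / n          # sample mean of the y_i
    # Equation (11.8): beta_hat = [sum (x_i - x_bar)(y_i - y_bar)] [sum (x_i - x_bar)^2]^(-1)
    beta_hat = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
                / sum((xi - x_bar) ** 2 for xi in x))
    # Equation (11.7): alpha_hat = y_bar - beta_hat * x_bar
    alpha_hat = y_bar - beta_hat * x_bar
    return alpha_hat, beta_hat

# Hypothetical sample-value pairs (x_i, y_i), used only to exercise the formulas.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
alpha_hat, beta_hat = least_square_estimates(x, y)
residuals = [yi - (alpha_hat + beta_hat * xi) for xi, yi in zip(x, y)]
print(alpha_hat, beta_hat, sum(e ** 2 for e in residuals))   # estimates and the minimized Q

Any other choice of α̂ and β̂ applied to the same data would yield a larger value of Q.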