Robert V. Hogg, Joseph W. McKean, Allen T. Craig

9.6. A Regression Problem

[Figure 9.6.1: The plot shows the least squares fitted line (solid line) to a set of data. The dashed-line segment from (x_i, ŷ_i) to (x_i, y_i) shows the deviation of (x_i, y_i) from its fit.]


To maximize L(α, β, σ²), or, equivalently, to minimize
\[
-\log L(\alpha, \beta, \sigma^2) = \frac{n}{2}\log(2\pi\sigma^2) + \frac{\sum_{i=1}^{n}[y_i - \alpha - \beta(x_i - \bar{x})]^2}{2\sigma^2},
\]
we must select α and β to minimize
\[
H(\alpha, \beta) = \sum_{i=1}^{n}[y_i - \alpha - \beta(x_i - \bar{x})]^2.
\]

Since |y_i − α − β(x_i − x̄)| = |y_i − μ(x_i)| is the vertical distance from the point (x_i, y_i) to the line y = μ(x) (see the dashed-line segment in Figure 9.6.1), we note that H(α, β) represents the sum of the squares of those distances. Thus, selecting α and β so that the sum of the squares is minimized means that we are fitting the straight line to the data by the method of least squares (LS).
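The least squares idea can be sketched numerically: compute H(α, β) for the fitted line and check that any other line gives a larger sum of squared vertical distances. This is a minimal sketch, assuming a small hypothetical data set (x_i, y_i); `np.polyfit` is used only because it minimizes this same sum of squares.

```python
import numpy as np

# Hypothetical data set (x_i, y_i); any small sample illustrates the idea.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
xbar = x.mean()

def H(alpha, beta):
    # Sum of squared vertical distances from (x_i, y_i)
    # to the line y = alpha + beta * (x - xbar).
    return np.sum((y - alpha - beta * (x - xbar)) ** 2)

# Fit y on (x - xbar); np.polyfit minimizes exactly this sum of squares.
beta_hat, alpha_hat = np.polyfit(x - xbar, y, deg=1)

# Any perturbation of (alpha_hat, beta_hat) increases H.
for da, db in [(0.5, 0.0), (0.0, 0.5), (-0.3, 0.2)]:
    assert H(alpha_hat, beta_hat) < H(alpha_hat + da, beta_hat + db)
```

Centering x at x̄ before fitting matches the text's parameterization y = α + β(x − x̄), so the intercept returned by the fit is the α of the model above, not the usual intercept at x = 0.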
To minimize H(α, β), we find the two first partial derivatives,
\[
\frac{\partial H(\alpha, \beta)}{\partial \alpha} = 2\sum_{i=1}^{n}[y_i - \alpha - \beta(x_i - \bar{x})](-1)
\]
and
\[
\frac{\partial H(\alpha, \beta)}{\partial \beta} = 2\sum_{i=1}^{n}[y_i - \alpha - \beta(x_i - \bar{x})][-(x_i - \bar{x})].
\]
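Setting both partial derivatives to zero and using the fact that Σ(x_i − x̄) = 0 yields the familiar closed forms α̂ = ȳ and β̂ = Σ(x_i − x̄)y_i / Σ(x_i − x̄)². A minimal sketch, on hypothetical data, verifying that both partials vanish at these values:

```python
import numpy as np

# Hypothetical data (x_i, y_i).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
xbar = x.mean()

def dH_dalpha(alpha, beta):
    # Partial derivative of H with respect to alpha.
    return 2 * np.sum((y - alpha - beta * (x - xbar)) * (-1.0))

def dH_dbeta(alpha, beta):
    # Partial derivative of H with respect to beta.
    return 2 * np.sum((y - alpha - beta * (x - xbar)) * (-(x - xbar)))

# Closed-form solutions of the normal equations:
alpha_hat = y.mean()  # since sum(x_i - xbar) = 0
beta_hat = np.sum((x - xbar) * y) / np.sum((x - xbar) ** 2)

# Both first partials vanish at (alpha_hat, beta_hat), up to rounding.
assert abs(dH_dalpha(alpha_hat, beta_hat)) < 1e-9
assert abs(dH_dbeta(alpha_hat, beta_hat)) < 1e-9
```

The centered parameterization is what makes α̂ come out as the plain sample mean ȳ: the cross term α·Σ(x_i − x̄) drops out of the second normal equation, so the two equations decouple.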