

Using Equation 9.3.1 along with the relationship

$$A = \frac{\sum_{i=1}^{n} Y_i}{n} - B\bar{x}$$

shows that $A$ can also be expressed as a linear combination of the independent normal random variables $Y_i$, $i = 1, \ldots, n$, and is thus also normally distributed. Its mean is obtained from


$$E[A] = \frac{\sum_{i=1}^{n} E[Y_i]}{n} - \bar{x}\,E[B] = \frac{\sum_{i=1}^{n} (\alpha + \beta x_i)}{n} - \bar{x}\beta = \alpha + \beta\bar{x} - \bar{x}\beta = \alpha$$

Thus $A$ is also an unbiased estimator. The variance of $A$ is computed by first expressing $A$ as a linear combination of the $Y_i$. The result (whose details are left as an exercise) is that


$$\operatorname{Var}(A) = \frac{\sigma^2 \sum_{i=1}^{n} x_i^2}{n\left(\sum_{i=1}^{n} x_i^2 - n\bar{x}^2\right)} \qquad (9.3.3)$$
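As a quick check on these two facts, the following sketch (not from the text; the parameter values, $x$ values, and replication count are illustrative assumptions) simulates repeated samples from the model $Y_i = \alpha + \beta x_i + e_i$ and compares the empirical mean and variance of $A$ with $\alpha$ and with Equation (9.3.3).

```python
# Illustrative Monte Carlo check (not from the text): the true parameters alpha, beta,
# sigma and the x values below are assumed values chosen only for demonstration.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, sigma = 2.0, 1.5, 0.8              # assumed true regression parameters
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])    # fixed input values (illustrative)
n, x_bar = len(x), x.mean()

A_values = np.empty(20000)
for k in range(20000):
    Y = alpha + beta * x + rng.normal(0.0, sigma, size=n)   # Y_i = alpha + beta*x_i + e_i
    B = np.sum((x - x_bar) * Y) / np.sum((x - x_bar) ** 2)  # least squares slope
    A_values[k] = Y.mean() - B * x_bar                      # A = sum(Y_i)/n - B*x_bar

# Variance predicted by Equation (9.3.3)
var_formula = sigma**2 * np.sum(x**2) / (n * (np.sum(x**2) - n * x_bar**2))
print("empirical mean of A:", A_values.mean(), " (alpha =", alpha, ")")
print("empirical Var(A):   ", A_values.var(), " (formula:", var_formula, ")")
```

With a large number of replications, the empirical mean of $A$ should be close to $\alpha$ and its empirical variance close to the value given by Equation (9.3.3).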

The quantities $Y_i - A - Bx_i$, $i = 1, \ldots, n$, which represent the differences between the actual responses (that is, the $Y_i$) and their least squares estimators (that is, $A + Bx_i$), are called the residuals. The sum of squares of the residuals


$$SS_R = \sum_{i=1}^{n} (Y_i - A - Bx_i)^2$$

can be utilized to estimate the unknown error variance $\sigma^2$. Indeed, it can be shown that


$$\frac{SS_R}{\sigma^2} \sim \chi^2_{n-2}$$

That is, $SS_R/\sigma^2$ has a chi-square distribution with $n - 2$ degrees of freedom, which implies that


$$E\left[\frac{SS_R}{\sigma^2}\right] = n - 2$$
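Since $E[SS_R/\sigma^2] = n - 2$ gives $E[SS_R/(n-2)] = \sigma^2$, the residual sum of squares yields an unbiased estimator of the error variance. The short sketch below (again using assumed, illustrative parameter and $x$ values) computes $SS_R$ for one simulated data set and the corresponding estimate $SS_R/(n-2)$.

```python
# Illustrative sketch (not from the text): compute SS_R and the estimate SS_R/(n-2)
# of sigma^2 for one simulated data set; all numerical values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, sigma = 2.0, 1.5, 0.8              # assumed true regression parameters
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])    # fixed input values (illustrative)
n, x_bar = len(x), x.mean()

Y = alpha + beta * x + rng.normal(0.0, sigma, size=n)    # one simulated response vector
B = np.sum((x - x_bar) * Y) / np.sum((x - x_bar) ** 2)   # least squares slope
A = Y.mean() - B * x_bar                                 # least squares intercept

residuals = Y - A - B * x                                # Y_i - A - B*x_i
SS_R = np.sum(residuals ** 2)                            # sum of squared residuals
sigma2_hat = SS_R / (n - 2)                              # unbiased: E[SS_R] = (n-2)*sigma^2
print("SS_R =", SS_R, " estimate of sigma^2 =", sigma2_hat, " true sigma^2 =", sigma**2)
```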