9.8 Weighted Least Squares

On taking partial derivatives with respect to A and B and setting them equal to 0, we obtain the following equations for the minimizing A and B.

∑_i w_i Y_i = A ∑_i w_i + B ∑_i w_i x_i        (9.8.1)

∑_i w_i x_i Y_i = A ∑_i w_i x_i + B ∑_i w_i x_i^2
These equations are easily solved to yield the least squares estimators.
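Since (9.8.1) is just a pair of linear equations in A and B, the solution is a small linear solve. The following is a minimal Python sketch of that computation; the data values, weights, and variable names are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical data: inputs x, responses Y, and weights w
# (in weighted least squares, w_i is typically taken proportional to 1/Var(Y_i))
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
w = np.array([1.0, 0.5, 1.0, 2.0, 1.0])

# Normal equations (9.8.1) written as a 2x2 linear system in (A, B):
#   A*sum(w)   + B*sum(w*x)   = sum(w*Y)
#   A*sum(w*x) + B*sum(w*x^2) = sum(w*x*Y)
M = np.array([[w.sum(),       (w * x).sum()],
              [(w * x).sum(), (w * x**2).sum()]])
rhs = np.array([(w * Y).sum(), (w * x * Y).sum()])

A, B = np.linalg.solve(M, rhs)
print("weighted least squares estimates: A =", A, ", B =", B)
```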


EXAMPLE 9.8a To develop a feel as to why the estimators should be obtained by minimizing the weighted sum of squares rather than the ordinary sum of squares, consider the following situation. Suppose that X_1, ..., X_n are independent normal random variables each having mean μ and variance σ^2. Suppose further that the X_i are not directly observable but rather only Y_1 and Y_2, defined by

Y_1 = X_1 + ··· + X_k,   Y_2 = X_{k+1} + ··· + X_n,   k < n

are directly observable. Based on Y_1 and Y_2, how should we estimate μ?
Whereas the best estimator of μ is clearly X̄ = ∑_{i=1}^n X_i / n = (Y_1 + Y_2)/n, let us see what the ordinary least squares estimator would be. Since

E[Y_1] = kμ,   E[Y_2] = (n − k)μ

the least squares estimator of μ would be that value of μ that minimizes


(Y_1 − kμ)^2 + (Y_2 − [n − k]μ)^2

On differentiating and setting equal to zero, we see that the least squares estimator of μ, call it μ̂, is such that

−2k(Y_1 − kμ̂) − 2(n − k)[Y_2 − (n − k)μ̂] = 0

or


[k^2 + (n − k)^2] μ̂ = kY_1 + (n − k)Y_2

or


μ̂ = [kY_1 + (n − k)Y_2] / [k^2 + (n − k)^2]
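As a quick check on this algebra, the minimization can be reproduced symbolically. Here is a minimal sketch using sympy; the tool choice and variable names are assumptions made only for this verification, not part of the text.

```python
import sympy as sp

# Symbols: observed sums Y1, Y2, sample sizes k and n, and the parameter mu
Y1, Y2, mu, k, n = sp.symbols('Y1 Y2 mu k n', real=True, positive=True)

# Ordinary (unweighted) sum of squares from Example 9.8a
S = (Y1 - k * mu)**2 + (Y2 - (n - k) * mu)**2

# Solving dS/dmu = 0 reproduces mu_hat = (k*Y1 + (n-k)*Y2) / (k^2 + (n-k)^2)
mu_hat = sp.solve(sp.diff(S, mu), mu)[0]
print(sp.simplify(mu_hat))
```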

Thus we see that while the ordinary least squares estimator is an unbiased estimator of μ, since

E[μ̂] = [k E[Y_1] + (n − k) E[Y_2]] / [k^2 + (n − k)^2]
      = [k^2 μ + (n − k)^2 μ] / [k^2 + (n − k)^2]
      = μ

it is not the best estimator X̄ = (Y_1 + Y_2)/n.
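A small simulation makes the point concrete: both estimators are unbiased, but μ̂ has a larger variance than X̄. Here is a minimal Monte Carlo sketch; the values of n, k, μ, and σ are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0      # hypothetical true mean and standard deviation
n, k = 10, 2              # Y1 sums the first k of the X's, Y2 the remaining n-k
reps = 100_000

X = rng.normal(mu, sigma, size=(reps, n))
Y1 = X[:, :k].sum(axis=1)
Y2 = X[:, k:].sum(axis=1)

ols = (k * Y1 + (n - k) * Y2) / (k**2 + (n - k)**2)   # ordinary least squares estimator
xbar = (Y1 + Y2) / n                                   # best estimator, the sample mean

print("means:     ", ols.mean(), xbar.mean())   # both close to mu
print("variances: ", ols.var(), xbar.var())     # ols variance exceeds that of xbar
```

For these values Var(X̄) = σ^2/n = 0.4, while Var(μ̂) = σ^2[k^3 + (n − k)^3] / [k^2 + (n − k)^2]^2 ≈ 0.45, so the simulated variance of μ̂ should come out visibly larger than that of X̄.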
