

SOLUTION From Program 5.8.1b (or Table A2), we see that


$$\chi^2_{.025,20} = 34.169, \qquad \chi^2_{.975,20} = 9.661$$

and so we can conclude, with 95 percent confidence, that


$$\theta \in \left( \frac{3480}{34.169},\; \frac{3480}{9.661} \right)$$

or, equivalently,


$$\theta \in (101.847,\; 360.211) \qquad \blacksquare$$
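The interval above can also be reproduced in a few lines of code. The sketch below simply redoes the arithmetic with the percentile values quoted from Table A2; when neither the table nor Program 5.8.1b is at hand, a library routine such as scipy.stats.chi2.ppf(0.975, 20) is one way to obtain an upper chi-square percentile (the choice of Python here is ours, not the text's).

```python
# Confidence interval for theta from the chi-square percentiles quoted above.
# The percentile values are taken from Table A2 as cited in the example;
# a routine such as scipy.stats.chi2.ppf could supply such percentiles instead.
chi2_025_20 = 34.169      # chi^2_{.025,20}: upper 2.5 percent point, 20 df
chi2_975_20 = 9.661       # chi^2_{.975,20}: lower 2.5 percent point, 20 df
statistic = 3480.0        # observed value of the statistic in this example

lower = statistic / chi2_025_20
upper = statistic / chi2_975_20
print(f"95% confidence interval for theta: ({lower:.3f}, {upper:.3f})")
# prints approximately (101.847, 360.211)
```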

*7.7 Evaluating a Point Estimator


Let X = (X_1, ..., X_n) be a sample from a population whose distribution is specified up to
an unknown parameter θ, and let d = d(X) be an estimator of θ. How are we to determine
its worth as an estimator of θ? One way is to consider the square of the difference between
d(X) and θ. However, since (d(X) − θ)^2 is a random variable, let us agree to consider
r(d, θ), the mean square error of the estimator d, which is defined by


$$r(d, \theta) = E\left[(d(X) - \theta)^2\right]$$

as an indication of the worth of d as an estimator of θ.
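When r(d, θ) cannot be computed analytically, it can be approximated by simulation: fix a value of θ, repeatedly generate samples from the population, apply the estimator, and average the squared errors. The sketch below does this for the sample mean of a normal population with mean θ and variance 1; the normal model, the sample size, and the choice θ = 4 are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_by_simulation(estimator, theta, n, draws=50_000):
    """Monte Carlo approximation of r(d, theta) = E[(d(X) - theta)^2]."""
    errors_sq = np.empty(draws)
    for i in range(draws):
        x = rng.normal(loc=theta, scale=1.0, size=n)   # one sample of size n
        errors_sq[i] = (estimator(x) - theta) ** 2
    return errors_sq.mean()

# Example: the sample mean as an estimator of theta (here theta = 4, n = 10).
# Since the sample mean is unbiased, its exact mean square error is 1/n = 0.1.
print(mse_by_simulation(np.mean, theta=4.0, n=10))     # close to 0.1
```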
It would be nice if there were a single estimator d that minimized r(d, θ) for all possible
values of θ. However, except in trivial situations, this will never be the case. For example,
consider the estimator d∗ defined by


$$d^*(X_1, \ldots, X_n) = 4$$

That is, no matter what the outcome of the sample data, the estimator d∗ chooses 4 as its
estimate of θ. While this seems like a silly estimator (since it makes no use of the data), it
is nevertheless true that when θ actually equals 4, the mean square error of this estimator is 0.
Thus, the mean square error of any estimator different from d∗ must, in most situations,
be larger than the mean square error of d∗ when θ = 4.
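To see this concretely, r(d∗, θ) = (4 − θ)^2, while for the sample mean of n observations from a normal population with mean θ and variance 1 the mean square error is 1/n for every θ. The comparison below (with n = 10, an illustrative choice not taken from the text) shows that the constant estimator has the smaller mean square error only when θ is very close to 4.

```python
import numpy as np

n = 10                                   # sample size (illustrative choice)
thetas = np.linspace(2.0, 6.0, 9)        # grid of possible true values of theta

mse_constant = (4.0 - thetas) ** 2                 # r(d*, theta) for d*(X) = 4
mse_sample_mean = np.full_like(thetas, 1.0 / n)    # r(X-bar, theta) = 1/n for all theta

for theta, r_const, r_mean in zip(thetas, mse_constant, mse_sample_mean):
    better = "d* = 4" if r_const < r_mean else "sample mean"
    print(f"theta = {theta:4.1f}:  r(d*, theta) = {r_const:5.2f}, "
          f"r(X-bar, theta) = {r_mean:4.2f}  -> smaller MSE: {better}")
```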
Although minimum mean square error estimators rarely exist, it is sometimes possible to
find an estimator having the smallest mean square error among all estimators that satisfy
a certain property. One such property is that of unbiasedness.


* Optional section.