Introduction to Probability and Statistics for Engineers and Scientists


232 Chapter 7: Parameter Estimation


Upon equating to zero and solving, we obtain that the maximum likelihood estimate $\hat{p}$ satisfies

$$\frac{\sum_{i=1}^{n} x_i}{\hat{p}} = \frac{n - \sum_{i=1}^{n} x_i}{1 - \hat{p}}$$

or

$$\hat{p} = \frac{\sum_{i=1}^{n} x_i}{n}$$

Hence, the maximum likelihood estimator of the unknown mean of a Bernoulli
distribution is given by

$$d(X_1, \ldots, X_n) = \frac{\sum_{i=1}^{n} X_i}{n}$$
Since $\sum_{i=1}^{n} X_i$ is the number of successful trials, we see that the maximum likelihood
estimator of $p$ is equal to the proportion of the observed trials that result in successes.
For an illustration, suppose that each RAM (random access memory) chip produced by
a certain manufacturer is, independently, of acceptable quality with probability $p$. Then
if, out of a sample of 1,000 tested, 921 are acceptable, it follows that the maximum likelihood
estimate of $p$ is .921. ■
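As a quick numerical check of the RAM-chip illustration, the following sketch computes the Bernoulli MLE as the sample proportion of successes (the function name `bernoulli_mle` is ours, not the text's):

```python
# Maximum likelihood estimate of the Bernoulli parameter p:
# simply the proportion of observed trials that are successes.

def bernoulli_mle(xs):
    """Return the MLE of p for Bernoulli data xs (a sequence of 0s and 1s)."""
    return sum(xs) / len(xs)

# RAM-chip illustration from the text: 921 acceptable chips out of 1,000 tested.
sample = [1] * 921 + [0] * 79
p_hat = bernoulli_mle(sample)
print(p_hat)  # 0.921
```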


EXAMPLE 7.2b Two proofreaders were given the same manuscript to read. If proofreader
1 found $n_1$ errors, and proofreader 2 found $n_2$ errors, with $n_{1,2}$ of these errors being found
by both proofreaders, estimate $N$, the total number of errors that are in the manuscript.


SOLUTION Before we can estimate $N$ we need to make some assumptions about the
underlying probability model. So let us assume that the results of the proofreaders are
independent, and that each error in the manuscript is independently found by proofreader
$i$ with probability $p_i$, $i = 1, 2$.
To estimate $N$, we will start by deriving an estimator of $p_1$. To do so, note that each
of the $n_2$ errors found by reader 2 will, independently, be found by proofreader 1 with
probability $p_1$. Because proofreader 1 found $n_{1,2}$ of those $n_2$ errors, a reasonable estimate
of $p_1$ is given by


$$\hat{p}_1 = \frac{n_{1,2}}{n_2}$$
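To make the estimate concrete, here is a minimal sketch computing $\hat{p}_1$ from the overlap counts; the numbers ($n_2 = 50$ errors found by reader 2, of which $n_{1,2} = 30$ were also found by reader 1) are hypothetical, not from the text:

```python
# Sketch: estimate p1, the probability that proofreader 1 finds a given
# error, as the fraction of reader 2's errors that reader 1 also found.

def estimate_p1(n12, n2):
    """Return p1_hat = n12 / n2."""
    return n12 / n2

n2, n12 = 50, 30          # hypothetical counts for illustration
p1_hat = estimate_p1(n12, n2)
print(p1_hat)  # 0.6
```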