where I₁ is the modified first-order Bessel function of the first kind, and y_j is as defined in Equation (9.121).
As we can see, although the likelihood equations can be established, they are
complicated functions of the two parameter estimators, and we must resort to
numerical means for their solution. As we have pointed out earlier, this difficulty
is often encountered when using the method of maximum likelihood. Indeed,
Example 9.13 shows that the method of moments offers a considerable computational
advantage in this case.
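The model of Example 9.13 is not reproduced in this excerpt, so the numerical point can only be illustrated under an assumption: the sketch below uses the standard envelope (Rice) density f(x; a, σ²) = (x/σ²) exp[−(x² + a²)/(2σ²)] I₀(ax/σ²), x ≥ 0, for a signal of amplitude a in Gaussian noise of variance σ², together with the moment relations E{X²} = a² + 2σ² and E{X⁴} = a⁴ + 8a²σ² + 8σ⁴. The likelihood equations are solved numerically, with the closed-form moment estimates serving as starting values.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import i0e  # exponentially scaled I_0, numerically stable

# Assumed envelope (Rice-type) density for illustration:
#   f(x; a, s2) = (x/s2) * exp(-(x**2 + a**2)/(2*s2)) * I_0(a*x/s2),  x >= 0
def neg_log_likelihood(theta, x):
    a, s2 = theta
    if a < 0.0 or s2 <= 0.0:
        return np.inf
    z = a * x / s2
    log_i0 = np.log(i0e(z)) + z                 # log I_0(z) without overflow
    return -np.sum(np.log(x / s2) - (x**2 + a**2) / (2.0 * s2) + log_i0)

# Simulated envelope data: signal of amplitude 2.0 in unit-variance Gaussian noise
rng = np.random.default_rng(1)
n, a_true, sigma = 100, 2.0, 1.0
noise = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
x = np.abs(a_true + noise)

# Moment estimates (closed form): E{X^2} = a^2 + 2*s2, E{X^4} = a^4 + 8*a^2*s2 + 8*s2^2
m2, m4 = np.mean(x**2), np.mean(x**4)
a_mom = max(2.0 * m2**2 - m4, 0.0) ** 0.25
s2_mom = 0.5 * (m2 - a_mom**2)

# Maximum likelihood estimates: no closed form, so iterate numerically
res = minimize(neg_log_likelihood, x0=[a_mom, max(s2_mom, 1e-6)],
               args=(x,), method="Nelder-Mead")
a_mle, s2_mle = res.x
print("moments:", a_mom, s2_mom)
print("MLE:    ", a_mle, s2_mle)
```

Here the moment estimates follow from the second and fourth sample moments alone, whereas the maximum likelihood estimates require an iterative search; this is the computational advantage referred to above.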
The variances of the maximum likelihood estimators for the two parameters can be
obtained, in principle, from Equations (9.119) and (9.120). We can also show
that their variances can be larger than those associated with the moment
estimators obtained in Example 9.13 for moderate sample sizes (see Benedict
and Soong, 1967). This observation serves to remind us again that, although
maximum likelihood estimators possess optimal asymptotic properties, they
may perform poorly when the sample size is small.
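A small Monte Carlo comparison can make this caution concrete. The sketch below again assumes the envelope model used above; scipy.stats.rice parameterizes the same density through b = a/σ and scale = σ, and its generic fit method carries out the numerical maximum likelihood fit. It illustrates how such a comparison could be made rather than reproducing the analysis of Benedict and Soong (1967).

```python
import numpy as np
from scipy.stats import rice

rng = np.random.default_rng(2)
n, reps = 20, 500                       # moderate sample size, many replications
nu_true, sigma_true = 2.0, 1.0

s2_mom, s2_mle = [], []
for _ in range(reps):
    x = rice.rvs(b=nu_true / sigma_true, scale=sigma_true, size=n, random_state=rng)

    # Moment estimator of the noise variance (same moment relations as before)
    m2, m4 = np.mean(x**2), np.mean(x**4)
    nu2 = np.sqrt(max(2.0 * m2**2 - m4, 0.0))   # estimate of the squared amplitude
    s2_mom.append(0.5 * (m2 - nu2))

    # Numerical maximum likelihood fit with the location fixed at zero
    b_hat, _, scale_hat = rice.fit(x, floc=0)
    s2_mle.append(scale_hat**2)

print("spread of moment estimates of sigma^2:", np.std(s2_mom))
print("spread of ML estimates of sigma^2:    ", np.std(s2_mle))
```

Whether the maximum likelihood spread turns out smaller depends on the sample size and on the true parameter values, which is precisely the caution expressed above.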
9.3.2 Interval Estimation
We now examine another approach to the problem of parameter estimation. As
stated in the introductory text of Section 9.3, interval estimation provides,
on the basis of a sample from a population, not only information on the
parameter values to be estimated, but also an indication of the level of con-
fidence that can be placed on possible numerical values of the parameters.
Before developing the theory of interval estimation, an example will be used
to demonstrate that a method that appears to be almost intuitively obvious
could lead to conceptual difficulties.
Suppose that five sample values, 3, 2, 1.5, 0.5, and 2.1, are observed from a
normal distribution having an unknown mean m and a known variance σ² = 9.
From Example 9.15, we see that the MLE of m is the sample mean X̄, and thus

m̂ = (1/5)(3 + 2 + 1.5 + 0.5 + 2.1) = 1.82.    (9.122)

Our additional task is to determine the upper and lower limits of an interval
such that, with a specified level of confidence, the true mean m will lie in this
interval.
The maximum likelihood estimator for m is X̄ which, being a linear combination of
normal random variables, is itself normal with mean m and variance σ²/n = 9/5.
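Since X̄ is normal with a known variance, the interval sought can be previewed numerically. The short sketch below computes m̂ and Var{X̄} for these data and, as an illustration only, the interval m̂ ± z_{α/2} σ/√n at a 95% level chosen for the example, anticipating the construction developed in what follows.

```python
import numpy as np
from scipy.stats import norm

x = np.array([3.0, 2.0, 1.5, 0.5, 2.1])   # the five observed sample values
sigma2 = 9.0                              # known population variance
n = len(x)

m_hat = x.mean()                          # sample mean, the MLE of m (= 1.82)
var_xbar = sigma2 / n                     # variance of X-bar (= 9/5)

# Illustrative 95% interval: m_hat +/- z_{alpha/2} * sqrt(sigma^2 / n)
z = norm.ppf(0.975)
lower = m_hat - z * np.sqrt(var_xbar)
upper = m_hat + z * np.sqrt(var_xbar)
print(m_hat, var_xbar, (round(lower, 2), round(upper, 2)))
```

How the interval limits and the associated level of confidence are formally defined is the subject of the development that follows.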