Mathematical Methods for Physics and Engineering : A Comprehensive Guide


STATISTICS


and the matrix V^{-1} is given by

\[
\bigl(V^{-1}\bigr)_{ij} = -\left.\frac{\partial^2 \ln L}{\partial a_i\,\partial a_j}\right|_{\mathbf{a}=\hat{\mathbf{a}}}.
\]

Moreover, in the limit of large N, this matrix tends to the Fisher matrix given in (31.36), i.e. V^{-1} → F. Hence ML estimators are asymptotically minimum-variance.
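This connection between the curvature of ln L at its maximum and the covariance of the estimators can be checked numerically. The sketch below (an illustration only: a Gaussian sample of known width σ with the mean μ as the single parameter, and all names and values chosen arbitrarily) compares −(∂² ln L/∂μ²)^{-1} evaluated at μ = μ̂ with the analytic variance σ²/N.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 1.0                       # known width; the mean mu is the single parameter
x = rng.normal(3.0, sigma, 500)   # illustrative sample

def logL(mu):
    # ln L for a Gaussian sample, dropping mu-independent constants
    return np.sum(-0.5 * ((x - mu) / sigma) ** 2)

mu_hat = x.mean()                 # the ML estimate of a Gaussian mean
h = 1e-4
# central second difference approximates d^2 lnL/dmu^2 at mu = mu_hat
d2 = (logL(mu_hat + h) - 2.0 * logL(mu_hat) + logL(mu_hat - h)) / h**2
var_hat = -1.0 / d2               # (V)_{11} = -(d^2 lnL/dmu^2)^{-1}
# analytic result for this model: var(mu_hat) = sigma**2 / len(x)
```

For this quadratic log-likelihood the second difference is essentially exact, so the two values agree closely; for more general models the same recipe gives the large-N error estimate.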


Comparison of the above results with those in subsection 31.3.6 shows that the large-sample limit of the likelihood function L(x; a) has the same form as the large-sample limit of the joint estimator sampling distribution P(â|a). The only difference is that P(â|a) is centred in â-space on the true values â = a, whereas L(x; a) is centred in a-space on the ML estimates a = â. From figure 31.4 and its accompanying discussion, we therefore conclude that, in the large-sample limit, the Bayesian and classical confidence limits on the parameters coincide.


31.5.7 Extended maximum-likelihood method

It is sometimes the case that the number of data items N in our sample is itself a random variable. Such experiments are typically those in which data are collected for a certain period of time during which events occur at random in some way, as opposed to those in which a prearranged number of data items are collected. In particular, let us consider the case where the sample values x_1, x_2, ..., x_N are drawn independently from some distribution P(x|a) and the sample size N is a random variable described by a Poisson distribution with mean λ, i.e. N ∼ Po(λ). The likelihood function in this case is given by


\[
L(\mathbf{x}, N; \lambda, \mathbf{a}) = \frac{\lambda^{N}}{N!}\, e^{-\lambda} \prod_{i=1}^{N} P(x_i|\mathbf{a}), \qquad (31.88)
\]

and is often called the extended likelihood function. The function L(x, N; λ, a) can be used as before to estimate parameter values or obtain confidence intervals. Two distinct cases arise in the use of the extended likelihood function, depending on whether the Poisson parameter λ is a function of the parameters a or is an independent parameter.
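As a quick numerical illustration of (31.88), the sketch below treats λ as an independent parameter and maximizes the extended log-likelihood over λ alone. The choice of P(x|a) as a unit Gaussian, the seed and the value λ_true = 15 are arbitrary illustrative assumptions; since the sample term does not involve λ, setting ∂ ln L/∂λ = N/λ − 1 = 0 gives λ̂ = N, which the numerical maximum reproduces.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import gammaln

rng = np.random.default_rng(1)
N = int(rng.poisson(15.0))       # sample size drawn from Po(lambda_true = 15)
x = rng.normal(0.0, 1.0, N)      # illustrative P(x|a): unit Gaussian, a held fixed

def neg_ln_ext_L(lam):
    # -ln L from (31.88): Poisson factor plus the usual sum of ln P(x_i|a)
    poisson_part = N * np.log(lam) - lam - gammaln(N + 1)
    sample_part = np.sum(-0.5 * x**2 - 0.5 * np.log(2.0 * np.pi))
    return -(poisson_part + sample_part)

res = minimize_scalar(neg_ln_ext_L, bounds=(1e-6, 100.0), method="bounded")
# d lnL/d lambda = N/lambda - 1 = 0 gives lambda_hat = N exactly
```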


Let us first consider the case in which λ is a function of the parameters a. From (31.88), we can write the extended log-likelihood function as

\[
\ln L = N \ln \lambda(\mathbf{a}) - \lambda(\mathbf{a}) + \sum_{i=1}^{N} \ln P(x_i|\mathbf{a}) = -\lambda(\mathbf{a}) + \sum_{i=1}^{N} \ln \bigl[\lambda(\mathbf{a}) P(x_i|\mathbf{a})\bigr],
\]

where we have ignored terms not depending on a. The ML estimates â of the parameters can then be found in the usual way, and the ML estimate of the Poisson parameter is simply λ̂ = λ(â). The errors on our estimators â will be, in general, smaller than those obtained in the usual likelihood approach, since our estimate includes information from the value of N as well as the sample values x_i.
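To make the λ = λ(a) case concrete, the sketch below assumes a hypothetical model, not one from the text: events occur at rate a over a known observation time T, so that λ(a) = aT, and each recorded value is exponentially distributed with rate a. For this toy model, setting d ln L/da = 2N/a − Σx_i − T = 0 gives the closed form â = 2N/(T + Σx_i), which the numerical maximization of the extended log-likelihood reproduces.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
T = 10.0        # hypothetical known observation time
a_true = 2.0    # hypothetical true parameter value
N = int(rng.poisson(a_true * T))       # N ~ Po(lambda(a)) with lambda(a) = a*T
x = rng.exponential(1.0 / a_true, N)   # each x_i ~ Exponential(rate a)

def neg_ln_ext_L(a):
    # -ln L = lambda(a) - sum_i ln[ lambda(a) P(x_i|a) ],  with P(x|a) = a e^{-a x}
    lam = a * T
    return lam - np.sum(np.log(lam) + np.log(a) - a * x)

res = minimize_scalar(neg_ln_ext_L, bounds=(1e-6, 50.0), method="bounded")
a_hat = res.x
# closed form for this toy model: a_hat = 2N / (T + sum(x_i))
```

Note that â uses both the sample values x_i and the observed N, consistent with the remark above that the extended likelihood yields smaller errors than the ordinary likelihood.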
