
Solving $\partial Q/\partial\theta = 0$ for $\theta$ determines the EM step estimates. In particular, given that $\widehat{\theta}^{(m)}$ is the EM estimate on the $m$th step, the $(m+1)$st step estimate is

$$\widehat{\theta}^{(m+1)} = \frac{n_1}{n}\,\overline{x} + \frac{n_2}{n}\,\widehat{\theta}^{(m)} + \frac{n_2}{n}\,\frac{\phi\bigl(a - \widehat{\theta}^{(m)}\bigr)}{1 - \Phi\bigl(a - \widehat{\theta}^{(m)}\bigr)}, \quad (6.6.16)$$

where $n = n_1 + n_2$.
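To make the iteration concrete, here is a minimal sketch in Python of running update (6.6.16) to convergence. It assumes the setup of this example ($\sigma = 1$, with $n_2$ observations censored at $a$ and the $n_1$ uncensored values held in `x_obs`); the function name, starting value, and stopping rule are our own choices, with scipy supplying $\phi$ and $\Phi$:

```python
import numpy as np
from scipy.stats import norm

def em_censored_mean(x_obs, n2, a, theta0=0.0, tol=1e-10, max_iter=1000):
    """Iterate the EM update (6.6.16) for theta, the mean of a N(theta, 1)
    sample in which n2 observations are censored at a and the n1 values
    in x_obs were observed directly."""
    n1 = len(x_obs)
    n = n1 + n2
    xbar = np.mean(x_obs)
    theta = theta0
    for _ in range(max_iter):
        # Mills ratio phi(a - theta) / (1 - Phi(a - theta))
        r = norm.pdf(a - theta) / (1.0 - norm.cdf(a - theta))
        theta_new = (n1 / n) * xbar + (n2 / n) * theta + (n2 / n) * r
        if abs(theta_new - theta) < tol:
            break
        theta = theta_new
    return theta
```

Because each step is a fixed-point update of the single parameter $\theta$, the iteration typically converges in a handful of steps from any reasonable starting value, e.g. the observed mean `np.mean(x_obs)`.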


For our second example, consider a mixture problem involving normal distributions. Suppose $Y_1$ has a $N(\mu_1, \sigma_1^2)$ distribution and $Y_2$ has a $N(\mu_2, \sigma_2^2)$ distribution. Let $W$ be a Bernoulli random variable independent of $Y_1$ and $Y_2$ and with probability of success $\epsilon = P(W = 1)$. Suppose the random variable we observe is $X = (1 - W)Y_1 + WY_2$. In this case, the vector of parameters is given by $\boldsymbol{\theta}' = (\mu_1, \mu_2, \sigma_1, \sigma_2, \epsilon)$. As shown in Section 3.4, the pdf of the mixture random variable $X$ is

$$f(x) = (1 - \epsilon)f_1(x) + \epsilon f_2(x), \quad -\infty < x < \infty, \quad (6.6.17)$$


where $f_j(x) = \sigma_j^{-1}\phi[(x - \mu_j)/\sigma_j]$, $j = 1, 2$, and $\phi(z)$ is the pdf of a standard normal random variable. Suppose we observe a random sample $\mathbf{X}' = (X_1, X_2, \ldots, X_n)$ from this mixture distribution with pdf $f(x)$. Then the log of the likelihood function is

$$l(\boldsymbol{\theta}\,|\,\mathbf{x}) = \sum_{i=1}^{n} \log\bigl[(1 - \epsilon)f_1(x_i) + \epsilon f_2(x_i)\bigr]. \quad (6.6.18)$$
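As a concrete check, the following sketch draws a sample via $X = (1 - W)Y_1 + WY_2$ and evaluates (6.6.18). The function names and the ordering of the parameters in `theta` are our own conventions, not the text's:

```python
import numpy as np
from scipy.stats import norm

def sample_mixture(theta, n, rng=None):
    """Draw X_1, ..., X_n via X = (1 - W)Y1 + W*Y2, W ~ Bernoulli(eps)."""
    mu1, mu2, sigma1, sigma2, eps = theta
    rng = np.random.default_rng() if rng is None else rng
    w = rng.binomial(1, eps, size=n)          # component memberships
    y1 = rng.normal(mu1, sigma1, size=n)
    y2 = rng.normal(mu2, sigma2, size=n)
    return (1 - w) * y1 + w * y2

def mixture_loglik(theta, x):
    """Observed-data log likelihood (6.6.18)."""
    mu1, mu2, sigma1, sigma2, eps = theta
    f1 = norm.pdf(x, loc=mu1, scale=sigma1)
    f2 = norm.pdf(x, loc=mu2, scale=sigma2)
    return np.sum(np.log((1.0 - eps) * f1 + eps * f2))
```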

In this mixture problem, the unobserved data are the random variables that identify the distribution membership. For $i = 1, 2, \ldots, n$, define the random variables

$$W_i = \begin{cases} 0 & \text{if } X_i \text{ has pdf } f_1(x) \\ 1 & \text{if } X_i \text{ has pdf } f_2(x). \end{cases}$$

These variables, of course, constitute the random sample on the Bernoulli random variable $W$. Accordingly, assume that $W_1, W_2, \ldots, W_n$ are iid Bernoulli random variables with probability of success $\epsilon$. The complete likelihood function is


$$L^c(\boldsymbol{\theta}\,|\,\mathbf{x}, \mathbf{w}) = \prod_{W_i = 0} f_1(x_i) \prod_{W_i = 1} f_2(x_i).$$

Hence the log of the complete likelihood function is


$$l^c(\boldsymbol{\theta}\,|\,\mathbf{x}, \mathbf{w}) = \sum_{W_i = 0} \log f_1(x_i) + \sum_{W_i = 1} \log f_2(x_i)$$
$$= \sum_{i=1}^{n} \bigl[(1 - w_i)\log f_1(x_i) + w_i \log f_2(x_i)\bigr]. \quad (6.6.19)$$
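Continuing the same (assumed) conventions as the previous sketch, (6.6.19) transcribes directly:

```python
import numpy as np
from scipy.stats import norm

def complete_loglik(theta, x, w):
    """Complete-data log likelihood (6.6.19); w_i = 0 or 1 marks whether
    x_i came from f1 or f2."""
    mu1, mu2, sigma1, sigma2, _ = theta
    lf1 = norm.logpdf(x, loc=mu1, scale=sigma1)
    lf2 = norm.logpdf(x, loc=mu2, scale=sigma2)
    return np.sum((1 - w) * lf1 + w * lf2)
```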

For the E step of the algorithm, we need the conditional expectation of $W_i$ given $\mathbf{x}$ under $\boldsymbol{\theta}_0$; that is,

$$E_{\boldsymbol{\theta}_0}[W_i\,|\,\boldsymbol{\theta}_0, \mathbf{x}] = P[W_i = 1\,|\,\boldsymbol{\theta}_0, \mathbf{x}].$$
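Applying Bayes' rule to the mixture pdf (6.6.17), this conditional probability equals $\epsilon f_2(x_i)/[(1 - \epsilon)f_1(x_i) + \epsilon f_2(x_i)]$ with the parameters evaluated at $\boldsymbol{\theta}_0$. A sketch, under the same assumed conventions as above:

```python
import numpy as np
from scipy.stats import norm

def e_step(theta0, x):
    """E step: E_{theta0}[W_i | x] = P[W_i = 1 | theta0, x], which by
    Bayes' rule is eps*f2(x_i) / [(1 - eps)*f1(x_i) + eps*f2(x_i)]."""
    mu1, mu2, sigma1, sigma2, eps = theta0
    f1 = norm.pdf(x, loc=mu1, scale=sigma1)
    f2 = norm.pdf(x, loc=mu2, scale=sigma2)
    return eps * f2 / ((1.0 - eps) * f1 + eps * f2)
```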
