Robert V. Hogg, Joseph W. McKean, Allen T. Craig

228 Some Elementary Statistical Inferences

realized values are:

  359 413 25 130 90 50 50 487 102 194 55 74 97

For instance, 359 hours is the realization of the random variable X_1. The data
range from 25 to 487 hours. Assuming an exponential model, the point estimate of
θ discussed above is the arithmetic average of these data. Assuming that the data
set is stored in the R vector ophrs, this average is computed in R by

  mean(ophrs); 163.5385

Hence our point estimate of θ, the mean of X, is 163.54 hours. How close is 163.54
hours to the true θ? We provide an answer to this question in the next section.
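The text performs this calculation with R's mean(ophrs); the same computation can be sketched in Python (the variable name ophrs mirrors the R vector from the text):

```python
# Operating-hours data from the text (realizations of X_1, ..., X_13)
ophrs = [359, 413, 25, 130, 90, 50, 50, 487, 102, 194, 55, 74, 97]

# Under the exponential model, the point estimate of theta is the sample mean
theta_hat = sum(ophrs) / len(ophrs)
print(round(theta_hat, 4))  # 163.5385
```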


Example 4.1.2 (Binomial Distribution). Let X be one or zero if, respectively, the
outcome of a Bernoulli experiment is success or failure. Let θ, 0 < θ < 1, denote
the probability of success. Then by (3.1.1), the pmf of X is

  p(x; θ) = θ^x (1 − θ)^{1−x},   x = 0 or 1.

If X_1, X_2, ..., X_n is a random sample on X, then the likelihood function is

  L(θ) = ∏_{i=1}^n p(x_i; θ) = θ^{∑_{i=1}^n x_i} (1 − θ)^{n − ∑_{i=1}^n x_i},   x_i = 0 or 1.

Taking logs, we have

  l(θ) = (∑_{i=1}^n x_i) log θ + (n − ∑_{i=1}^n x_i) log(1 − θ),   x_i = 0 or 1.

The partial derivative of l(θ) is

  ∂l(θ)/∂θ = (∑_{i=1}^n x_i)/θ − (n − ∑_{i=1}^n x_i)/(1 − θ).

Setting this to 0 and solving for θ, we obtain θ̂ = n^{−1} ∑_{i=1}^n X_i = X̄; i.e., the mle
is the proportion of successes in the n trials. Because E(X̄) = θ, θ̂ is an unbiased
estimator of θ.
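Unbiasedness can be checked numerically: averaging the mle θ̂ = X̄ over many simulated Bernoulli samples should recover the true θ. A minimal sketch in Python (the values of θ, n, and the replication count are our choices for the demonstration, not from the text):

```python
import random

random.seed(1)
theta = 0.3   # true success probability (chosen for this demonstration)
n = 50        # sample size per replication
reps = 20000  # number of replications

# Average the mle (sample proportion) over many replications;
# by unbiasedness this average should be close to theta.
avg = sum(
    sum(random.random() < theta for _ in range(n)) / n
    for _ in range(reps)
) / reps
print(round(avg, 3))  # a value very close to 0.3
```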
Devore (2012) discusses a study involving ceramic hip replacements, which for
some patients can be squeaky; see also page 30 of Kloke and McKean (2014).
In this study, 28 out of 143 hip replacements squeaked. In terms of the above
discussion, we have a realization of a sample of size n = 143 from a binomial
distribution where success is a hip replacement that squeaks and failure is one that
does not squeak. Let θ denote the probability of success. Then our estimate of θ
based on this sample is θ̂ = 28/143 = 0.1958. This is straightforward to calculate
but, for later use, the R code prop.test(28,143) calculates this proportion.
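The point estimate itself is a one-line computation; a Python sketch of the calculation that R's prop.test(28,143) reports (the variable names are ours):

```python
# Squeaky hip-replacement data from the text: 28 successes in n = 143 trials
successes, n = 28, 143

# The mle of theta is the sample proportion of successes
theta_hat = successes / n
print(round(theta_hat, 4))  # 0.1958
```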


Example 4.1.3 (Normal Distribution). Let X have a N(μ, σ²) distribution with
the pdf given in expression (3.4.6). In this case, θ is the vector θ = (μ, σ). If
X_1, X_2, ..., X_n is a random sample on X, then the log of the likelihood function
simplifies to

  l(μ, σ) = −(n/2) log 2π − n log σ − (1/2) ∑_{i=1}^n ((x_i − μ)/σ)².   (4.1.4)
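As a quick numerical sketch of (4.1.4), the log-likelihood can be coded directly and checked to be largest at the sample mean and the maximum-likelihood standard deviation (the data and names below are our own illustration, not from the text):

```python
import math

def loglik(mu, sigma, xs):
    """Log-likelihood (4.1.4) of a N(mu, sigma^2) random sample xs."""
    n = len(xs)
    return (-(n / 2) * math.log(2 * math.pi) - n * math.log(sigma)
            - 0.5 * sum(((x - mu) / sigma) ** 2 for x in xs))

# A small illustrative sample
xs = [4.2, 5.1, 3.8, 6.0, 5.5]
mu_hat = sum(xs) / len(xs)
sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in xs) / len(xs))

# l(mu, sigma) peaks at (mu_hat, sigma_hat): perturbing either
# parameter can only decrease the log-likelihood.
assert loglik(mu_hat, sigma_hat, xs) >= loglik(mu_hat + 0.5, sigma_hat, xs)
assert loglik(mu_hat, sigma_hat, xs) >= loglik(mu_hat, sigma_hat + 0.5, xs)
```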
