230 Some Elementary Statistical Inferences
Example 4.1.4 (Uniform Distribution). Let $X_1, \ldots, X_n$ be iid with the uniform $(0, \theta)$ density; i.e., $f(x) = 1/\theta$ for $0 < x < \theta$, 0 elsewhere. Because $\theta$ is in the support, differentiation is not helpful here. The likelihood function can be written as
$$
L(\theta) = \theta^{-n} I(\max\{x_i\}, \theta),
$$
where $I(a, b)$ is 1 or 0 if $a \le b$ or $a > b$, respectively. The function $L(\theta)$ is a decreasing function of $\theta$ for all $\theta \ge \max\{x_i\}$ and is 0 otherwise [sketch the graph of $L(\theta)$]. So the maximum occurs at the smallest value that $\theta$ can assume; i.e., the mle is $\hat{\theta} = \max\{X_i\}$.
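This mle can be checked numerically: for a simulated Uniform$(0, \theta)$ sample, the estimate is just the sample maximum, which is always at most $\theta$ and, for large $n$, close to it. A minimal sketch in Python (the function name `uniform_mle` and the choice $\theta = 5$ are illustrative assumptions, not from the text):

```python
import random

def uniform_mle(sample):
    """MLE of theta for a Uniform(0, theta) sample: the sample maximum."""
    return max(sample)

random.seed(0)
theta = 5.0
sample = [random.uniform(0, theta) for _ in range(1000)]
theta_hat = uniform_mle(sample)
# theta_hat never exceeds theta, and for n = 1000 it lies just below it
```

Note that $\hat{\theta} = \max\{X_i\}$ always underestimates $\theta$ slightly, which is consistent with the fact (developed later in the text) that this mle is biased but consistent.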
4.1.2 Histogram Estimates of pmfs and pdfs
Let $X_1, \ldots, X_n$ be a random sample on a random variable $X$ with cdf $F(x)$. In this section, we briefly discuss a histogram of the sample, which is an estimate of the pmf, $p(x)$, or the pdf, $f(x)$, of $X$, depending on whether $X$ is discrete or continuous. Other than $X$ being a discrete or continuous random variable, we make no assumptions on the form of the distribution of $X$. In particular, we do not assume a parametric form of the distribution as we did for the above discussion on maximum likelihood estimates; hence, the histogram that we present is often called a \emph{nonparametric estimator}. See Chapter 10 for a general discussion of nonparametric inference. We discuss the discrete situation first.
The Distribution of $X$ Is Discrete
Assume that $X$ is a discrete random variable with pmf $p(x)$. Let $X_1, \ldots, X_n$ be a random sample on $X$. First, suppose that the space of $X$ is finite, say, $\mathcal{D} = \{a_1, \ldots, a_m\}$. An intuitive estimate of $p(a_j)$ is the relative frequency of $a_j$ in the sample. We express this more formally as follows. For $j = 1, 2, \ldots, m$, define the statistics
$$
I_j(X_i) = \begin{cases} 1 & X_i = a_j \\ 0 & X_i \ne a_j. \end{cases}
$$
Then our intuitive estimate of $p(a_j)$ can be expressed by the sample average
$$
\hat{p}(a_j) = \frac{1}{n} \sum_{i=1}^{n} I_j(X_i). \tag{4.1.10}
$$
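Since (4.1.10) is simply the relative frequency of $a_j$ in the sample, it can be computed with a counter. A small sketch in Python (the helper name `pmf_estimate` and the toy sample are illustrative assumptions):

```python
from collections import Counter

def pmf_estimate(sample, support):
    """Relative-frequency estimate: p_hat(a_j) = (1/n) * sum_i I_j(X_i)."""
    n = len(sample)
    counts = Counter(sample)            # counts[a_j] = sum_i I_j(X_i)
    return {a: counts[a] / n for a in support}

sample = [1, 2, 2, 3, 1, 2, 1, 1]
p_hat = pmf_estimate(sample, support=[1, 2, 3])
# p_hat == {1: 0.5, 2: 0.375, 3: 0.125}
```

Note that the estimates sum to 1 whenever every sample value lies in the stated support, so $\hat{p}$ is itself a pmf on $\{a_1, \ldots, a_m\}$.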
These estimators $\{\hat{p}(a_1), \ldots, \hat{p}(a_m)\}$ constitute the nonparametric estimate of the pmf $p(x)$. Note that $I_j(X_i)$ has a Bernoulli distribution with probability of success $p(a_j)$. Because
$$
E[\hat{p}(a_j)] = \frac{1}{n} \sum_{i=1}^{n} E[I_j(X_i)] = \frac{1}{n} \sum_{i=1}^{n} p(a_j) = p(a_j), \tag{4.1.11}
$$
$\hat{p}(a_j)$ is an unbiased estimator of $p(a_j)$.
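The unbiasedness shown in (4.1.11) can be illustrated by Monte Carlo: the average of $\hat{p}(a_j)$ over many independent samples should land near $p(a_j)$. A sketch under an assumed true pmf (the support, probabilities, and sample sizes below are chosen only for illustration):

```python
import random

random.seed(1)
support = [0, 1, 2]
p = {0: 0.2, 1: 0.5, 2: 0.3}   # assumed true pmf for the illustration
n, reps = 50, 2000             # sample size and number of replications

def p_hat(sample, a):
    """Relative frequency of the value a in the sample, as in (4.1.10)."""
    return sum(1 for x in sample if x == a) / len(sample)

# Average p_hat(a_1) with a_1 = 1 over many samples of size n.
avg = sum(
    p_hat(random.choices(support, weights=[p[a] for a in support], k=n), 1)
    for _ in range(reps)
) / reps
# avg should be close to p(1) = 0.5
```

Each individual $\hat{p}(a_j)$ fluctuates around $p(a_j)$ with variance $p(a_j)[1 - p(a_j)]/n$ (a Bernoulli sample proportion), so averaging over replications drives the Monte Carlo mean close to the true value.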