
In this process, our information about the unknown distribution of $X$ or the unknown parameters of the distribution of $X$ comes from a sample on $X$. The sample observations have the same distribution as $X$, and we denote them as the random variables $X_1, X_2, \ldots, X_n$, where $n$ denotes the sample size. When the sample is actually drawn, we use lowercase letters $x_1, x_2, \ldots, x_n$ as the values, or realizations, of the sample. Often we assume that the sample observations $X_1, X_2, \ldots, X_n$ are also mutually independent, in which case we call the sample a random sample, which we now formally define:
Definition 4.1.1. If the random variables $X_1, X_2, \ldots, X_n$ are independent and identically distributed (iid), then these random variables constitute a random sample of size $n$ from the common distribution.
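To make the definition concrete, the following is a minimal sketch of drawing a random sample (assuming Python with NumPy; the common distribution, here $N(0,1)$, and the sample size $n = 10$ are illustrative choices, not the text's).

```python
import numpy as np

# A random sample of size n consists of n iid draws from a common
# distribution; here the common distribution is N(0, 1) and n = 10,
# both chosen purely for illustration.
rng = np.random.default_rng(seed=20)
n = 10
x = rng.normal(loc=0.0, scale=1.0, size=n)  # realizations x_1, ..., x_n
print(x)
```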


Often, functions of the sample are used to summarize the information in a sample. These are called statistics, which we define as:

Definition 4.1.2. Let $X_1, X_2, \ldots, X_n$ denote a sample on a random variable $X$. Let $T = T(X_1, X_2, \ldots, X_n)$ be a function of the sample. Then $T$ is called a statistic.


Once the sample is drawn, $t$ is called the realization of $T$, where $t = T(x_1, x_2, \ldots, x_n)$ and $x_1, x_2, \ldots, x_n$ is the realization of the sample.
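For instance, the sample mean $\bar{X} = n^{-1}\sum_{i=1}^{n} X_i$ is a statistic. A minimal sketch (again assuming Python with NumPy, our choice) of computing its realization from a drawn sample:

```python
import numpy as np

# The sample mean T(X_1, ..., X_n) = (1/n) * sum(X_i) is a statistic.
# Evaluated at the realized sample x_1, ..., x_n, it gives the realization t.
rng = np.random.default_rng(seed=20)
x = rng.normal(size=10)  # realized sample x_1, ..., x_10
t = x.mean()             # realization t = T(x_1, ..., x_10)
print(t)
```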

4.1.1 Point Estimators


Using the above terminology, the problem we discuss in this chapter is phrased as: Let $X_1, X_2, \ldots, X_n$ denote a random sample on a random variable $X$ with a density or mass function of the form $f(x;\theta)$ or $p(x;\theta)$, where $\theta \in \Omega$ for a specified set $\Omega$. In this situation, it makes sense to consider a statistic $T$ which is an estimator of $\theta$. More formally, $T$ is called a point estimator of $\theta$. While we call $T$ an estimator of $\theta$, we call its realization $t$ an estimate of $\theta$.
There are several properties of point estimators that we discuss in this book.
We begin with a simple one, unbiasedness.


Definition 4.1.3 (Unbiasedness). Let $X_1, X_2, \ldots, X_n$ denote a sample on a random variable $X$ with pdf $f(x;\theta)$, $\theta \in \Omega$. Let $T = T(X_1, X_2, \ldots, X_n)$ be a statistic. We say that $T$ is an unbiased estimator of $\theta$ if $E(T) = \theta$.
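A standard example: if $X$ has finite mean $\mu$ and we take $\theta = \mu$, then the sample mean $\bar{X} = n^{-1}\sum_{i=1}^{n} X_i$ is an unbiased estimator of $\mu$, since by linearity of expectation

$$E(\bar{X}) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \frac{1}{n}\, n\mu = \mu.$$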


In Chapters 6 and 7, we discuss several theories of estimation in general. The purpose of this chapter, though, is an introduction to inference, so we briefly discuss the maximum likelihood estimator (mle) and then use it to obtain point estimators for some of the examples cited above. We expand on this theory in Chapter 6. Our discussion is for the continuous case; for the discrete case, simply replace the pdf with the pmf.
In our problem, the information in the sample and the parameter $\theta$ are involved in the joint distribution of the random sample; i.e., $\prod_{i=1}^{n} f(x_i;\theta)$. We want to view this as a function of $\theta$, so we write it as

$$L(\theta) = L(\theta; x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f(x_i;\theta). \tag{4.1.1}$$
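As a concrete illustration (our example, not the text's), the sketch below evaluates the likelihood (4.1.1) over a grid of $\theta$ values for an exponential model with pdf $f(x;\theta) = \theta e^{-\theta x}$, $x > 0$. The grid maximizer previews the maximum likelihood estimator, which for this model has the closed form $\hat{\theta} = 1/\bar{x}$.

```python
import numpy as np

# Evaluate L(theta) = prod_i f(x_i; theta) with f(x; theta) = theta*exp(-theta*x).
# The exponential model, true rate theta = 2, n = 50, and the grid search
# are all illustrative choices.
rng = np.random.default_rng(seed=20)
x = rng.exponential(scale=1 / 2.0, size=50)  # scale = 1/rate, so rate = 2

def likelihood(theta, x):
    """L(theta; x_1, ..., x_n) = prod_i theta * exp(-theta * x_i)."""
    return np.prod(theta * np.exp(-theta * x))

thetas = np.linspace(0.1, 5.0, 491)
L = np.array([likelihood(t, x) for t in thetas])
print("grid maximizer of L  :", thetas[np.argmax(L)])
print("closed-form mle 1/xbar:", 1 / x.mean())
```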