where in the last line we have used the fact that $\sum_i (x_i - \bar{x}) = 0$. Hence, for given values of $\mu$ and $\sigma$, the sampling distribution is in fact a function only of the sample mean $\bar{x}$ and the standard deviation $s$. Thus the sampling distribution of $\bar{x}$ and $s$ must satisfy
\[
P(\bar{x}, s \mid \mu, \sigma)\, d\bar{x}\, ds = (2\pi\sigma^2)^{-N/2} \exp\left\{ -\frac{N[(\bar{x}-\mu)^2 + s^2]}{2\sigma^2} \right\} dV, \qquad (31.34)
\]
where $dV = dx_1\, dx_2 \cdots dx_N$ is an element of volume in the sample space that yields simultaneously values of $\bar{x}$ and $s$ lying within the region bounded by $[\bar{x}, \bar{x} + d\bar{x}]$ and $[s, s + ds]$. Thus our only remaining task is to express $dV$ in terms of $\bar{x}$ and $s$ and their differentials.
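The algebraic fact behind (31.34) is the identity $\sum_i (x_i-\mu)^2 = N[(\bar{x}-\mu)^2 + s^2]$, with $s^2 = N^{-1}\sum_i (x_i-\bar{x})^2$; the cross term vanishes precisely because $\sum_i (x_i-\bar{x}) = 0$. The short Python sketch below checks this identity numerically for one randomly generated sample (the values of $N$, $\mu$ and $\sigma$ are arbitrary illustrative choices, not taken from the text).
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (hypothetical) parameter choices.
N, mu, sigma = 10, 2.0, 1.5
x = rng.normal(mu, sigma, size=N)

xbar = x.mean()                      # sample mean
s2 = np.mean((x - xbar) ** 2)        # s^2 = (1/N) * sum_i (x_i - xbar)^2

lhs = np.sum((x - mu) ** 2)          # sum_i (x_i - mu)^2
rhs = N * ((xbar - mu) ** 2 + s2)    # N[(xbar - mu)^2 + s^2]

print(lhs, rhs)                      # the two agree to rounding error
assert np.isclose(lhs, rhs)
\end{verbatim}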
Let $S$ be the point in sample space representing the sample $(x_1, x_2, \ldots, x_N)$. For given values of $\bar{x}$ and $s$, we require the sample values to satisfy both the condition
\[
\sum_i x_i = N\bar{x},
\]
which defines an $(N-1)$-dimensional hyperplane in the sample space, and the condition
\[
\sum_i (x_i - \bar{x})^2 = N s^2,
\]
which defines an $(N-1)$-dimensional hypersphere. Thus $S$ is constrained to lie in the intersection of these two hypersurfaces, which is itself an $(N-2)$-dimensional hypersphere.
Now, the volume enclosed by an $(N-2)$-dimensional hypersphere of radius proportional to $s$ is proportional to $s^{N-1}$. It follows that the volume $dV$ between two concentric $(N-2)$-dimensional hyperspheres of radii $\sqrt{N}\,s$ and $\sqrt{N}\,(s+ds)$ and two $(N-1)$-dimensional hyperplanes corresponding to $\bar{x}$ and $\bar{x}+d\bar{x}$ is
\[
dV = A s^{N-2}\, ds\, d\bar{x},
\]
where $A$ is some constant. Thus, substituting this expression for $dV$ into (31.34), we find
\[
P(\bar{x}, s \mid \mu, \sigma)
= C_1 \exp\left[ -\frac{N(\bar{x}-\mu)^2}{2\sigma^2} \right]
  C_2\, s^{N-2} \exp\left( -\frac{N s^2}{2\sigma^2} \right)
= P(\bar{x} \mid \mu, \sigma)\, P(s \mid \sigma), \qquad (31.35)
\]
where $C_1$ and $C_2$ are constants. We have written $P(\bar{x}, s \mid \mu, \sigma)$ in this form to show that it separates naturally into two parts, one depending only on $\bar{x}$ and the other only on $s$. Thus, $\bar{x}$ and $s$ are \emph{independent} variables. Separate normalisations of the two factors in (31.35) require
\[
C_1 = \left( \frac{N}{2\pi\sigma^2} \right)^{1/2}
\quad\text{and}\quad
C_2 = 2 \left( \frac{N}{2\sigma^2} \right)^{(N-1)/2} \frac{1}{\Gamma\!\left(\tfrac{1}{2}(N-1)\right)},
\]
where the calculation of $C_2$ requires the use of the gamma function, discussed in the Appendix.
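As a brief sketch of that calculation, substituting $t = Ns^2/(2\sigma^2)$ in the normalisation integral of the second factor turns it into a gamma function:
\[
\int_0^\infty s^{N-2} \exp\left( -\frac{N s^2}{2\sigma^2} \right) ds
= \frac{1}{2} \left( \frac{2\sigma^2}{N} \right)^{(N-1)/2} \Gamma\!\left(\tfrac{1}{2}(N-1)\right),
\]
and requiring $C_2$ times this integral to equal unity reproduces the value of $C_2$ quoted above.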
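As an informal numerical illustration of the factorisation (31.35), the Python sketch below draws many samples of size $N$ from a Gaussian, computes $\bar{x}$ and $s$ for each, and checks two consequences: the sample correlation of $\bar{x}$ and $s$ is close to zero (as independence requires), and the mean of $s$ agrees with the value $E[s] = \sigma\sqrt{2/N}\,\Gamma(N/2)/\Gamma(\tfrac{1}{2}(N-1))$ implied by $P(s \mid \sigma)$. The parameter values and number of repetitions are arbitrary illustrative choices.
\begin{verbatim}
import numpy as np
from math import gamma, sqrt

rng = np.random.default_rng(42)

# Illustrative (hypothetical) parameter choices.
N, mu, sigma = 8, 2.0, 1.5
reps = 200_000

samples = rng.normal(mu, sigma, size=(reps, N))
xbar = samples.mean(axis=1)                                   # sample mean of each sample
s = np.sqrt(np.mean((samples - xbar[:, None]) ** 2, axis=1))  # s, with s^2 = (1/N) sum_i (x_i - xbar)^2

# Independence of xbar and s implies zero correlation between them.
print("corr(xbar, s) =", np.corrcoef(xbar, s)[0, 1])

# Mean of s implied by P(s|sigma) = C2 s^(N-2) exp(-N s^2 / (2 sigma^2)).
predicted = sigma * sqrt(2.0 / N) * gamma(N / 2) / gamma((N - 1) / 2)
print("mean s: simulated =", s.mean(), " predicted =", predicted)
\end{verbatim}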
The \emph{marginal} sampling distribution of any one of the estimators $\hat{a}_i$ is given simply by
\[
P(\hat{a}_i \mid \mathbf{a}) = \int \cdots \int P(\hat{\mathbf{a}} \mid \mathbf{a})\, d\hat{a}_1 \cdots d\hat{a}_{i-1}\, d\hat{a}_{i+1} \cdots d\hat{a}_M,
\]
and the expectation value $E[\hat{a}_i]$ and variance $V[\hat{a}_i]$ of $\hat{a}_i$ are again given by (31.14) and (31.16) respectively. By analogy with the one-dimensional case, the standard error $\sigma_{\hat{a}_i}$ on the estimator $\hat{a}_i$ is given by the positive square root of $V[\hat{a}_i]$. With