\[
x^2\,\frac{d^2}{dx^2}(x+1)^n\,(x+1)^{N-n}
= n(n-1)\,x^2\,(x+1)^{N-2}
= x^2\,\frac{n(n-1)}{N(N-1)}\,\frac{d^2}{dx^2}(x+1)^N.
\]
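As an optional sanity check (not part of the derivation in the text), this operator identity can be verified symbolically for concrete parameter values; the short sketch below assumes SymPy is available and uses the illustrative values $N = 10$ and $n = 4$.

```python
import sympy as sp

# Illustrative parameter values (chosen for the check, not fixed by the text).
N, n = 10, 4
x = sp.symbols('x')

# Left-hand side: x^2 * d^2/dx^2 (x+1)^n, multiplied by (x+1)^(N-n).
lhs = x**2 * sp.diff((x + 1)**n, x, 2) * (x + 1)**(N - n)

# Right-hand side: x^2 * n(n-1)/(N(N-1)) * d^2/dx^2 (x+1)^N.
rhs = x**2 * sp.Rational(n * (n - 1), N * (N - 1)) * sp.diff((x + 1)**N, x, 2)

# The difference expands to the zero polynomial.
assert sp.expand(lhs - rhs) == 0
```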
Next comes the hard part (especially the first equality):
\[
\begin{aligned}
\sum_{k=0}^{N}\left(\sum_{m=0}^{n} m(m-1)\binom{n}{m}\binom{N-n}{k-m}\right)x^k
&= \sum_{m=0}^{n} m(m-1)\binom{n}{m}x^m \cdot \sum_{p=0}^{N-n}\binom{N-n}{p}x^p \\
&= \left[\,x^2\,\frac{d^2}{dx^2}(x+1)^n\right](x+1)^{N-n} \\
&= x^2\,\frac{n(n-1)}{N(N-1)}\,\frac{d^2}{dx^2}(x+1)^N \\
&= \frac{n(n-1)}{N(N-1)}\sum_{k=0}^{N} k(k-1)\binom{N}{k}\,x^k.
\end{aligned}
\]
(The first equality holds because multiplying the two series on the right and collecting the coefficient of $x^k$ reproduces the inner convolution sum on the left.) Just as we did at a similar juncture when computing $E(X)$, we equate
the coefficients of $x^k$, which yields the equality
\[
\sum_{m=0}^{n} m(m-1)\binom{n}{m}\binom{N-n}{k-m}
= \frac{n(n-1)k(k-1)}{N(N-1)}\binom{N}{k}.
\]
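Readers who want a concrete confirmation of this identity can check it for every $k$ with a few lines of Python; the sketch below uses exact rational arithmetic, and the values $N = 10$, $n = 4$ are illustrative choices rather than anything fixed by the text.

```python
from math import comb
from fractions import Fraction

# Illustrative parameter values (chosen for the check, not fixed by the text).
N, n = 10, 4

for k in range(N + 1):
    # Left-hand convolution sum; m is capped at k so that comb's second
    # argument stays non-negative (terms with m > k vanish anyway).
    left = sum(m * (m - 1) * comb(n, m) * comb(N - n, k - m)
               for m in range(min(n, k) + 1))
    # Right-hand closed form, computed exactly with rational arithmetic.
    right = Fraction(n * (n - 1) * k * (k - 1), N * (N - 1)) * comb(N, k)
    assert left == right, (k, left, right)
```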
The left-hand sum of this identity separates into two sums, since $m(m-1) = m^2 - m$; solving for the first of them
gives
\[
\begin{aligned}
\sum_{m=0}^{n} m^2\binom{n}{m}\binom{N-n}{k-m}
&= \frac{n(n-1)k(k-1)}{N(N-1)}\binom{N}{k}
 + \sum_{m=0}^{n} m\binom{n}{m}\binom{N-n}{k-m} \\
&= \frac{n(n-1)k(k-1)}{N(N-1)}\binom{N}{k}
 + \frac{nk}{N}\binom{N}{k},
\end{aligned}
\]
which, after dividing through by $\binom{N}{k}$, implies that
\[
E(X^2) = \frac{n(n-1)k(k-1)}{N(N-1)} + \frac{nk}{N}.
\]
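As a final illustrative check (again not part of the text's argument), this formula for $E(X^2)$ can be compared with a direct computation from the hypergeometric probabilities $P(X = m) = \binom{n}{m}\binom{N-n}{k-m}/\binom{N}{k}$; the parameter values $N = 10$, $n = 4$, $k = 5$ below are our own choice.

```python
from math import comb
from fractions import Fraction

# Illustrative parameter values (chosen for the check, not fixed by the text).
N, n, k = 10, 4, 5

# E(X^2) computed directly from the hypergeometric pmf.
e_x2 = sum(Fraction(m * m * comb(n, m) * comb(N - n, k - m), comb(N, k))
           for m in range(min(n, k) + 1))

# The closed form just derived.
formula = Fraction(n * (n - 1) * k * (k - 1), N * (N - 1)) + Fraction(n * k, N)

assert e_x2 == formula  # both equal 14/3 for these parameter values
```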
Finally, from this we obtain the variance of the hypergeometric distribution: