366 CHAPTER 6 Inferential Statistics
\[
\mathrm{Var}(X) = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\,dx
\]
(though most texts gloss over this point).
The positive square root $\sigma_X$ of $\mathrm{Var}(X)$ is called the standard deviation of $X$.
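As a numerical sanity check (not from the text), the variance integral above can be approximated by a Riemann sum. The sketch below uses the standard exponential density $f_X(x) = e^{-x}$ on $[0,\infty)$, for which $\mu_X = 1$, $\mathrm{Var}(X) = 1$, and hence $\sigma_X = 1$; the improper integral is truncated at an upper limit chosen large enough that the tail is negligible.

```python
import math

# Sketch: approximate Var(X) = ∫ (x - μ_X)^2 f_X(x) dx for the
# standard exponential density f_X(x) = e^{-x}, x ≥ 0.
# True values: μ_X = 1, Var(X) = 1, σ_X = 1.
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

def integrate(g, lo, hi, n=200_000):
    """Midpoint Riemann sum of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (i + 0.5) * h) for i in range(n)) * h

# Truncate the improper integral at 50; the tail beyond that is tiny.
mu = integrate(lambda x: x * f(x), 0.0, 50.0)
var = integrate(lambda x: (x - mu) ** 2 * f(x), 0.0, 50.0)
sigma = math.sqrt(var)
print(mu, var, sigma)   # each close to 1
```

The same two-step pattern (compute $\mu_X$ first, then integrate the squared deviation against the density) works for any density whose tails decay fast enough to truncate.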
6.3.1 Some theory
If we adapt the arguments beginning on page 319 to continuous random
variables, we can give a heuristic argument that the expectation of the
sum of two continuous random variables is the sum of the expectations.
The basic idea is that there is a joint density function $f_{XY}$ which gives probabilities such as
\[
P(a \le X \le b \text{ and } c \le Y \le d) = \int_a^b \int_c^d f_{XY}(x,y)\,dy\,dx.
\]
These can be represented in terms of conditional densities in the usual way: $f_{XY}(x,y) = f_X(x|y)\,f_Y(y)$; furthermore, one has
\[
f_X(x) = \int_{-\infty}^{\infty} f_X(x|y)\,f_Y(y)\,dy.
\]
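The marginalization identity can be checked numerically on a concrete pair of densities (my choice, not the text's): take $Y \sim \mathrm{Exp}(1)$, so $f_Y(y) = e^{-y}$, and let $X \mid Y = y \sim \mathrm{Exp}(y)$, so $f_X(x|y) = y e^{-yx}$. Integrating out $y$ gives the marginal $f_X(x) = \int_0^\infty y\,e^{-y(x+1)}\,dy = 1/(x+1)^2$, which a midpoint Riemann sum reproduces.

```python
import math

# Sketch: verify f_X(x) = ∫ f_X(x|y) f_Y(y) dy for
#   Y ~ Exp(1):        f_Y(y)    = e^{-y}
#   X | Y=y ~ Exp(y):  f_X(x|y)  = y e^{-yx}
# Closed-form marginal: f_X(x) = 1 / (x + 1)^2.
def f_Y(y):
    return math.exp(-y)

def f_X_given_Y(x, y):
    return y * math.exp(-y * x)

def marginal_f_X(x, hi=60.0, n=200_000):
    """Midpoint Riemann sum of f_X(x|y) f_Y(y) over y in [0, hi]."""
    h = hi / n
    return sum(f_X_given_Y(x, (i + 0.5) * h) * f_Y((i + 0.5) * h)
               for i in range(n)) * h

for x in (0.0, 0.5, 2.0):
    print(marginal_f_X(x), 1.0 / (x + 1) ** 2)   # the two agree
```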
Accepting all of this stuff, one proceeds exactly as on pages 319–320:
\begin{align*}
\mu_{X+Y} &= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x+y)\, f_{XY}(x,y)\,dx\,dy \\
&= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x\, f_{XY}(x,y)\,dy\,dx
 + \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} y\, f_{XY}(x,y)\,dx\,dy \\
&= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x\, f_X(x|y)\, f_Y(y)\,dy\,dx
 + \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} y\, f_Y(y|x)\, f_X(x)\,dx\,dy \\
&= \int_{-\infty}^{\infty} x\, f_X(x)\,dx + \int_{-\infty}^{\infty} y\, f_Y(y)\,dy \\
&= \mu_X + \mu_Y.
\end{align*}
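Notice that the derivation never assumes $X$ and $Y$ are independent. A quick Monte Carlo illustration (my own example, not the book's) makes this concrete: with $Y$ deliberately constructed to depend on $X$, the sample mean of $X + Y$ still matches the sum of the two sample means.

```python
import random

# Sketch: E[X + Y] = E[X] + E[Y] holds even for dependent X, Y.
# Here X ~ N(0, 1) and Y = 2X + N(1, 0.5), so μ_X = 0 and μ_Y = 1.
random.seed(0)
xs, ys = [], []
for _ in range(200_000):
    x = random.gauss(0.0, 1.0)
    y = 2.0 * x + random.gauss(1.0, 0.5)   # Y depends on X
    xs.append(x)
    ys.append(y)

def mean(v):
    return sum(v) / len(v)

sums = [x + y for x, y in zip(xs, ys)]
print(mean(xs) + mean(ys))   # ≈ μ_X + μ_Y = 1
print(mean(sums))            # same value, up to rounding
```

The two printed numbers agree because summing $x_i + y_i$ and then dividing by $n$ is algebraically identical to adding the two separate sample means; the simulation simply shows that dependence between $X$ and $Y$ changes nothing.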
A similar argument, together with mathematical induction, can be
used to show that