Qualitatively it represents the centroid or the average value of the PDF and is therefore often simply called the expectation value of $p$.$^1$ A PDF can in principle be expanded in the set of its moments [66]. For two PDFs to be equal, each of their moments must be equal.
A special version of the moments is the set of central moments, the $n$-th central moment defined as
$$\langle (x-\langle x\rangle)^n\rangle \equiv \int (x-\langle x\rangle)^n\, p(x)\,dx$$
The zeroth and first central moments are trivial, equal to 1 and 0, respectively. But the second central moment, known as the variance of $p$, is of particular interest. For the stochastic variable $X$, the variance is denoted $\sigma_X^2$ or $\mathrm{Var}(X)$:
$$\begin{aligned}
\sigma_X^2 = \mathrm{Var}(X) = \langle (x-\langle x\rangle)^2\rangle &= \int (x-\langle x\rangle)^2\, p(x)\,dx\\
&= \int \left(x^2 - 2x\langle x\rangle + \langle x\rangle^2\right) p(x)\,dx\\
&= \langle x^2\rangle - 2\langle x\rangle\langle x\rangle + \langle x\rangle^2\\
&= \langle x^2\rangle - \langle x\rangle^2
\end{aligned}$$
The square root of the variance, $\sigma = \sqrt{\langle (x-\langle x\rangle)^2\rangle}$, is called the standard deviation of $p$. It is clearly just the RMS (root-mean-square) value of the deviation of the PDF from its mean value, interpreted qualitatively as the "spread" of $p$ around its mean.
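The moment identities above are easy to check numerically. The following sketch verifies $\sigma^2 = \langle x^2\rangle - \langle x\rangle^2$ for an exponential PDF $p(x) = \lambda e^{-\lambda x}$, an illustrative choice not taken from the text (its exact mean and standard deviation are both $1/\lambda$):

```python
import numpy as np

def trapezoid(f, x):
    # simple trapezoidal rule: sum of (f_i + f_{i+1})/2 * dx_i
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

# Exponential PDF with an illustrative rate lam = 1.5
lam = 1.5
x = np.linspace(0.0, 50.0, 200_001)      # truncate the infinite domain
p = lam * np.exp(-lam * x)

norm  = trapezoid(p, x)                  # zeroth moment, close to 1
mean  = trapezoid(x * p, x)              # <x>, exactly 1/lam
var   = trapezoid(x**2 * p, x) - mean**2 # <x^2> - <x>^2, exactly 1/lam^2
sigma = np.sqrt(var)                     # standard deviation, 1/lam

print(norm, mean, var, sigma)
```

The truncation at $x=50$ is harmless here since $e^{-\lambda x}$ is negligible far beyond the mean.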
We will also be interested in finding the PDF of a function of a stochastic variable. Let the stochastic variable $X$ have the PDF $p_X(x)$, and let $Y=h(X)$ be a function of $X$. What we want to find is the PDF of $Y$, $p_Y(y)$. We will have to restrict ourselves to the case where $h(X)$ is invertible, so that it has to be strictly monotonic. First we construct the cumulative distribution of $Y$, considering only the case where $h$ increases:
$$P_Y(y) = \mathrm{Prob}(Y\le y) = \mathrm{Prob}(h(X)\le y) = \mathrm{Prob}(X\le h^{-1}(y)) = P_X(h^{-1}(y))$$
where $h^{-1}$ is the inverse function of $h$, meaning that if $y=h(x)$ then $x=h^{-1}(y)$. This gives the PDF of $Y$:
$$p_Y(y) = \frac{d}{dy} P_Y(y) = \frac{d}{dy} P_X(h^{-1}(y))$$
Considering in a similar manner the other case of a decreasing $h$, we arrive at
$$p_Y(y) = p_X(h^{-1}(y)) \left|\frac{d}{dy} h^{-1}(y)\right| \qquad (11.3)$$
This formula will become useful when transforming simple pseudo-random number generators into more general ones.
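As a sketch of how Eq. (11.3) is used in practice, consider the standard inverse-transform technique with an exponential target, an example chosen here for illustration. If $X$ is uniform on $(0,1)$, so $p_X(x)=1$, and $h(x) = -\ln(1-x)/\lambda$, then $h^{-1}(y) = 1 - e^{-\lambda y}$ and $|dh^{-1}/dy| = \lambda e^{-\lambda y}$, so Eq. (11.3) predicts $p_Y(y) = \lambda e^{-\lambda y}$:

```python
import numpy as np

# Inverse-transform sampling: uniform deviates in, exponential deviates out.
rng = np.random.default_rng(42)
lam = 2.0
u = rng.random(200_000)            # uniform pseudo-random numbers on [0,1)
y = -np.log(1.0 - u) / lam         # y = h(u), samples of Y

# Compare the sample histogram against the p_Y(y) predicted by Eq. (11.3)
counts, edges = np.histogram(y, bins=20, range=(0.0, 3.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = lam * np.exp(-lam * centers)
print(np.max(np.abs(counts - predicted)))   # small statistical deviation
```

With $2\cdot 10^5$ samples the histogram agrees with the predicted exponential density to within statistical noise.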
All the PDFs above have been written as functions of only one stochastic variable. Such PDFs are called univariate. A PDF may well consist of any number of variables, in which case we call it multivariate. A general multivariate expectation value is defined similarly as for the univariate case, but all stochastic variables are taken into account at once. Let $P(x_1,\dots,x_n)$ be the multivariate PDF for the set $\{X_i\}$ of $n$ stochastic variables and let $H(x_1,\dots,x_n)$ be an arbitrary function over the joint domain of all $X_i$. The expectation value of $H$ with respect to $P$ is defined as follows
$^1$ We should now formulate 11.2 in a more rigorous manner. It is mathematically more correct to speak of $h$ as a function transforming the stochastic variable $X$ to the stochastic variable $Y$, $Y=h(X)$. Let $p_X(x)$ be the known PDF of $X$, and $p_Y(y)$ be the unknown PDF of $Y$. It can then be shown [66] that the expectation value of $Y$, namely $\langle y\rangle_Y = \int y\, p_Y(y)\,dy$, must equal what we have defined as the expectation value of $h(x)$ with respect to $p_X$, namely $\langle h\rangle_X = \int h(x)\, p_X(x)\,dx$.
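The footnote's claim $\langle y\rangle_Y = \langle h\rangle_X$ can be checked numerically on a concrete example of our own choosing: let $X$ be uniform on $(0,1)$ and $h(x)=x^2$. Eq. (11.3) then gives $p_Y(y) = 1/(2\sqrt{y})$ on $(0,1)$, and both expectation values should equal $1/3$:

```python
import numpy as np

def trapezoid(f, x):
    # simple trapezoidal rule: sum of (f_i + f_{i+1})/2 * dx_i
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x)))

# <h>_X = int h(x) p_X(x) dx with p_X(x) = 1 on (0,1), h(x) = x^2
x = np.linspace(0.0, 1.0, 100_001)
h_X = trapezoid(x**2, x)

# <y>_Y = int y p_Y(y) dy with p_Y(y) = 1/(2*sqrt(y)) from Eq. (11.3)
y = np.linspace(1e-8, 1.0, 100_001)      # avoid the singularity at y = 0
p_Y = 1.0 / (2.0 * np.sqrt(y))
y_Y = trapezoid(y * p_Y, y)

print(h_X, y_Y)                          # both close to 1/3
```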