PROBABILITY
the variables X and Y. For example, if X and Y are continuous RVs then the
expectation value of Z is given by

E[Z] = ∫ z p(z) dz = ∫∫ Z(x, y) f(x, y) dx dy.    (30.65)

An analogous result exists for discrete random variables.
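As a numerical sketch of (30.65), the double integral can be approximated by a midpoint Riemann sum. The specific choices below are illustrative, not from the text: X and Y are taken independent and uniform on [0, 1], so f(x, y) = 1 on the unit square, and Z(x, y) = xy, for which the exact value is E[XY] = E[X]E[Y] = 1/4.

```python
def Z(x, y):
    return x * y

def f(x, y):
    # Joint PDF of two independent U(0, 1) variables on the unit square
    return 1.0

def expectation(Z, f, n=400):
    """Midpoint Riemann sum for the double integral of Z(x, y) f(x, y)
    over the unit square [0, 1] x [0, 1], as in (30.65)."""
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            total += Z(x, y) * f(x, y)
    return total * h * h

print(expectation(Z, f))  # close to the exact value 1/4
```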
Integrals of the form (30.65) are often difficult to evaluate. Nevertheless, we
may use (30.65) to derive an important general result concerning expectation
values. If X and Y are any two random variables and a and b are arbitrary
constants then by letting Z = aX + bY we find

E[aX + bY] = a E[X] + b E[Y].
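This linearity result can be checked by Monte Carlo sampling. The distributions and constants below are illustrative assumptions, not from the text: X ~ Exponential(1) and Y ~ U(0, 3), with a = 2 and b = −3, so the exact value is 2·1 − 3·1.5 = −2.5. Note that the result holds whether or not X and Y are independent; they are sampled independently here only for convenience.

```python
import random

random.seed(1)
a, b = 2.0, -3.0
n = 200_000

# Sample aX + bY and average: the sample mean estimates E[aX + bY]
samples = [a * random.expovariate(1.0) + b * random.uniform(0.0, 3.0)
           for _ in range(n)]
estimate = sum(samples) / n
print(estimate)  # close to a*E[X] + b*E[Y] = -2.5
```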
Furthermore, we may use this result to obtain an approximate expression for the
expectation value E[Z(X, Y)] of any arbitrary function of X and Y. Letting μ_X =
E[X] and μ_Y = E[Y], and provided Z(X, Y) can be reasonably approximated by
the linear terms of its Taylor expansion about the point (μ_X, μ_Y), we have

Z(X, Y) ≈ Z(μ_X, μ_Y) + (∂Z/∂X)(X − μ_X) + (∂Z/∂Y)(Y − μ_Y),    (30.66)

where the partial derivatives are evaluated at X = μ_X and Y = μ_Y. Taking the
expectation values of both sides, we find

E[Z(X, Y)] ≈ Z(μ_X, μ_Y) + (∂Z/∂X)(E[X] − μ_X) + (∂Z/∂Y)(E[Y] − μ_Y) = Z(μ_X, μ_Y),

which gives the approximate result E[Z(X, Y)] ≈ Z(μ_X, μ_Y).
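The quality of this approximation can be probed numerically. The choices below are illustrative assumptions, not from the text: Z(X, Y) = X/Y with X and Y independent and uniform on [1, 2], so μ_X = μ_Y = 1.5 and the approximation gives Z(μ_X, μ_Y) = 1. The exact value is E[X] E[1/Y] = 1.5 ln 2 ≈ 1.0397, so the linearisation is good to about 4% in this case.

```python
import random

random.seed(2)
n = 200_000

# Monte Carlo estimate of E[X/Y] for independent X, Y ~ U(1, 2)
estimate = sum(random.uniform(1, 2) / random.uniform(1, 2)
               for _ in range(n)) / n

mu_x = mu_y = 1.5
approx = mu_x / mu_y  # Z(mu_X, mu_Y) = 1, the linear-approximation value
print(estimate, approx)  # estimate is near 1.5*ln(2) ~ 1.04; approx is 1.0
```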
By analogy with (30.65), the variance of Z = Z(X, Y) is given by

V[Z] = ∫ (z − μ_Z)² p(z) dz = ∫∫ [Z(x, y) − μ_Z]² f(x, y) dx dy,    (30.67)

where μ_Z = E[Z]. We may use this expression to derive a second useful result. If
X and Y are two independent random variables, so that f(x, y) = g(x)h(y), and
a, b and c are constants then by setting Z = aX + bY + c in (30.67) we obtain

V[aX + bY + c] = a²V[X] + b²V[Y].    (30.68)

From (30.68) we also obtain the important special case

V[X + Y] = V[X − Y] = V[X] + V[Y].
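A Monte Carlo sketch of this special case, under illustrative assumptions not taken from the text: X ~ N(0, 2²) and Y ~ N(5, 3²) are sampled independently, so V[X − Y] should equal V[X] + V[Y] = 4 + 9 = 13. Note that the variances add even for the difference X − Y, since b² = (−1)² = 1 in (30.68).

```python
import random

random.seed(3)
n = 200_000
xs = [random.gauss(0.0, 2.0) for _ in range(n)]  # V[X] = 4
ys = [random.gauss(5.0, 3.0) for _ in range(n)]  # V[Y] = 9

def variance(data):
    """Population variance of a sample: mean squared deviation from the mean."""
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

diff = [x - y for x, y in zip(xs, ys)]
print(variance(xs), variance(ys), variance(diff))  # close to 4, 9, 13
```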
Provided X and Y are indeed independent random variables, we may obtain
an approximate expression for V[Z(X, Y)], for any arbitrary function Z(X, Y),
in a similar manner to that used in approximating E[Z(X, Y)] above. Taking the