Mathematical Methods for Physics and Engineering : A Comprehensive Guide


30.12 PROPERTIES OF JOINT DISTRIBUTIONS


More generally, we find (for a, b and c constant)

V[aX + bY + c] = a^2 V[X] + b^2 V[Y] + 2ab Cov[X, Y].    (30.136)

Note that if X and Y are in fact independent then Cov[X, Y] = 0 and we recover
the expression (30.68) in subsection 30.6.4.
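As a quick numerical check of (30.136), the following sketch compares the formula with a Monte Carlo estimate for correlated jointly normal X and Y; all the numerical values (a, b, c, the means and the covariance matrix) are illustrative choices, not from the text.

```python
import numpy as np

# Monte Carlo check of V[aX + bY + c] = a^2 V[X] + b^2 V[Y] + 2ab Cov[X, Y]
# (equation 30.136), using correlated jointly normal X and Y.
rng = np.random.default_rng(0)
a, b, c = 2.0, -3.0, 5.0
mean = [1.0, 2.0]
cov = np.array([[4.0, 1.5],    # V[X] = 4, Cov[X, Y] = 1.5
                [1.5, 9.0]])   # V[Y] = 9
X, Y = rng.multivariate_normal(mean, cov, size=1_000_000).T

empirical = np.var(a * X + b * Y + c)
predicted = a**2 * cov[0, 0] + b**2 * cov[1, 1] + 2 * a * b * cov[0, 1]
print(empirical, predicted)  # the two agree to within sampling error
```

With these numbers the predicted variance is 4(4) + 9(9) + 2(2)(-3)(1.5) = 79; note that the constant c shifts the mean but, as the formula shows, contributes nothing to the variance.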


We may use (30.136) to obtain an approximate expression for V[f(X, Y)]
for any arbitrary function f, even when the random variables X and Y are
correlated. Approximating f(X, Y) by the linear terms of its Taylor expansion
about the point (μ_X, μ_Y), we have


f(X, Y) ≈ f(μ_X, μ_Y) + (∂f/∂X)(X − μ_X) + (∂f/∂Y)(Y − μ_Y),    (30.137)

where the partial derivatives are evaluated at X = μ_X and Y = μ_Y. Taking the
variance of both sides, and using (30.136), we find


V[f(X, Y)] ≈ (∂f/∂X)^2 V[X] + (∂f/∂Y)^2 V[Y] + 2 (∂f/∂X)(∂f/∂Y) Cov[X, Y].    (30.138)

Clearly, if Cov[X, Y] = 0, we recover the result (30.69) derived in subsection 30.6.4.
We note that (30.138) is exact if f(X, Y) is linear in X and Y.
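The first-order formula (30.138) can be checked numerically for a nonlinear function; the sketch below uses f(X, Y) = XY (an illustrative choice) and compares the approximation against a Monte Carlo estimate. For this f the partial derivatives at (μ_X, μ_Y) are simply μ_Y and μ_X.

```python
import numpy as np

# First-order error propagation (equation 30.138) for f(X, Y) = X*Y,
# compared against a Monte Carlo estimate of V[XY].
rng = np.random.default_rng(1)
mu = np.array([3.0, 4.0])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
X, Y = rng.multivariate_normal(mu, cov, size=1_000_000).T

# partials of f = X*Y at (mu_X, mu_Y): df/dX = mu_Y, df/dY = mu_X
approx = mu[1]**2 * cov[0, 0] + mu[0]**2 * cov[1, 1] \
         + 2 * mu[0] * mu[1] * cov[0, 1]
mc = np.var(X * Y)
print(approx, mc)  # close, but not identical: f is nonlinear
```

Because f = XY is not linear, the two numbers differ by the higher-order terms dropped in (30.137); with the small variances chosen here the discrepancy is well under one per cent.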


For several variables X_i, i = 1, 2, ..., n, we can define the symmetric (positive
definite) covariance matrix whose elements are

V_ij = Cov[X_i, X_j],    (30.139)

and the symmetric (positive definite) correlation matrix

ρ_ij = Corr[X_i, X_j].

The diagonal elements of the covariance matrix are the variances of the variables,
whilst those of the correlation matrix are unity. For several variables, (30.138)
generalises to


V[f(X_1, X_2, ..., X_n)] ≈ Σ_i (∂f/∂X_i)^2 V[X_i] + Σ_i Σ_{j≠i} (∂f/∂X_i)(∂f/∂X_j) Cov[X_i, X_j],

where the partial derivatives are evaluated at X_i = μ_{X_i}.
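In matrix form the n-variable result reads V[f] ≈ g^T V g, where g is the gradient of f evaluated at the means and V is the covariance matrix (30.139). A minimal sketch, with an illustrative function f(x1, x2, x3) = x1 + 2 x2 x3 and an illustrative covariance matrix:

```python
import numpy as np

# n-variable error propagation as a quadratic form: V[f] ~ g^T V g,
# with g the gradient of f at the means and V the covariance matrix.
mu = np.array([1.0, 2.0, 3.0])
V = np.array([[0.10, 0.02, 0.00],
              [0.02, 0.20, 0.05],
              [0.00, 0.05, 0.15]])

# gradient of f(x1, x2, x3) = x1 + 2*x2*x3, i.e. (1, 2*x3, 2*x2), at mu
grad = np.array([1.0, 2 * mu[2], 2 * mu[1]])
var_f = grad @ V @ grad
print(var_f)
```

The quadratic form reproduces both sums in the displayed equation at once: the diagonal of V supplies the (∂f/∂X_i)^2 V[X_i] terms and the off-diagonal elements supply the covariance terms.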
