Mathematical Methods for Physics and Engineering: A Comprehensive Guide

PROBABILITY


Finally we note that, by analogy with the single-variable case, the characteristic function and the cumulant generating function of a multivariate distribution are defined respectively as

C(t_1, t_2, \ldots, t_n) = M(it_1, it_2, \ldots, it_n) \quad\text{and}\quad K(t_1, t_2, \ldots, t_n) = \ln M(t_1, t_2, \ldots, t_n).
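As a concrete single-variable illustration (not taken from the text), the short sympy sketch below starts from the standard Gaussian MGF M(t) = exp(mu t + sigma^2 t^2 / 2), which is assumed here, and recovers the first two cumulants by differentiating K(t) = ln M(t); all symbol names are arbitrary.

import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(mu*t + sigma**2*t**2/2)    # MGF of a Gaussian (standard result, assumed)
C = M.subs(t, sp.I*t)                 # characteristic function, C(t) = M(it)
K = sp.log(M)                         # cumulant generating function, K(t) = ln M(t)

# The first two cumulants are the mean and the variance:
kappa1 = sp.diff(K, t).subs(t, 0)     # gives mu
kappa2 = sp.diff(K, t, 2).subs(t, 0)  # gives sigma**2
print(sp.simplify(kappa1), sp.simplify(kappa2))

Here K(t) = mu t + sigma^2 t^2 / 2, so every cumulant beyond the second vanishes, which is a quick way of seeing that a Gaussian is characterised entirely by its mean and variance.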


Suppose that the random variables X_i, i = 1, 2, \ldots, n, are described by the PDF

f(\mathbf{x}) = f(x_1, x_2, \ldots, x_n) = N \exp\left(-\tfrac{1}{2}\mathbf{x}^{\mathrm{T}} A \mathbf{x}\right),

where the column vector \mathbf{x} = (x_1\ x_2\ \cdots\ x_n)^{\mathrm{T}}, A is an n \times n symmetric matrix and N is a normalisation constant such that

\int f(\mathbf{x})\, d^n x \equiv \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n = 1.

Find the MGF of f(\mathbf{x}).

From (30.142), the MGF is given by

M(t_1, t_2, \ldots, t_n) = N \int \exp\left(-\tfrac{1}{2}\mathbf{x}^{\mathrm{T}} A \mathbf{x} + \mathbf{t}^{\mathrm{T}}\mathbf{x}\right) d^n x,   (30.144)

where the column vector \mathbf{t} = (t_1\ t_2\ \cdots\ t_n)^{\mathrm{T}}. In order to evaluate this multiple integral, we begin by noting that

\mathbf{x}^{\mathrm{T}} A \mathbf{x} - 2\mathbf{t}^{\mathrm{T}}\mathbf{x} = (\mathbf{x} - A^{-1}\mathbf{t})^{\mathrm{T}} A (\mathbf{x} - A^{-1}\mathbf{t}) - \mathbf{t}^{\mathrm{T}} A^{-1}\mathbf{t},

which is the matrix equivalent of 'completing the square'. Using this expression in (30.144) and making the substitution \mathbf{y} = \mathbf{x} - A^{-1}\mathbf{t}, we obtain

M(t_1, t_2, \ldots, t_n) = c \exp\left(\tfrac{1}{2}\mathbf{t}^{\mathrm{T}} A^{-1}\mathbf{t}\right),   (30.145)

where the constant c is given by

c = N \int \exp\left(-\tfrac{1}{2}\mathbf{y}^{\mathrm{T}} A \mathbf{y}\right) d^n y.

From the normalisation condition for N, we see that c = 1, as indeed it must be in order that M(0, 0, \ldots, 0) = 1.
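Since c = N \int \exp(-\tfrac{1}{2}\mathbf{y}^{\mathrm{T}} A \mathbf{y})\, d^n y = 1, the normalisation constant is fixed as N = (\det A)^{1/2}/(2\pi)^{n/2} for A symmetric and positive definite. As a rough numerical cross-check of (30.145), not taken from the text, the sketch below samples from the corresponding Gaussian (whose covariance matrix is A^{-1}) and compares a Monte Carlo estimate of E[exp(t^T x)] with exp(\tfrac{1}{2} t^T A^{-1} t); the particular A and t are arbitrary illustrative choices.

import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])             # symmetric, positive definite (arbitrary choice)
cov = np.linalg.inv(A)                 # the PDF N exp(-x^T A x / 2) has covariance A^{-1}
t = np.array([0.3, -0.2])              # arbitrary evaluation point

x = rng.multivariate_normal(mean=np.zeros(2), cov=cov, size=1_000_000)
mgf_mc = np.mean(np.exp(x @ t))        # Monte Carlo estimate of M(t) = E[exp(t^T x)]
mgf_exact = np.exp(0.5 * t @ cov @ t)  # closed form from (30.145)
print(mgf_mc, mgf_exact)               # the two values agree to a few decimal places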


30.14 Transformation of variables in joint distributions

Suppose the random variables X_i, i = 1, 2, \ldots, n, are described by the multivariate PDF f(x_1, x_2, \ldots, x_n). If we wish to consider random variables Y_j, j = 1, 2, \ldots, m, related to the X_i by Y_j = Y_j(X_1, X_2, \ldots, X_n), then we may calculate g(y_1, y_2, \ldots, y_m), the PDF for the Y_j, in a similar way to that in the univariate case by demanding that

|f(x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n| = |g(y_1, y_2, \ldots, y_m)\, dy_1\, dy_2 \cdots dy_m|.

From the discussion of changing the variables in multiple integrals given in chapter 6 it follows that, in the special case where n = m,

g(y_1, y_2, \ldots, y_m) = f(x_1, x_2, \ldots, x_n)\,|J|,

where J = \partial(x_1, x_2, \ldots, x_n)/\partial(y_1, y_2, \ldots, y_m) is the Jacobian of the x_i with respect to the y_j.
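A minimal numerical sketch of this result (not from the text) for the linear change of variables \mathbf{y} = B\mathbf{x}, where X is taken to be a standard bivariate normal and B an arbitrary invertible matrix: the Jacobian of the x_i with respect to the y_j then has |J| = 1/|\det B|, and the formula can be checked against the known density of Y = BX, which is normal with covariance BB^T.

import numpy as np
from scipy.stats import multivariate_normal

B = np.array([[1.0, 0.4],
              [0.0, 2.0]])                                  # arbitrary invertible matrix (assumption)
y = np.array([0.7, -1.2])                                   # arbitrary evaluation point
x = np.linalg.solve(B, y)                                   # the x mapped to this y, x = B^{-1} y

f_x = multivariate_normal(np.zeros(2), np.eye(2)).pdf(x)    # f(x) for standard bivariate normal X
J = 1.0 / abs(np.linalg.det(B))                             # |det d(x)/d(y)| for y = B x
g_formula = f_x * J                                         # g(y) = f(x)|J|

g_direct = multivariate_normal(np.zeros(2), B @ B.T).pdf(y) # density of Y = B X directly
print(g_formula, g_direct)                                  # the two values agree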