Convolution and the sum of independent random variables. Assume that $X$ and $Y$ are independent random variables with density functions $f_X$ and $f_Y$, respectively. We shall determine the distribution of $X + Y$ in terms of $f_X$ and $f_Y$.


To do this we observe that

\[
\begin{aligned}
f_{X+Y}(t) &= \frac{d}{dt}\,P(X + Y \le t)\\
           &= \frac{d}{dt}\,P(Y \le t - X)\\
           &= \frac{d}{dt}\int_{-\infty}^{\infty}\int_{-\infty}^{t-x} f_X(x)\,f_Y(y)\,dy\,dx\\
           &= \int_{-\infty}^{\infty} f_X(x)\,f_Y(t - x)\,dx.
\end{aligned}
\]
The last expression above is called the convolution of the density functions.^19 We write this more simply as

\[
f_{X+Y}(t) = f_X * f_Y(t),
\]

where for any real-valued^20 functions $f$ and $g$, the convolution is defined by setting

\[
f * g(t) = \int_{-\infty}^{\infty} f(x)\,g(t - x)\,dx.
\]
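
As a quick illustration of this definition, suppose that $X$ and $Y$ are independent and each uniformly distributed on $[0,1]$, so that $f_X = f_Y = 1$ on $[0,1]$ and $0$ elsewhere. Then

\[
f_{X+Y}(t) = \int_{-\infty}^{\infty} f_X(x)\,f_Y(t - x)\,dx =
\begin{cases}
t & \text{if } 0 \le t \le 1,\\
2 - t & \text{if } 1 < t \le 2,\\
0 & \text{otherwise},
\end{cases}
\]

the familiar "triangular" density on $[0,2]$.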

From the above we can easily compute the distribution of the difference $X - Y$ of the independent random variables $X$ and $Y$. Note first that the distribution of $-Y$ is clearly the function $f_{-Y}(t) = f_Y(-t),\ t \in \mathbb{R}$. This implies that the distribution of $X - Y$ is given by

\[
f_{X-Y}(t) = f_X * f_{-Y}(t) = \int_{-\infty}^{\infty} f_X(x)\,f_{-Y}(t - x)\,dx = \int_{-\infty}^{\infty} f_X(x)\,f_Y(x - t)\,dx.
\]
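
These convolution formulas are easy to check numerically. The following is a minimal sketch, assuming NumPy is available; the choice of two independent Exp(1) random variables, the grid extent, and the step size are made here purely for illustration. For Exp(1) variables the sum has the Gamma density $t\,e^{-t}$ and the difference has the Laplace density $\tfrac{1}{2} e^{-|t|}$, so both integrals can be compared against closed forms.

import numpy as np

# Illustrative sketch: approximate the convolution integrals above by Riemann
# sums for two independent Exp(1) random variables (an arbitrary choice here).
dx = 0.001
x = np.arange(0.0, 20.0, dx)     # grid covering essentially all of the support
f = np.exp(-x)                   # Exp(1) density: f(x) = e^{-x} for x >= 0

# Sum: f_{X+Y}(t) = \int f_X(x) f_Y(t - x) dx.  On a uniform grid this Riemann
# sum is just a discrete convolution scaled by the step size.
f_sum = np.convolve(f, f) * dx   # entry k approximates the density at t = k*dx
t = np.arange(f_sum.size) * dx
print(np.abs(f_sum - t * np.exp(-t)).max())   # compare with the Gamma density t e^{-t}

# Difference: f_{X-Y}(t) = \int f_X(x) f_Y(x - t) dx, which here equals e^{-|t|}/2.
for s in (-2.0, -0.5, 0.0, 0.5, 2.0):
    approx = np.sum(f * np.where(x - s >= 0.0, np.exp(-(x - s)), 0.0)) * dx
    print(s, approx, 0.5 * np.exp(-abs(s)))

In both printed comparisons the numerical values agree with the closed forms up to an error on the order of the step size dx.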

(^19) Of course, the notion of convolution was already introduced in Exercise 5 on page 261.
(^20) Actually, there are additional hypotheses required to guarantee the existence of the convolution product.
