In particular, for any $n$ sets of real numbers $A_1, A_2, \ldots, A_n$,
\[
P\{X_1 \in A_1, X_2 \in A_2, \ldots, X_n \in A_n\}
  = \int_{A_n}\int_{A_{n-1}}\cdots\int_{A_1} f(x_1, \ldots, x_n)\,dx_1\,dx_2\cdots dx_n
\]
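As a quick numerical illustration, the following is a minimal sketch of this formula for $n = 2$, assuming (purely for illustration, not from the text) the joint density $f(x, y) = e^{-x-y}$ for $x, y \ge 0$ and the rectangles $A_1 = [0, 1]$, $A_2 = [1, 2]$.

```python
# Illustrative sketch only: the density f(x, y) = exp(-x - y) and the sets
# A1 = [0, 1], A2 = [1, 2] are assumed here, not taken from the text.
import math
from scipy.integrate import dblquad

def f(y, x):
    # dblquad integrates func(y, x); here f(x, y) = exp(-x - y) for x, y >= 0
    return math.exp(-x - y)

# P{X1 in A1, X2 in A2}: x ranges over A1 = [0, 1], y over A2 = [1, 2]
prob, _err = dblquad(f, 0, 1, 1, 2)
print(prob)  # approximately 0.1470

# This particular f factors into exp(-x) * exp(-y), so the double integral equals
# the product of the one-dimensional probabilities, previewing the independence
# condition discussed next.
print((1 - math.exp(-1)) * (math.exp(-1) - math.exp(-2)))
```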
The concept of independence may, of course, also be defined for more than two random
variables. In general, the $n$ random variables $X_1, X_2, \ldots, X_n$ are said to be
independent if, for all sets of real numbers $A_1, A_2, \ldots, A_n$,
\[
P\{X_1 \in A_1, X_2 \in A_2, \ldots, X_n \in A_n\} = \prod_{i=1}^{n} P\{X_i \in A_i\}
\]
As before, it can be shown that this condition is equivalent to
\[
P\{X_1 \le a_1, X_2 \le a_2, \ldots, X_n \le a_n\}
  = \prod_{i=1}^{n} P\{X_i \le a_i\} \quad \text{for all } a_1, a_2, \ldots, a_n
\]
Finally, we say that an infinite collection of random variables is independent if every finite
subcollection of them is independent.
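The factorization of the joint distribution function is easy to check empirically. The following is a minimal simulation sketch, assuming (for illustration only) three independent standard normal random variables and thresholds $a_1 = 0$, $a_2 = 0.5$, $a_3 = 1$; any collection of independent random variables would behave the same way.

```python
# Illustrative sketch: the normal distribution and the thresholds are assumptions
# chosen only to demonstrate the factorization of the joint distribution function.
import numpy as np

rng = np.random.default_rng(0)
n_trials = 1_000_000
X = rng.standard_normal((n_trials, 3))   # columns are independent X1, X2, X3
a = np.array([0.0, 0.5, 1.0])            # thresholds a1, a2, a3

joint = np.mean(np.all(X <= a, axis=1))      # estimate of P{X1<=a1, X2<=a2, X3<=a3}
product = np.prod(np.mean(X <= a, axis=0))   # product of estimated P{Xi<=ai}
print(joint, product)                        # the two estimates agree closely
```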
EXAMPLE 4.3e Suppose that the successive daily changes of the price of a given stock are
assumed to be independent and identically distributed random variables with probability
mass function given by
\[
P\{\text{daily change is } i\} =
\begin{cases}
.05 & \text{if } i = -3\\
.10 & \text{if } i = -2\\
.20 & \text{if } i = -1\\
.30 & \text{if } i = 0\\
.20 & \text{if } i = 1\\
.10 & \text{if } i = 2\\
.05 & \text{if } i = 3
\end{cases}
\]
Then the probability that the stock’s price will increase successively by 1, 2, and 0 points
in the next three days is
\[
P\{X_1 = 1, X_2 = 2, X_3 = 0\} = (.20)(.10)(.30) = .006
\]
where we have let $X_i$ denote the change on the $i$th day. ■
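The same computation in code: a minimal sketch that stores the probability mass function above in a dictionary and multiplies the three daily probabilities, which is exactly what the independence assumption justifies.

```python
# The daily-change probability mass function from Example 4.3e
pmf = {-3: .05, -2: .10, -1: .20, 0: .30, 1: .20, 2: .10, 3: .05}

# Independence of the daily changes lets us multiply: P{X1=1, X2=2, X3=0}
prob = pmf[1] * pmf[2] * pmf[0]
print(prob)  # 0.006 (up to floating-point rounding)
```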