320 CHAPTER 6 Inferential Statistics
\[
P(Y = y) = \sum_{j=1}^{\infty} P(Y = y \mid X = x_j)\, P(X = x_j). \tag{6.2}
\]
Having noted this, we now proceed:
\begin{align*}
\mu_{X+Y}
&= \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} (x_i + y_j)\, P(X = x_i \text{ and } Y = y_j) \\
&= \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} x_i\, P(X = x_i \text{ and } Y = y_j)
 + \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} y_j\, P(X = x_i \text{ and } Y = y_j) \\
&= \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} x_i\, P(X = x_i \mid Y = y_j)\, P(Y = y_j)
 + \sum_{i=1}^{\infty}\sum_{j=1}^{\infty} y_j\, P(Y = y_j \mid X = x_i)\, P(X = x_i) \\
&= \sum_{i=1}^{\infty} x_i \sum_{j=1}^{\infty} P(X = x_i \mid Y = y_j)\, P(Y = y_j)
 + \sum_{j=1}^{\infty} y_j \sum_{i=1}^{\infty} P(Y = y_j \mid X = x_i)\, P(X = x_i) \\
&= \sum_{i=1}^{\infty} x_i\, P(X = x_i) + \sum_{j=1}^{\infty} y_j\, P(Y = y_j)
 \quad \text{by (6.1) and (6.2)} \\
&= \mu_X + \mu_Y,
\end{align*}
proving that
\[
E(X + Y) = E(X) + E(Y). \tag{6.3}
\]
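Although the proof handles countably infinite supports, identity (6.3) is easy to verify numerically on a small finite joint distribution. The following sketch uses a made-up joint pmf (not one from the text) and compares $E(X+Y)$, computed from the joint distribution, against $\mu_X + \mu_Y$, computed from the marginals as in (6.1) and (6.2):

```python
# Numerical check of E(X + Y) = E(X) + E(Y) on a small joint
# distribution. The joint pmf below is a made-up illustration.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# E(X + Y), computed directly from the joint distribution
mu_sum = sum((x + y) * p for (x, y), p in joint.items())

# mu_X and mu_Y, computed from the marginal distributions
mu_x = sum(x * p for (x, y), p in joint.items())
mu_y = sum(y * p for (x, y), p in joint.items())

# The two sides agree (up to floating-point rounding)
assert abs(mu_sum - (mu_x + mu_y)) < 1e-12
```

Note that no independence assumption is needed: the pmf above is deliberately not a product of its marginals, yet the identity still holds, just as in the proof.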
Next, note that if $X$ is a random variable and if $a$ and $b$ are constants,
then it's clear that $E(aX) = aE(X)$; from this we immediately infer
(since $b$ can itself be regarded as a random variable with mean $b$) that
\[
E(aX + b) = aE(X) + b.
\]
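This last identity can be checked numerically in the same way; the pmf and the constants $a$, $b$ below are made-up illustrations:

```python
# Numerical check of E(aX + b) = a*E(X) + b on a small pmf.
# The pmf and the constants a, b are made-up illustrations.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}
a, b = 4.0, -1.0

# E(X) and E(aX + b), each computed directly from the pmf
mu_x = sum(x * p for x, p in pmf.items())
mu_axb = sum((a * x + b) * p for x, p in pmf.items())

# The two sides agree (up to floating-point rounding)
assert abs(mu_axb - (a * mu_x + b)) < 1e-12
```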