Introduction to Probability and Statistics for Engineers and Scientists




That is, the variance of a constant plus a random variable is equal to the variance of the
random variable. (Is this intuitive? Think about it.) Finally, setting b = 0 yields


Var(aX) = a²Var(X)

The quantity √Var(X) is called the standard deviation of X. The standard deviation
has the same units as does the mean.
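As a numerical illustration (not from the text), the following Python sketch computes the mean, variance, and standard deviation of a hypothetical fair six-sided die roll directly from the definitions:

```python
import numpy as np

# Hypothetical example: X is the outcome of a fair six-sided die
x = np.arange(1, 7)
p = np.full(6, 1 / 6)

mean = np.sum(x * p)               # E[X] = 3.5
var = np.sum((x - mean) ** 2 * p)  # Var(X) = E[(X - E[X])^2] = 35/12
std = np.sqrt(var)                 # standard deviation = sqrt(Var(X))

print(mean, var, std)  # 3.5, about 2.917, about 1.708
```

Note that the standard deviation (about 1.708) is measured in the same units as the outcomes themselves, whereas the variance is in squared units.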


REMARK


Analogous to the mean’s being the center of gravity of a distribution of mass, the variance
represents, in the terminology of mechanics, the moment of inertia.


4.7 Covariance and Variance of Sums of Random Variables


We showed in Section 4.5 that the expectation of a sum of random variables is equal to
the sum of their expectations. The corresponding result for variances is, however, not
generally valid. Consider


Var(X + X) = Var(2X)
= 2²Var(X)
= 4Var(X)
≠ Var(X) + Var(X)
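This failure of additivity is easy to check by simulation; a minimal Python sketch (the normal samples and NumPy usage are illustrative assumptions, not part of the text):

```python
import numpy as np

# Illustrative samples of an arbitrary random variable X (an assumption)
rng = np.random.default_rng(0)
x = rng.normal(size=100_000)

v = x.var()           # sample estimate of Var(X)
v2 = (x + x).var()    # Var(X + X) = Var(2X)

print(v2 / v)  # ratio is 4, not 2
```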

There is, however, an important case in which the variance of a sum of random variables
is equal to the sum of the variances; this is when the random variables are
independent. Before proving this result, let us define the concept of the covariance of
two random variables.
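To preview the independent case numerically, consider the following hypothetical simulation sketch (the distributions and sample sizes are made-up assumptions):

```python
import numpy as np

# Hypothetical independent samples: X ~ N(0, 1), Y ~ N(0, 4)
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200_000)
y = rng.normal(0.0, 2.0, 200_000)  # drawn independently of x

lhs = (x + y).var()      # sample estimate of Var(X + Y)
rhs = x.var() + y.var()  # Var(X) + Var(Y)

print(lhs, rhs)  # both close to 5
```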


Definition

The covariance of two random variables X and Y, written Cov(X, Y), is defined by


Cov(X, Y) = E[(X − μ_x)(Y − μ_y)]

where μ_x and μ_y are the means of X and Y, respectively.


A useful expression for Cov(X,Y) can be obtained by expanding the right side of the
definition. This yields


Cov(X, Y) = E[XY − μ_x Y − μ_y X + μ_x μ_y]
= E[XY] − μ_x E[Y] − μ_y E[X] + μ_x μ_y
= E[XY] − μ_x μ_y
where the last equality uses E[X] = μ_x and E[Y] = μ_y.
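The equivalence of the defining expression and this expansion can be verified numerically; a short Python sketch (the correlated sample pair below is a made-up example):

```python
import numpy as np

# Made-up correlated pair: Y = X + independent noise, so Cov(X, Y) = Var(X) = 1
rng = np.random.default_rng(2)
x = rng.normal(size=50_000)
y = x + rng.normal(size=50_000)

mx, my = x.mean(), y.mean()
cov_def = ((x - mx) * (y - my)).mean()  # E[(X - mu_x)(Y - mu_y)]
cov_short = (x * y).mean() - mx * my    # E[XY] - mu_x * mu_y

print(cov_def, cov_short)  # identical up to rounding, near 1
```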