
2.4.6. If $f(x_1, x_2) = e^{-x_1 - x_2}$, $0 < x_1 < \infty$, $0 < x_2 < \infty$, zero elsewhere, is the joint pdf of the random variables $X_1$ and $X_2$, show that $X_1$ and $X_2$ are independent and that $M(t_1, t_2) = (1 - t_1)^{-1}(1 - t_2)^{-1}$, $t_1 < 1$, $t_2 < 1$. Also show that

$$E\big(e^{t(X_1 + X_2)}\big) = (1 - t)^{-2}, \quad t < 1.$$

Accordingly, find the mean and the variance of $Y = X_1 + X_2$.
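
As a sanity check on a worked solution (an aside, not part of the exercise), the mean and variance of $Y$ can be estimated by simulation; the sketch below assumes NumPy, with an arbitrary seed and sample size.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    x1 = rng.exponential(scale=1.0, size=n)  # X1 has pdf e^{-x}, x > 0
    x2 = rng.exponential(scale=1.0, size=n)  # X2, independent of X1
    y = x1 + x2
    print(y.mean(), y.var())                 # empirical mean and variance of Y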


2.4.7. Let the random variables $X_1$ and $X_2$ have the joint pdf $f(x_1, x_2) = 1/\pi$, for $(x_1 - 1)^2 + (x_2 + 2)^2 < 1$, zero elsewhere. Find $f_1(x_1)$ and $f_2(x_2)$. Are $X_1$ and $X_2$ independent?
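
An illustrative aside, not part of the exercise: rejection sampling from the disk gives an empirical histogram of $X_1$ against which a derived marginal pdf $f_1(x_1)$ can be compared. The proposal box, seed, and sample size are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.uniform(0, 2, size=400_000)    # proposal for x1 over (0, 2)
    v = rng.uniform(-3, -1, size=400_000)  # proposal for x2 over (-3, -1)
    keep = (u - 1)**2 + (v + 2)**2 < 1     # accept points inside the disk
    hist, edges = np.histogram(u[keep], bins=40, range=(0, 2), density=True)
    # 'hist' approximates f1(x1); compare it to your closed-form answer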


2.4.8. Let $X$ and $Y$ have the joint pdf $f(x, y) = 3x$, $0 < y < x < 1$, zero elsewhere. Are $X$ and $Y$ independent? If not, find $E(X \mid y)$.
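
A numerical aside (not part of the exercise): one can sample from this joint pdf and estimate $E(X \mid y)$ empirically for a fixed $y$; the conditioning band width and seed below are arbitrary. The sampler uses the fact that the marginal pdf of $X$ is $3x^2$ on $(0, 1)$ and that, given $X = x$, $Y$ is uniform on $(0, x)$.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 500_000
    x = rng.uniform(size=n) ** (1/3)   # inverse-cdf sample: P(X <= x) = x^3
    y = x * rng.uniform(size=n)        # Y | X = x is uniform on (0, x)
    band = (y > 0.49) & (y < 0.51)     # condition on y near 0.5
    print(x[band].mean())              # empirical estimate of E(X | y = 0.5)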


2.4.9. Suppose that a man leaves for work between 8:00 a.m. and 8:30 a.m. and takes between 40 and 50 minutes to get to the office. Let $X$ denote the time of departure and let $Y$ denote the time of travel. If we assume that these random variables are independent and uniformly distributed, find the probability that he arrives at the office before 9:00 a.m.
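
A Monte Carlo sketch (an aside for checking a worked answer, not part of the exercise), measuring time in minutes after 8:00 a.m.:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000
    depart = rng.uniform(0, 30, size=n)   # X: departure, minutes after 8:00
    travel = rng.uniform(40, 50, size=n)  # Y: travel time in minutes
    print(np.mean(depart + travel < 60))  # P(he arrives before 9:00 a.m.)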


2.4.10. Let $X$ and $Y$ be random variables with the space consisting of the four points $(0,0)$, $(1,1)$, $(1,0)$, $(1,-1)$. Assign positive probabilities to these four points so that the correlation coefficient is equal to zero. Are $X$ and $Y$ independent?
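
As an aside, any candidate assignment can be checked mechanically, as in the sketch below; the equal probabilities there are a placeholder to illustrate the computation, not a claimed answer.

    points = [(0, 0), (1, 1), (1, 0), (1, -1)]
    probs = [0.25, 0.25, 0.25, 0.25]  # placeholder assignment; substitute yours

    ex = sum(p * x for (x, _), p in zip(points, probs))
    ey = sum(p * y for (_, y), p in zip(points, probs))
    exy = sum(p * x * y for (x, y), p in zip(points, probs))
    print(exy - ex * ey)  # covariance; rho = 0 exactly when this is 0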


2.4.11. Two line segments, each of length two units, are placed along the $x$-axis. The midpoint of the first is between $x = 0$ and $x = 14$, and that of the second is between $x = 6$ and $x = 20$. Assuming independence and uniform distributions for these midpoints, find the probability that the line segments overlap.
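
As an aside, each segment covers the interval within one unit of its midpoint, so the segments overlap exactly when the midpoints are within two units of each other; a simulation sketch (NumPy assumed) for checking the answer:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 1_000_000
    m1 = rng.uniform(0, 14, size=n)      # midpoint of the first segment
    m2 = rng.uniform(6, 20, size=n)      # midpoint of the second segment
    print(np.mean(np.abs(m1 - m2) < 2))  # P(the segments overlap)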


2.4.12. Cast a fair die and let $X = 0$ if 1, 2, or 3 spots appear, let $X = 1$ if 4 or 5 spots appear, and let $X = 2$ if 6 spots appear. Do this two independent times, obtaining $X_1$ and $X_2$. Calculate $P(|X_1 - X_2| = 1)$.
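
Since the joint pmf is finite, the probability can be checked by direct enumeration; a sketch using exact rational arithmetic (an aside, not a substitute for the hand computation):

    from itertools import product
    from fractions import Fraction

    # one cast: P(X=0) = 3/6, P(X=1) = 2/6, P(X=2) = 1/6
    pmf = {0: Fraction(3, 6), 1: Fraction(2, 6), 2: Fraction(1, 6)}
    prob = sum(pmf[a] * pmf[b]
               for a, b in product(pmf, repeat=2) if abs(a - b) == 1)
    print(prob)  # P(|X1 - X2| = 1), as an exact fraction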


2.4.13. For $X_1$ and $X_2$ in Example 2.4.6, show that the mgf of $Y = X_1 + X_2$ is $e^{2t}/(2 - e^t)^2$, $t < \log 2$, and then compute the mean and variance of $Y$.
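
A symbolic check of the mean and variance from the stated mgf (SymPy assumed; an aside rather than a substitute for the derivation):

    import sympy as sp

    t = sp.symbols('t')
    M = sp.exp(2*t) / (2 - sp.exp(t))**2           # mgf of Y for t < log 2
    mean = sp.diff(M, t).subs(t, 0)                # E(Y) = M'(0)
    var = sp.diff(M, t, 2).subs(t, 0) - mean**2    # Var(Y) = M''(0) - E(Y)^2
    print(sp.simplify(mean), sp.simplify(var))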


2.5 The Correlation Coefficient


Let $(X, Y)$ denote a random vector. In the last section, we discussed the concept of independence between $X$ and $Y$. What, though, if $X$ and $Y$ are dependent, and, if so, how are they related? There are many measures of dependence. In this section, we introduce a parameter $\rho$ of the joint distribution of $(X, Y)$ which measures linearity between $X$ and $Y$. Throughout this section, we assume the existence of all expectations under discussion.


Definition 2.5.1. Let $(X, Y)$ have a joint distribution. Denote the means of $X$ and $Y$ respectively by $\mu_1$ and $\mu_2$ and their respective variances by $\sigma_1^2$ and $\sigma_2^2$. The covariance of $(X, Y)$ is denoted by $\operatorname{cov}(X, Y)$ and is defined by the expectation

$$\operatorname{cov}(X, Y) = E[(X - \mu_1)(Y - \mu_2)]. \tag{2.5.1}$$
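
As an illustrative aside, for a discrete random vector the expectation in (2.5.1) is a finite weighted sum, so the covariance can be computed directly; the joint pmf below is invented purely for illustration.

    # invented joint pmf for illustration; probabilities sum to 1
    pmf = {(0, 0): 0.2, (1, 1): 0.4, (1, 2): 0.1, (2, 1): 0.3}

    mu1 = sum(p * x for (x, _), p in pmf.items())  # mean of X
    mu2 = sum(p * y for (_, y), p in pmf.items())  # mean of Y
    cov = sum(p * (x - mu1) * (y - mu2) for (x, y), p in pmf.items())
    print(cov)  # cov(X, Y) = E[(X - mu1)(Y - mu2)]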