
THE REGRESSION FUNCTION


Given a probability space $(\Omega, \mathfrak{I}, P)$, consider a set of $p + 1$ random variables. Let's suppose that the random vector $[X\ Z_1 \cdots Z_p] \equiv [X\ Z]$, $Z = [Z_1 \cdots Z_p]$, has the joint multivariate probability density function

$$f(x, z_1, \dots, z_p) = f(x, z), \qquad z = [z_1 \cdots z_p]$$

Let's consider the conditional density

$$f(x \mid z_1, \dots, z_p) = f(x \mid z)$$

and the marginal density of $Z$,

$$f_Z(z) = \int f(x, z)\, dx$$

Recall from an earlier section that the joint multivariate density $f(x, z)$ factorizes as

$$f(x, z) = f(x \mid z)\, f_Z(z)$$
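
As a quick numerical illustration (not from the text), the sketch below checks this factorization for an assumed bivariate standard normal density with correlation 0.6: the marginal of $Z$ is standard normal and the conditional density of $X$ given $Z = z$ is normal with mean $\rho z$ and variance $1 - \rho^2$, so their product should reproduce the joint density.

```python
# Numerical check of the factorization f(x, z) = f(x | z) f_Z(z) for an
# assumed bivariate standard normal density with correlation rho = 0.6.
import numpy as np
from scipy.stats import multivariate_normal, norm

rho = 0.6
joint = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def marginal_z(z):
    # For standard normal margins, f_Z(z) is the standard normal density.
    return norm.pdf(z)

def conditional_x_given_z(x, z):
    # f(x | z): normal with mean rho*z and variance 1 - rho^2.
    return norm.pdf(x, loc=rho * z, scale=np.sqrt(1.0 - rho**2))

# The product f(x | z) f_Z(z) reproduces the joint density f(x, z).
for x, z in [(0.7, -1.2), (0.0, 0.0), (-1.5, 2.0)]:
    lhs = joint.pdf([x, z])                            # f(x, z)
    rhs = conditional_x_given_z(x, z) * marginal_z(z)  # f(x | z) f_Z(z)
    print(f"f(x,z) = {lhs:.6f}   f(x|z) f_Z(z) = {rhs:.6f}")
```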

Let's consider now the conditional expectation of the variable $X$ given $Z = z = [z_1 \cdots z_p]$:

$$g(z) = E[X \mid Z = z] = \int v\, f(v \mid z)\, dv$$

The function $g$, that is, the function which gives the conditional expectation of $X$ given the variables $Z$, is called the regression function. Otherwise stated, the regression function is a real function of real variables which is the locus of the expectation of the random variable $X$ given that the variables $Z$ assume the values $z$.
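
To make the definition concrete, here is a minimal sketch that evaluates $g(z) = E[X \mid Z = z]$ by numerically integrating $v\, f(v \mid z)$. The joint density $f(x, z) = x + z$ on the unit square is purely an illustrative assumption, not an example from the text; it also shows that, in general, the regression function need not be linear.

```python
# Minimal sketch: evaluate the regression function g(z) = E[X | Z = z] by
# numerical integration, for an assumed joint density f(x, z) = x + z on the
# unit square (this density integrates to 1 over [0, 1] x [0, 1]).
import numpy as np

def joint_density(x, z):
    return x + z  # illustrative joint density on [0, 1] x [0, 1]

x_grid = np.linspace(0.0, 1.0, 2001)

def marginal_z(z):
    # f_Z(z) = integral of f(x, z) over x
    return np.trapz(joint_density(x_grid, z), x_grid)

def regression_function(z):
    # g(z) = integral of v * f(v | z) dv, with f(v | z) = f(v, z) / f_Z(z)
    conditional = joint_density(x_grid, z) / marginal_z(z)
    return np.trapz(x_grid * conditional, x_grid)

# Closed form for this density: g(z) = (1/3 + z/2) / (1/2 + z), not linear in z.
for z in (0.1, 0.5, 0.9):
    print(z, regression_function(z), (1/3 + z/2) / (1/2 + z))
```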

Linear Regression
In general, the regression function depends on the joint distribution of $[X\ Z_1 \cdots Z_p]$. In financial econometrics it is important to determine which joint distributions produce a linear regression function. It can be demonstrated that if the variables are jointly normally distributed, the regression function is linear.
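
One way to see this numerically (a sketch with assumed parameters, not an example from the text) is to simulate jointly normal $(X, Z)$, estimate $E[X \mid Z = z]$ by averaging $X$ within narrow bins of $z$, and compare with the linear function $\mu_X + \rho (\sigma_X / \sigma_Z)(z - \mu_Z)$, which is the regression function of a bivariate normal.

```python
# Monte Carlo illustration (assumed parameters): for jointly normal (X, Z),
# conditional means of X estimated within bins of z line up with the linear
# regression function E[X | Z = z] = mu_x + rho * (sigma_x / sigma_z) * (z - mu_z).
import numpy as np

rng = np.random.default_rng(0)
mu_x, mu_z = 1.0, -0.5
sigma_x, sigma_z, rho = 2.0, 1.5, 0.7

cov = [[sigma_x**2, rho * sigma_x * sigma_z],
       [rho * sigma_x * sigma_z, sigma_z**2]]
xz = rng.multivariate_normal([mu_x, mu_z], cov, size=200_000)
x, z = xz[:, 0], xz[:, 1]

# Empirical conditional means within narrow bins of z versus the linear prediction.
bin_edges = np.linspace(mu_z - 2 * sigma_z, mu_z + 2 * sigma_z, 9)
for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
    mask = (z >= lo) & (z < hi)
    z_mid = 0.5 * (lo + hi)
    empirical = x[mask].mean()
    linear = mu_x + rho * (sigma_x / sigma_z) * (z_mid - mu_z)
    print(f"z ~ {z_mid:+.2f}   empirical {empirical:+.3f}   linear prediction {linear:+.3f}")
```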