Building and Testing a Multiple Linear Regression Model 83


Just to pick one possibility of ambiguity, the same effect is achieved by
either increasing β1 by, for example, 0.25 or by increasing β3 by 1, and so
forth. In this example, the rank deficiency would just be 1. This is also
intuitive since, generally, the rank of XᵀX indicates the number of truly
independent sources.^1
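This can be checked numerically. The sketch below uses hypothetical data in which a regressor x3 is an exact multiple of x1 (x3 = 0.25·x1, matching the β trade-off above); the variable names and NumPy setup are illustrative, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
# Exact linear dependence: adding 0.25 to the coefficient of x1
# has the same effect on the fit as adding 1 to the coefficient of x3
x3 = 0.25 * x1

# Design matrix with an intercept column: 4 columns in total
X = np.column_stack([np.ones(n), x1, x2, x3])

# XᵀX has 4 columns but only rank 3, so the rank deficiency is 1
print(np.linalg.matrix_rank(X.T @ X))
```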


Procedures for Mitigating Multicollinearity


While it is quite impossible to provide a general rule to eliminate the
problem of multicollinearity, there are some techniques that can be
employed to mitigate the problem.
Multicollinearity might be present if there appears to be a mismatch
between the sign of the correlation coefficient and the sign of the regression
coefficient of a particular independent variable. So the first place to check
is always the correlation coefficient between each independent variable and
the dependent variable.
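A minimal sketch of this check, on hypothetical data constructed so that x2 is positively correlated with y while its true partial effect is negative (the variable names and coefficients are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)          # highly correlated with x1
y = 2.0 * x1 - 1.0 * x2 + 0.5 * rng.normal(size=n)

# OLS fit with an intercept
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Simple correlation of x2 with y versus its regression coefficient
r_x2_y = np.corrcoef(x2, y)[0, 1]
print(f"corr(x2, y) = {r_x2_y:+.2f}, coefficient on x2 = {beta[2]:+.2f}")
# The correlation is positive while the coefficient is negative:
# exactly the sign mismatch that warns of multicollinearity
```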
Three other indicators of multicollinearity are:



  1. The sensitivity of regression coefficients to the inclusion of additional
    independent variables.

  2. Changes from significance to insignificance of already included
    independent variables after new ones have been added.

  3. An increase in the model’s standard error of the regression.


A consequence of the above is that the regression coefficient estimates vary
dramatically as a result of only minor changes in the data X.
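This sensitivity can be demonstrated on hypothetical data (plain NumPy least squares; the helper `ols` and all variable names are illustrative): adding a nearly collinear regressor x2 inflates the standard error of the coefficient on x1.

```python
import numpy as np

def ols(X, y):
    """OLS coefficients and their standard errors."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    n, p = X.shape
    s2 = resid @ resid / (n - p)                 # residual variance
    se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
    return b, se

rng = np.random.default_rng(7)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)       # nearly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

b1, se1 = ols(np.column_stack([np.ones(n), x1]), y)
b2, se2 = ols(np.column_stack([np.ones(n), x1, x2]), y)

# Indicators 1 and 3: the coefficient on x1 shifts and its
# standard error inflates once the collinear x2 is included
print(f"x1 alone: b = {b1[1]:.2f}, se = {se1[1]:.3f}")
print(f"with x2:  b = {b2[1]:.2f}, se = {se2[1]:.3f}")
```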
A remedy most commonly suggested is to try to single out independent
variables that are likely to cause the problems. This can be done by
excluding those independent variables so identified from the regression
model. It may be possible to include other independent variables, instead,
that provide additional information.
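A sketch of this remedy on hypothetical data (all names illustrative): once x2 is identified as a near-duplicate of x1 and excluded, the remaining coefficient is estimated stably.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = 0.98 * x1 + 0.02 * rng.normal(size=n)   # near-duplicate of x1, flagged as the culprit
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

full = np.column_stack([np.ones(n), x1, x2])
reduced = np.column_stack([np.ones(n), x1])  # x2 excluded from the model

b_full, *_ = np.linalg.lstsq(full, y, rcond=None)
b_red, *_ = np.linalg.lstsq(reduced, y, rcond=None)

print("full model:   ", np.round(b_full, 2))
# The reduced model recovers the slope on x1 close to its true value of 2
print("reduced model:", np.round(b_red, 2))
```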
In general, due to multicollinearity, the standard error of the regression
increases, rendering the t-ratios of many independent variables too small
to indicate significance despite the fact that the regression model itself is
highly significant.
To find out whether the error variance of the regression is too large, we
present a commonly employed tool. We measure multicollinearity by
computing the impact of the correlation between some independent variables
and the


(^1) One speaks of “near collinearity” when the determinant of XᵀX is very
small, so that the matrix inversion, and hence the estimation of the regression
parameters, is numerically unstable.
