The Essentials of Biostatistics for Physicians, Nurses, and Clinicians

good resolution to the problem. Also, the Bush campaign said that for
every county that Gore contested, Bush could find counties that he
would contest. Perhaps the Supreme Court's decision was correct: if
there is no good way to correct a mistake, you should stay with the
results you have. It was the right decision, but the Court's reasoning
was wrong.

7.5 MULTIPLE REGRESSION


The differences between simple linear regression and multiple linear
regression are as follows (the code sketch after this list illustrates
items 2-4):


  1. One independent predictor variable is replaced by two or more
    independent predictor variables.

  2. The bivariate correlation squared is replaced by the multiple
    correlation coefficient R².

  3. The correlation coefficient is replaced by the correlation
    matrix.

  4. Partial correlations can be defined in multiple regression.
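
Here is a minimal sketch in Python, using NumPy on wholly hypothetical
data (the predictors x1 and x2, the response y, and every numeric value
below are assumptions chosen for illustration), of the multiple
correlation coefficient R², the correlation matrix, and a partial
correlation:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    x1 = rng.normal(size=n)                        # first predictor
    x2 = rng.normal(size=n)                        # second predictor
    y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(scale=0.5, size=n)

    X = np.column_stack([np.ones(n), x1, x2])      # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # least squares fit

    fitted = X @ beta
    ss_res = np.sum((y - fitted) ** 2)             # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)           # total sum of squares
    r_squared = 1.0 - ss_res / ss_tot              # multiple correlation R^2

    # Correlation matrix of the response and both predictors
    corr_matrix = np.corrcoef(np.column_stack([y, x1, x2]), rowvar=False)

    # Partial correlation of y and x1 controlling for x2: correlate the
    # residuals from regressing each of y and x1 on x2
    Z = np.column_stack([np.ones(n), x2])
    res_y = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    res_x1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
    partial_r = np.corrcoef(res_y, res_x1)[0, 1]

    print("coefficient estimates:", beta)
    print("R^2:", r_squared)
    print("correlation matrix:\n", corr_matrix)
    print("partial correlation (y, x1 | x2):", partial_r)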


Recall from the section on simple linear regression that the form of
multiple regression referred to in this section is linear regression,
meaning an equation that is linear in the parameters but not
necessarily in the independent variables. For example,
Y = β0 + β1X + β2X² is a linear regression model because it is linear
in the parameters β0, β1, and β2, even though it is quadratic in X.
Multiple nonlinear regression is not a topic for this text, but
Gallant (1987) is an excellent text that concentrates on nonlinear
regression (both simple and multiple).
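
A minimal sketch, again on assumed simulated data, of why such a model
is still linear regression: the design matrix simply gains a column of
squared values, and ordinary least squares applies unchanged:

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(-2.0, 2.0, size=40)
    y = 1.0 + 0.5 * x + 2.0 * x**2 + rng.normal(scale=0.3, size=40)

    # Quadratic in x but linear in the parameters (b0, b1, b2)
    X = np.column_stack([np.ones_like(x), x, x**2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("estimated b0, b1, b2:", b)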
The multiple regression equation has the form
Y = β0 + β1X1 + β2X2 + . . . + βkXk. Not written in this equation is
the additive independent error term, denoted by ε. This error term has
mean 0 and a variance σ² that is constant (does not change as the
independent variables change). Under these assumptions, the least
squares estimates of the regression parameters are minimum variance
unbiased estimators. Also, if ε has a normal distribution, the
parameter estimates are maximum likelihood estimates. This property
also holds for simple linear regression. The property that the least
squares estimates have minimum variance among unbiased estimates is
called the Gauss–Markov theorem. A proof can
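
A minimal sketch of the least squares solution via the normal
equations, (X'X)b = X'y, on assumed simulated data; under the error
assumptions above, these estimates are the minimum variance unbiased
estimators that the Gauss–Markov theorem describes:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 100
    X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
    beta_true = np.array([1.0, 2.0, -0.5])             # hypothetical parameters
    y = X @ beta_true + rng.normal(scale=1.0, size=n)  # ε: mean 0, constant σ²

    # Solve the normal equations (X'X) beta_hat = X'y
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
    print("true parameters:  ", beta_true)
    print("least squares fit:", beta_hat)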
