combination of these, or some other function. Different techniques are available
to handle nonlinear regression problems, but the two most practical and common
ones are least squares regression and maximum likelihood regression. The maximum
likelihood function has already been discussed earlier in the chapter and therefore
will not be repeated here. It is worth noting, however, that although maximum
likelihood regression is a very capable technique, in practice it is not very
commonly used, since it is computationally more involved than the least squares
regression technique.
The basic idea behind least squares regression for nonlinear fitting is the same
as we discussed for simple linear regression. That is, one tries to minimize the
sum of the squared residuals of the dataset. The problem in this case, however, is
that, due to the nonlinear nature of the function, analytic forms for the coefficients
can not generally be found. One then resorts to numerical techniques to solve the
equations. These techniques normally solve the equations iteratively, a process that
may or may not converge to an acceptable solution. Even with this shortcoming,
the least squares method is still the most widely used technique for nonlinear as well
as linear regression.
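To make the iterative procedure concrete, the following is a minimal sketch in
Python using SciPy's curve_fit routine, which performs this kind of numerical
least squares minimization; the exponential model, the initial guess, and the
synthetic data are illustrative assumptions, not taken from the text.

    import numpy as np
    from scipy.optimize import curve_fit

    def model(x, a1, a2):
        # Illustrative nonlinear model: f(x, alpha) = a1 * exp(a2 * x).
        return a1 * np.exp(a2 * x)

    # Synthetic data, for demonstration only.
    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 2.0, 25)
    y = model(x, 2.0, 1.3) + rng.normal(0.0, 0.1, x.size)

    # curve_fit iteratively minimizes the sum of squared residuals,
    # starting from the initial guess p0; the iteration may or may not
    # converge, as noted in the text.
    alpha, cov = curve_fit(model, x, y, p0=[1.0, 1.0])
    print("fitted coefficients:", alpha)

Note that the quality of the initial guess p0 largely determines whether the
iteration converges, which reflects the shortcoming mentioned above.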
Since the exact form of the nonlinear regression equations depends on the type of
function one is trying to fit, it is not worthwhile to go into the details of any
specific function. However, to give the reader a general overview of the technique,
we will have a look at its functional form. Suppose we want to fit a nonlinear
function of the form f(x, α) to the data. Here x denotes the independent variables
and α represents the coefficients that need to be determined. For example, the
function may be a simple second order polynomial given by
f(x, \alpha) = \alpha_1 + \alpha_2 x + \alpha_3 x^2. \qquad (9.7.10)
Following the procedure described for the case of linear regression, we define the
sum of the squared residuals as
\chi^2 = \sum_i \left[ f(x_i, \alpha) - y_i \right]^2, \qquad (9.7.11)

where y_i is the data at x_i and f(x_i, α) represents the value of the function
at x_i. To minimize χ^2 we differentiate it with respect to all the α's and equate
the result to zero, that is

\frac{\partial \chi^2}{\partial \alpha} = 0 = \frac{\partial}{\partial \alpha} \left[ \sum_i \left\{ f(x_i, \alpha) - y_i \right\}^2 \right]. \qquad (9.7.12)
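Writing the derivative out for a single coefficient α_j (a step left implicit
above, sketched here via the chain rule), this condition becomes

\frac{\partial \chi^2}{\partial \alpha_j} = 2 \sum_i \left[ f(x_i, \alpha) - y_i \right] \frac{\partial f(x_i, \alpha)}{\partial \alpha_j} = 0,

which yields one simultaneous equation per coefficient; in general these equations
are nonlinear in the α's, which is why numerical solution is required.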
It is obvious that the function f(x, α) can be of any type, which in fact is the
strongest point of the least squares method. One can fit essentially any function,
provided numerical techniques can be developed to solve the resulting equations.
Example:
Derive the least squares equations for the coefficients of a second order
polynomial.
Solution:
A second order polynomial can be written as
f(x, \alpha) = \alpha_1 + \alpha_2 x + \alpha_3 x^2.
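Substituting this polynomial into Eq. (9.7.12) and differentiating with respect
to α_1, α_2, and α_3 in turn gives, as a sketch of the derivation, the three
normal equations

\frac{\partial \chi^2}{\partial \alpha_1} = 2 \sum_i \left[ \alpha_1 + \alpha_2 x_i + \alpha_3 x_i^2 - y_i \right] = 0,

\frac{\partial \chi^2}{\partial \alpha_2} = 2 \sum_i x_i \left[ \alpha_1 + \alpha_2 x_i + \alpha_3 x_i^2 - y_i \right] = 0,

\frac{\partial \chi^2}{\partial \alpha_3} = 2 \sum_i x_i^2 \left[ \alpha_1 + \alpha_2 x_i + \alpha_3 x_i^2 - y_i \right] = 0.

Note that, since the polynomial is linear in its coefficients, these three
simultaneous equations are linear in α_1, α_2, and α_3 and can therefore be
solved analytically.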