f5 = a[5] * np.sin(x)
f4 = a[4] * y ** 2
f3 = a[3] * x ** 2
f2 = a[2] * y
f1 = a[1] * x
f0 = a[0] * 1
return (f6 + f5 + f4 + f3 +
        f2 + f1 + f0)
These values can then be compared with the original shape of the example function, as
shown in Figure 9-10:
In [39]: RZ = reg_func(a, (X, Y))
In [40]: fig = plt.figure(figsize=(9, 6))
ax = fig.gca(projection='3d')
surf1 = ax.plot_surface(X, Y, Z, rstride=2, cstride=2,
                        cmap=mpl.cm.coolwarm, linewidth=0.5,
                        antialiased=True)
surf2 = ax.plot_wireframe(X, Y, RZ, rstride=2, cstride=2,
                          label='regression')
ax.set_xlabel('x')
ax.set_ylabel('y')
ax.set_zlabel('f(x, y)')
ax.legend()
fig.colorbar(surf1, shrink=0.5, aspect=5)
Figure 9-10. Higher-dimension regression
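The fitting step that produces the coefficient vector `a` lies outside this excerpt. The following is a minimal, self-contained sketch of how such coefficients can be obtained with `np.linalg.lstsq`; the example function `fm` and the `f6` basis term (assumed here to be a square-root term, since its definition is cut off above) are illustrative assumptions, not necessarily the originals:

```python
import numpy as np

def fm(p):
    # hypothetical example function (an assumption for this sketch)
    x, y = p
    return np.sin(x) + 0.25 * x + np.sqrt(y) + 0.05 * y ** 2

x = np.linspace(0, 10, 20)
y = np.linspace(0, 10, 20)
X, Y = np.meshgrid(x, y)
Z = fm((X, Y))

# basis matrix with one column per term of reg_func
# (column 6 assumed to be sqrt(y); its definition is missing above)
matrix = np.zeros((X.size, 7))
matrix[:, 6] = np.sqrt(Y.ravel())
matrix[:, 5] = np.sin(X.ravel())
matrix[:, 4] = Y.ravel() ** 2
matrix[:, 3] = X.ravel() ** 2
matrix[:, 2] = Y.ravel()
matrix[:, 1] = X.ravel()
matrix[:, 0] = 1

# ordinary least-squares fit of the coefficient vector a
a = np.linalg.lstsq(matrix, Z.ravel(), rcond=None)[0]
```

Because this `fm` happens to be a linear combination of the chosen basis functions, the regression reproduces it up to numerical precision.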
REGRESSION
Least-squares regression approaches have multiple areas of application, including simple function approximation
and function approximation based on noisy or unsorted data. These approaches can be applied to single as well as
multidimensional problems. Due to the underlying mathematics, the application is always “almost the same.”
Interpolation
Compared with regression, interpolation (e.g., with cubic splines) is much more involved
mathematically. It is also limited to low-dimensional problems. Given a set of
observation points ordered in the x dimension, the basic idea is to do a regression
between two neighboring data points in such a way that the data points are not only
perfectly matched by the resulting piecewise-defined interpolation function, but the
function is also continuously differentiable at the data points. Continuous
differentiability requires interpolation of at least degree 3, i.e., with cubic
splines. However, the