Out[27]: [ 4.09 0.5 1.48 -1.85 1.65 4.51 -5.7 1.83 4.42 -4.2 ]
[ 1.23 0.72 1.74 -1.89 1.82 1.28 -2.3 1.88 1.25 -1.23]
As with the noisy data, the regression approach does not care about the order of the
observation points. This becomes obvious upon inspecting the structure of the
minimization problem in Equation 9-1. It is also obvious from the results, as presented in
Figure 9-8:
In [28]: reg = np.polyfit(xu, yu, 5)
         ry = np.polyval(reg, xu)
In [29]: plt.plot(xu, yu, 'b^', label='f(x)')
         plt.plot(xu, ry, 'ro', label='regression')
         plt.legend(loc=0)
         plt.grid(True)
         plt.xlabel('x')
         plt.ylabel('f(x)')
Figure 9-8. Regression with unsorted data
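As a quick check of this order independence (not part of the original session), one can reorder the unsorted sample and refit; the least-squares solution should agree up to floating-point noise. The sketch below assumes xu, yu, and reg as defined in the preceding inputs:
idx = np.argsort(xu)  # permutation that sorts the observation points
reg_sorted = np.polyfit(xu[idx], yu[idx], 5)  # refit on the reordered sample
np.allclose(reg, reg_sorted)  # expected: True (up to floating-point noise)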
Multiple dimensions
Another convenient characteristic of the least-squares regression approach is that it carries
over to multiple dimensions without too many modifications. As an example function we
take fm, as presented next:
In [30]: def fm(p):
             x, y = p
             return np.sin(x) + 0.25 * x + np.sqrt(y) + 0.05 * y ** 2
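Because fm is composed of NumPy universal functions and element-wise array arithmetic, it accepts array arguments just as well as scalars, which is what allows it to be evaluated on whole grids below. A quick illustrative check (not part of the original session):
fm((np.array([1.0, 2.0]), np.array([4.0, 9.0])))
  # returns one function value per (x, y) pair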
To visualize this function, we need a grid of (independent) data points:
In [31]: x = np.linspace(0, 10, 20)
         y = np.linspace(0, 10, 20)
         X, Y = np.meshgrid(x, y)
           # generates 2-d grids out of the 1-d arrays
         Z = fm((X, Y))
         x = X.flatten()
         y = Y.flatten()
           # yields 1-d arrays from the 2-d grids
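To make concrete how the least-squares approach carries over to two dimensions, the following sketch (not part of the original session) assembles a set of illustrative basis functions in x and y into a regression matrix, column by column, and solves the resulting linear least-squares problem with np.linalg.lstsq; the particular basis functions are assumptions for illustration only:
fm_values = fm((x, y))  # dependent data points on the flattened grid
matrix = np.zeros((len(x), 7))  # one column per basis function
matrix[:, 6] = np.sqrt(y)
matrix[:, 5] = np.sin(x)
matrix[:, 4] = y ** 2
matrix[:, 3] = x ** 2
matrix[:, 2] = y
matrix[:, 1] = x
matrix[:, 0] = 1  # constant term
reg2d = np.linalg.lstsq(matrix, fm_values, rcond=None)[0]  # optimal coefficients
fitted = np.dot(matrix, reg2d)  # regression estimates on the grid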
Based on the grids of independent and dependent data points, as now embodied by X, Y, and Z,
Figure 9-9 presents the shape of the function fm:
In [32]: from mpl_toolkits.mplot3d import Axes3D
         import matplotlib as mpl
         fig = plt.figure(figsize=(9, 6))
         ax = fig.gca(projection='3d')
         surf = ax.plot_surface(X, Y, Z, rstride=2, cstride=2,
                                cmap=mpl.cm.coolwarm,
                                linewidth=0.5, antialiased=True)
         ax.set_xlabel('x')
         ax.set_ylabel('y')