In [17]: plt.plot(x, f(x), 'b', label='f(x)')
         plt.plot(x, ry, 'r.', label='regression')
         plt.legend(loc=0)
         plt.grid(True)
         plt.xlabel('x')
         plt.ylabel('f(x)')
Figure 9-5. Regression via least-squares function
The result in Figure 9-5 is not as good as might be expected based on our previous experience with monomials. Using the more general approach, however, allows us to exploit our knowledge about the example function: we know that it contains a sine component, so it makes sense to include a sine function in the set of basis functions. For simplicity, we replace the highest-order monomial:
In [18]: matrix[3, :] = np.sin(x)
         reg = np.linalg.lstsq(matrix.T, f(x))[0]
         ry = np.dot(reg, matrix)
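If you are running this section in isolation, note that the example function f, the grid x, and the basis-function matrix are constructed earlier in the chapter and not shown here. A minimal, self-contained sketch of the whole step, assuming f(x) = sin(x) + 0.5 x and a symmetric grid (both assumptions based on the surrounding context), might look like this:

import numpy as np

# Assumed setup (not shown in this excerpt): an example function with
# a sine component and an evaluation grid
def f(x):
    return np.sin(x) + 0.5 * x  # assumption based on context

x = np.linspace(-2 * np.pi, 2 * np.pi, 50)

# Basis-function matrix: one row per basis function evaluated on x;
# np.sin(x) occupies the slot of the former highest-order monomial
matrix = np.zeros((4, len(x)))
matrix[3, :] = np.sin(x)
matrix[2, :] = x ** 2
matrix[1, :] = x
matrix[0, :] = 1

# Least-squares estimation of the coefficients and the fitted values
reg = np.linalg.lstsq(matrix.T, f(x), rcond=None)[0]
ry = np.dot(reg, matrix)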
Figure 9-6 illustrates that the regression is now pretty close to the original function:
In [19]: plt.plot(x, f(x), 'b', label='f(x)')
         plt.plot(x, ry, 'r.', label='regression')
         plt.legend(loc=0)
         plt.grid(True)
         plt.xlabel('x')
         plt.ylabel('f(x)')
Figure 9-6. Regression using individual functions
Indeed, the regression is now "perfect" in a numerical sense:
In [20]: np.allclose(f(x), ry)
Out[20]: True
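Note that np.allclose checks equality only up to floating-point tolerances (rtol=1e-05 and atol=1e-08 by default), so "perfect" here means that deviations are at the level of numerical noise. This outcome is possible because f lies exactly in the span of the basis functions; under the setup sketched above, the estimated coefficients should recover the exact decomposition f(x) = 1 · sin(x) + 0.5 · x + 0 · x^2 + 0 · 1:

import numpy as np

# Assuming reg from the sketch above, with basis rows (1, x, x**2, sin(x))
print(reg.round(4))  # expected: [ 0.   0.5  0.   1. ]

# np.allclose compares within tolerances rather than for exact equality
print(np.allclose(1.0, 1.0 + 1e-9))  # True: within default tolerances
print(np.allclose(1.0, 1.0 + 1e-3))  # False: exceeds default tolerances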