Principles of Mathematics in Operations Research

3.3 Summary for Ax = b

Fig. 3.6. Parametric solution: b ∈ R(A), A : m × n, and r = rank(A)

What if b ∉ R(A)? We cannot find a solution. For instance, it is quite
hard to fit a regression line passing through all observations. In this case, we
are interested in the solutions, x, yielding the least squared error ||b − Ax||².
If b ∈ N(A^T), the projection of b over R(A) is the null vector 0, so minimizing
||b − Ax||² forces Ax = 0. Therefore, N(A) is the collection of the solutions we seek.
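This special case can be checked numerically. The following sketch (not from the book; the matrix and vector are invented for illustration) uses NumPy: b is chosen orthogonal to both columns of A, i.e. b ∈ N(A^T), so the least squares solver projects b onto R(A) as the null vector and returns a solution with Ax = 0.

```python
import numpy as np

# Illustrative sketch: when b lies in N(A^T), the projection of b onto
# R(A) is 0, so every least squares solution x satisfies A x = 0.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])      # R(A) is the plane spanned by e1, e2 in R^3
b = np.array([0.0, 0.0, 5.0])   # orthogonal to both columns: b is in N(A^T)

x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
print(x)       # the zero vector, so A x = 0
print(A @ x)   # the projection of b onto R(A): the null vector
```

Here A has full column rank, so N(A) = {0} and the zero vector is the unique least squares solution; with a rank-deficient A the whole of N(A) would qualify.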

Fig. 3.7. Unique least squares solution: (A^T A) is invertible and A^+ = (A^T A)^{-1} A^T

If b is contained totally in neither R(A) nor N(A^T), we are faced with the
non-trivial least squared error minimization problem. If A^T A is invertible,
the unique solution is x = (A^T A)^{-1} A^T b as given in Figure 3.7. The regression
line in Problem 3.2 is such a solution. We may use the A = QR or A = Q_1 Σ Q_2^T
decompositions to find this solution easily, in these ways: Rx = Q^T b or
x = Q_2 Σ^+ Q_1^T b, respectively.
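The three routes above can be compared on a small example. This is a minimal numerical sketch (NumPy is an assumption; the data are random, not from the book): with A^T A invertible, the normal equations, the QR route, and the SVD route all produce the same unique solution x = (A^T A)^{-1} A^T b.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))   # tall matrix; full column rank, so A^T A is invertible
b = rng.standard_normal(6)

# Normal equations: solve (A^T A) x = A^T b
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# QR route: A = QR (reduced), then solve R x = Q^T b
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# SVD route: A = Q1 Sigma Q2^T (reduced), then x = Q2 Sigma^+ Q1^T b
Q1, s, Q2t = np.linalg.svd(A, full_matrices=False)
x_svd = Q2t.T @ ((Q1.T @ b) / s)

print(np.allclose(x_ne, x_qr), np.allclose(x_ne, x_svd))
```

In practice the QR and SVD routes are preferred over forming A^T A explicitly, since squaring the matrix squares its condition number.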
Otherwise, we have many x ∈ R^n leading to the least squares solution as
in Figure 3.8. Among these solutions, we are interested in the solution with
