Mathematical Methods for Physics and Engineering: A Comprehensive Guide


31.6 THE METHOD OF LEAST SQUARES


[Figure 31.9 here]

Figure 31.9 A set of data points with error bars indicating the uncertainty σ = 0.5 on the y-values. The straight line is y = m̂x + ĉ, where m̂ and ĉ are the least-squares estimates of the slope and intercept.

L ∝ exp(−χ²/2) is Gaussian. From the discussions of subsections 31.3.6 and 31.5.6, it follows that the 'surfaces' χ²(a) = c, where c is a constant, bound ellipsoidal confidence regions for the parameters a_i. The relationship between the value of the constant c and the confidence level is given by (31.39).
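The ellipsoidal shape of these confidence regions can be checked numerically. The sketch below (our own illustration, using the straight-line data from the worked example that follows) verifies that for a model linear in its parameters, χ² is exactly quadratic about the LS estimate â, so the surfaces χ²(a) = c are ellipses centred on â:

```python
import numpy as np

# Data from the worked example below (sigma = 0.5 on each y-value).
x = np.array([1.85, 2.72, 2.81, 3.06, 3.42, 3.76, 4.31, 4.47, 4.64, 4.99])
y = np.array([2.26, 3.10, 3.80, 4.11, 4.74, 4.31, 5.24, 4.03, 5.69, 6.57])
sigma = 0.5

R = np.column_stack([np.ones_like(x), x])    # response matrix for y = c + m*x
a_hat = np.linalg.solve(R.T @ R, R.T @ y)    # LS estimates (c_hat, m_hat)

def chi2(a):
    r = y - R @ a
    return float(r @ r) / sigma**2

# Since R^T (y - R a_hat) = 0 at the LS solution, chi^2 is exactly quadratic:
#   chi^2(a_hat + d) - chi^2(a_hat) = d^T (R^T R) d / sigma^2,
# a positive-definite quadratic form, so chi^2(a) = c bounds an ellipse.
d = np.array([0.3, -0.1])                    # an arbitrary parameter displacement
lhs = chi2(a_hat + d) - chi2(a_hat)
rhs = float(d @ (R.T @ R) @ d) / sigma**2
print(abs(lhs - rhs) < 1e-9)                 # the quadratic relation holds exactly
```

The cross term vanishes because the residual vector at â is orthogonal to the columns of R, which is precisely the normal-equations condition.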


An experiment produces the following data sample pairs (x_i, y_i):

x_i: 1.85  2.72  2.81  3.06  3.42  3.76  4.31  4.47  4.64  4.99
y_i: 2.26  3.10  3.80  4.11  4.74  4.31  5.24  4.03  5.69  6.57

where the x_i-values are known exactly but each y_i-value is measured only to an accuracy of σ = 0.5. Assuming the underlying model for the data to be a straight line y = mx + c, find the LS estimates of the slope m and intercept c and quote the standard error on each estimate.

The data are plotted in figure 31.9, together with error bars indicating the uncertainty in the y_i-values. Our model of the data is a straight line, and so we have

f(x; c, m) = c + mx.

In the language of (31.92), our basis functions are h_1(x) = 1 and h_2(x) = x, and our model parameters are a_1 = c and a_2 = m. From (31.93) the elements of the response matrix are R_ij = h_j(x_i), so that


R = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_N \end{pmatrix}, \qquad (31.100)
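The response matrix (31.100) is straightforward to construct directly from the data; a minimal numpy sketch (variable names are ours):

```python
import numpy as np

# x-values from the data sample above
x = np.array([1.85, 2.72, 2.81, 3.06, 3.42, 3.76, 4.31, 4.47, 4.64, 4.99])

# Response matrix R_ij = h_j(x_i), with basis functions h_1(x) = 1, h_2(x) = x:
# a column of ones alongside the column of x-values.
R = np.column_stack([np.ones_like(x), x])
print(R.shape)   # (10, 2): one row per data point, one column per basis function
```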


where x_i are the data values and N = 10 in our case. Further, since the standard deviation on each measurement error is σ, the error covariance matrix is N = σ²I, where I is the N×N identity matrix. Because of this simple form for N, the expression (31.98) for the LS estimates reduces to


\hat{a} = \sigma^2 (R^{\mathrm{T}} R)^{-1} \, \frac{1}{\sigma^2} \, R^{\mathrm{T}} y = (R^{\mathrm{T}} R)^{-1} R^{\mathrm{T}} y. \qquad (31.101)
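Expression (31.101) can be evaluated numerically for the data above. A minimal sketch (our own, not from the text), which also reads the standard errors off the diagonal of the estimator covariance matrix V = σ²(RᵀR)⁻¹:

```python
import numpy as np

x = np.array([1.85, 2.72, 2.81, 3.06, 3.42, 3.76, 4.31, 4.47, 4.64, 4.99])
y = np.array([2.26, 3.10, 3.80, 4.11, 4.74, 4.31, 5.24, 4.03, 5.69, 6.57])
sigma = 0.5

R = np.column_stack([np.ones_like(x), x])     # response matrix (31.100)
a_hat = np.linalg.solve(R.T @ R, R.T @ y)     # (31.101): (R^T R)^{-1} R^T y
V = sigma**2 * np.linalg.inv(R.T @ R)         # covariance of the estimators
errors = np.sqrt(np.diag(V))                  # standard errors on c_hat, m_hat

c_hat, m_hat = a_hat
print(f"c = {c_hat:.2f} +/- {errors[0]:.2f}")
print(f"m = {m_hat:.2f} +/- {errors[1]:.2f}")
```

For this data set the sketch gives m̂ ≈ 1.11 ± 0.17 and ĉ ≈ 0.39 ± 0.62. Using `np.linalg.solve` rather than forming the inverse explicitly for â is the numerically preferred route; the explicit inverse is kept only where the covariance matrix itself is wanted.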

Note that we cannot expand the inverse in the last line, since R itself is not square and
