

9.9.A Smoothing


As the name suggests, smoothing refers to removing small fluctuations from data. In time series analysis, however, smoothing can destroy information, which is of course not desirable. For example, detector data generally have low-level fluctuations embedded in the signal output. Since these fluctuations can give insight into the noise sources, it is not a good idea to smooth them out. In some instances, however, one is interested in determining the baseline of the output, as shown in Fig. 9.9.1. Baselines are generally determined by taking the average of all the data points. This gives a constant value that can be used in the computer code as a reference to determine the parameters related to the pulse, as sketched below.
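As an illustrative sketch (not from the text), the following Python snippet estimates a baseline this way; the sample values and names are made-up placeholders.

    def baseline(samples):
        # Baseline as the simple average of all data points,
        # giving a constant reference value for pulse analysis.
        return sum(samples) / len(samples)

    # Hypothetical digitized pre-pulse samples from a detector channel.
    samples = [0.1, -0.2, 0.0, 0.15, -0.05]
    print(baseline(samples))  # constant reference value: 0.0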
Another smoothing technique is the so-called moving average. This involves taking averages of smaller subsets of the data in succession. For example, suppose we want to compute the moving averages of the data: 3, 4, 2, 6, 4, 3, 2, 5, ....

To determine the moving average we first take the first three points and calculate their average, which turns out to be


\[
\frac{3+4+2}{3} = 3.
\]

Next we move one step forward and take the average of three data points excluding
the first one, that is
\[
\frac{4+2+6}{3} = 4.
\]

In the next step we also drop the second point and compute the average of the next three numbers, (2+6+4)/3 = 4, and so on. Note that one can choose essentially any number of data points per set; the choice depends on how rapidly the data vary.
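A minimal Python sketch of this three-point moving average, applied to the example data above (the function and variable names are our own, not from the text):

    def moving_average(data, window=3):
        # Average each successive overlapping subset of `window` points,
        # advancing the subset by one data point at each step.
        return [sum(data[i:i + window]) / window
                for i in range(len(data) - window + 1)]

    data = [3, 4, 2, 6, 4, 3, 2, 5]
    print(moving_average(data))
    # -> [3.0, 4.0, 4.0, 4.33..., 3.0, 3.33...]
    # The first two values reproduce the averages computed above.

A larger window smooths more aggressively at the cost of washing out genuine features, which is why the window size should be chosen according to how the data vary.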


Figure 9.9.1: Simple and moving average smoothing of the baseline of a typical detector pulse (two panels, each plotting amplitude versus time: one for the simple average, one for the moving average). The moving averages were calculated by taking the simple average of 3-point sets in succession, such that at each subsequent step the set is moved forward by one data point.