Pattern Recognition and Machine Learning

13. SEQUENTIAL DATA

Figure 13.1 Example of a spectrogram of the spoken words “Bayes’ theorem” showing a plot of the intensity of the spectral coefficients versus time index.


forms of sequential data, not just temporal sequences.
It is useful to distinguish between stationary and nonstationary sequential distributions. In the stationary case, the data evolves in time, but the distribution from which it is generated remains the same. For the more complex nonstationary situation, the generative distribution itself is evolving with time. Here we shall focus on the stationary case.
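The distinction can be made concrete with a small sketch (not from the book): two Gaussian sequences, one drawn from a fixed distribution and one whose mean drifts over time. The parameter values are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative sketch: stationary vs nonstationary sequences.
# In the stationary case the generating distribution is fixed;
# in the nonstationary case its mean drifts with time.
rng = np.random.default_rng(1)
T = 1000

stationary = rng.normal(loc=0.0, scale=1.0, size=T)

drift = np.linspace(0.0, 5.0, T)              # time-varying mean
nonstationary = drift + rng.normal(0.0, 1.0, size=T)

# The two halves of the stationary series have similar means;
# the halves of the nonstationary series do not.
print(stationary[:T // 2].mean(), stationary[T // 2:].mean())
print(nonstationary[:T // 2].mean(), nonstationary[T // 2:].mean())
```

Comparing summary statistics across time windows like this is a simple diagnostic for whether the stationarity assumption is plausible for a given data set.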
For many applications, such as financial forecasting, we wish to be able to predict the next value in a time series given observations of the previous values. Intuitively, we expect that recent observations are likely to be more informative than more historical observations in predicting future values. The example in Figure 13.1 shows that successive observations of the speech spectrum are indeed highly correlated. Furthermore, it would be impractical to consider a general dependence of future observations on all previous observations because the complexity of such a model would grow without limit as the number of observations increases. This leads us to consider Markov models in which we assume that future predictions are independent of all but the most recent observations.
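The idea can be sketched with a minimal first-order example (my own illustration, not the book's): an AR(1) process x_t = a·x_{t-1} + noise, where the prediction of the next value depends only on the most recent observation. The coefficient `a` and the noise scale are assumed illustrative values.

```python
import numpy as np

# A minimal first-order Markov sketch: each value depends only on its
# immediate predecessor. `a` and `noise_scale` are illustrative choices.
rng = np.random.default_rng(0)
a, noise_scale = 0.9, 0.1

# Generate a sequence; |a| < 1 keeps the process stationary.
x = np.zeros(200)
for t in range(1, len(x)):
    x[t] = a * x[t - 1] + noise_scale * rng.standard_normal()

# Estimate `a` by least squares on consecutive pairs, then predict the
# next value from the last observation alone (the Markov assumption).
a_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
x_next = a_hat * x[-1]
print(a_hat, x_next)
```

Because the model conditions only on the previous value, its complexity stays fixed no matter how long the observed sequence grows, which is exactly the property motivating the Markov assumption above.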