A First Course in FUZZY and NEURAL CONTROL

5.7. EXAMPLE 2: TRAINING A NEURAL NETWORK 191

[Plot: network output (vertical axis, 0.1–0.9) versus sample index (horizontal axis, 0–250) for the four trash classes: Bark, Stick, Leaf, and Pepper.]
Figure 5.18. Classification performance of [10, 10, 1] topology for Set 2 features


We now refer to Figures 5.14 and 5.15, in which the neural network was
trained on the Set 1 features listed in Table 5.1. From the results illustrated in
Figure 5.14, it is clear that the [10, 1] neural network performance is excellent for
the chosen classification criteria. With the addition of another hidden layer,
giving the [5, 10, 1] topology illustrated in Figure 5.15, the network performance
is even better.


Figures 5.16–5.18 present the classification results from neural networks
trained on the Set 2 features. From these results, it is clear that while the
performance criteria were indeed met, as in Figure 5.18, the results are not as
tightly bound as in Figure 5.15. The results in Figure 5.18, from a [10, 10, 1]
topology, show improvement over those from the [5, 10, 1] topology illustrated
in Figure 5.17: the addition of 5 neurons to the first hidden layer yields a marked
improvement in performance. From this, we can conclude that either increasing
the number of neurons or adding another hidden layer may improve the
performance. These additions, however, come at the expense of increased
computation and are therefore less attractive for on-line implementation of
trash identification.
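The computational cost of these additions can be made concrete by counting the weights and biases in each topology. The book does not specify the input dimension for this count, so the sketch below assumes a hypothetical 4-dimensional feature vector purely for illustration; the relative growth across topologies is what matters.

```python
def param_count(input_dim, layers):
    """Number of weights and biases in a fully connected network.

    layers is the topology notation used in the text, e.g. [5, 10, 1]
    means a 5-neuron hidden layer, a 10-neuron hidden layer, and a
    single output neuron.
    """
    total = 0
    prev = input_dim
    for n in layers:
        total += prev * n + n  # weight matrix plus bias vector
        prev = n
    return total

# Assumed 4-dimensional input (illustrative only, not from the text):
for topology in ([10, 1], [5, 10, 1], [10, 10, 1]):
    print(topology, param_count(4, topology))
```

With a 4-dimensional input, the counts are 61, 96, and 171 parameters respectively, so the [10, 10, 1] network carries nearly three times the per-sample computation of the [10, 1] network, which is the trade-off noted above for on-line use.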


It should be clear from this example that classification performance does
not necessarily improve with the addition of features. In fact, a simple rule
in pattern recognition and classification is to reduce the dimensionality of the
input space to a “sufficiently” small number and yet preserve classification
performance.
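One generic way to decide which features to keep — not the authors' method, just a common illustration of the dimensionality-reduction rule above — is to rank each feature by a Fisher-style separability score (between-class scatter divided by within-class scatter). The sketch below applies it to synthetic data in which only the first of four features actually carries class information.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature ratio of between-class to within-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

# Synthetic data: 3 classes, 4 features; only feature 0 is informative.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = np.repeat([0, 1, 2], 100)
X[:, 0] += 2.0 * y  # separate the class means along feature 0

scores = fisher_scores(X, y)
print(scores.argmax())  # feature 0 ranks highest
```

Features with low scores contribute little class separability and are candidates for removal, shrinking the network's input layer and hence its on-line computational cost.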
