A First Course in FUZZY and NEURAL CONTROL

6.4. EXAMPLE: TEMPERATURE CONTROL 207

the number of outputs we wish to generate. For example, in a system with
one binary output, we will need one output neuron. The number of neurons
in the hidden layers is selected by trial and error. To start with, we choose
the first layer (the hidden layer) to have 10 neurons and the second layer (the
output layer) to have 1 neuron. All 10 neurons in the hidden layer are assigned
tangent-sigmoid nonlinear (squashing) functions. In Matlab this is
the function "tansig". The output neuron is assigned the linear output function
"purelin". Note that "purelin" provides both positive and negative outputs,
as opposed to "poslin", which provides only positive output values. We will use
"trainlm", which is the Levenberg-Marquardt optimization algorithm,
as the default neural network training algorithm. The default backpropagation
learning function and mean-squared error performance function are "learngdm"
and "mse", respectively.
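The three transfer functions named above can be mirrored in a short NumPy sketch. The formulas are the documented ones: tansig(n) = 2/(1 + e^(-2n)) - 1 (mathematically equivalent to tanh), purelin(n) = n, and poslin(n) = max(0, n); the Python function names are ours, chosen to match the Matlab names.

```python
import numpy as np

def tansig(n):
    """Tangent-sigmoid squashing function: outputs in (-1, 1)."""
    return 2.0 / (1.0 + np.exp(-2.0 * n)) - 1.0

def purelin(n):
    """Linear output function: passes positive and negative values unchanged."""
    return n

def poslin(n):
    """Positive linear function: clips negative inputs to zero."""
    return np.maximum(0.0, n)
```

For example, purelin(-2) returns -2 while poslin(-2) returns 0, which is why "purelin" is the right choice when the network must reproduce both branches of the relay characteristic.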
The Matlab setup that provides the desired neural network topology is as
follows:


net = newff([min(p) max(p)],[10 1],{'tansig','purelin'},
'trainlm','learngdm','mse');
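The topology that this call builds, a 10-neuron tansig hidden layer feeding a single purelin output neuron, can be sketched as a forward pass in NumPy. The weights here are random placeholders standing in for the values that training will later produce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for the 10-1 topology described above
# (the real values are determined by training, not chosen by hand).
W1 = rng.standard_normal((10, 1))  # hidden-layer weights: 10 neurons, 1 input
b1 = rng.standard_normal((10, 1))  # hidden-layer biases
W2 = rng.standard_normal((1, 10))  # output-layer weights: 1 neuron, 10 inputs
b2 = rng.standard_normal((1, 1))   # output-layer bias

def forward(p):
    """Forward pass: tansig hidden layer followed by a purelin output."""
    a1 = np.tanh(W1 @ p + b1)  # tansig is mathematically tanh
    return W2 @ a1 + b2        # purelin is the identity

p = np.array([[0.5]])  # one scalar input sample
y = forward(p)
print(y.shape)         # one output neuron -> shape (1, 1)
```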

We also need to specify the number of epochs over which training is performed
and the mean-squared error goal. In this example we set mse = 0.179 and the
training period to 10,000 epochs. Note that the number of epochs is the maximum
number of training cycles the neural network is allowed in trying to meet the
mean-squared error criterion. This section of the Matlab code
is as follows:


net.trainParam.epochs = 10000;
net.trainParam.goal = 0.179;
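The interplay between the epoch limit and the error goal can be sketched as a generic training loop. The step function here is a toy placeholder, not Levenberg-Marquardt; the point is only the stopping logic that trainParam.epochs and trainParam.goal control.

```python
def train_until(goal, max_epochs, step):
    """Run step() once per epoch until the mse goal is met or epochs run out.

    step() performs one training cycle and returns the current mse.
    """
    for epoch in range(1, max_epochs + 1):
        mse = step()
        if mse <= goal:
            return epoch, mse  # goal met: stop early
    return max_epochs, mse     # epoch budget exhausted

# Toy example: pretend the mse halves every epoch, starting from 1.0.
state = {"mse": 1.0}
def fake_step():
    state["mse"] *= 0.5
    return state["mse"]

epochs_used, final_mse = train_until(goal=0.179, max_epochs=10000, step=fake_step)
print(epochs_used, final_mse)  # stops after 3 epochs, since 0.125 <= 0.179
```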

Before proceeding to the training phase, it would be prudent to initialize all the
weights and biases of the neural network. For this we use Matlab's initialization
code as follows:


net = init(net);
net.iw{1,1} % display the initialized input-layer weights
net.b{1}    % display the initialized first-layer biases

Now we are ready to train the neural network. The following Matlab code
performs the training.


net = train(net,p,t);

It is always good to check how well the neural network has been trained by
re-simulating it on the training function, in this case the relay characteristic.
The following simulation and plot commands allow visual confirmation of the
trained neural network response. Figure 6.5 plots the original function with "o"
and the output "y1" of the trained network with "*".


y1 = sim(net,p);
plot(p,t,'o',p,y1,'*');
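A quantitative complement to the visual check is to compute the achieved mean-squared error directly. In NumPy terms, with hypothetical stand-ins for the Matlab target vector t and simulated output y1:

```python
import numpy as np

# Hypothetical stand-ins for the relay targets t and the network output y1.
t  = np.array([-1.0, -1.0, 1.0, 1.0])  # relay characteristic targets
y1 = np.array([-0.9, -1.1, 0.8, 1.2])  # simulated network outputs

mse = np.mean((t - y1) ** 2)  # same criterion as Matlab's 'mse'
print(mse)
```

If the printed value is at or below the training goal (0.179 in this example), the network met its error criterion on the training data.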