A First Course in FUZZY and NEURAL CONTROL

6.5. NEURAL NETWORKS IN INDIRECT NEURAL CONTROL 221

trainlm is a network training function that updates weight and bias values according to Levenberg-Marquardt optimization. Other training algorithms can be explored to determine their effectiveness in meeting the desired convergence criteria.
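For reference, the Levenberg-Marquardt update used by trainlm can be written in its standard form, where x is the vector of all weights and biases, J is the Jacobian of the network errors e with respect to x, and mu is an adaptive damping parameter:

\[
\Delta x = -\left(J^{\top}J + \mu I\right)^{-1} J^{\top} e
\]

For large mu the step approaches small-step gradient descent, while as mu tends to zero it approaches the Gauss-Newton step; the algorithm adapts mu so that each step decreases the error.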


net = newff(minmax(nni),[1 10 1],{'tansig' 'tansig' 'tansig'},
'trainlm','learngdm','mse')

Step 6. Initialize the chosen neural network structure.


net = init(net);
net.iw{1,1}   % display the initialized input weights
net.b{1}      % display the initialized biases of the first layer

Step 7. Set the number of training epochs and the desired error tolerance.
You may have to increase the number of epochs if the convergence tolerance
is not met within the number specified. On the other hand, you may have to
increase the tolerance to obtain convergence. Nonconvergence generally is due
to ill-conditioned data. As such, some experimentation is needed in scaling the
training data to obtain a "suitable" set.
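One way to carry out such scaling is sketched below, assuming the toolbox functions premnmx and postmnmx that accompany newff; here plantin and plantout denote the raw training data:

```
% Scale inputs and targets to the range [-1,1] before training
[pn,minp,maxp,tn,mint,maxt] = premnmx(plantin,plantout);
net = train(net,pn,tn);               % train on the scaled data
an = sim(net,pn);                     % network output in scaled units
trainedout = postmnmx(an,mint,maxt);  % map the output back to original units
```

If the network is trained on scaled data, any new input must be scaled with the same minp and maxp (via tramnmx) before simulation.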


net.trainParam.epochs = 500;
net.trainParam.goal = 0.0005;

Step 8. Train the neural network for the input-output data.


net=train(net,plantin,plantout);

Step 9. Obtain the neural network response to the random input used for
training.


trainedout=sim(net,plantin);

Step 10. Compare the results of the actual plant response with that produced
by the neural network for the same input.


plot(plantout, 'b'); %Actual plant output
hold on;
plot(trainedout, ':k'); %Trained neural network output
grid;
axis([0, 300, -0.1, 0.3]);
xlabel('Time Step'); ylabel('Plant (solid) NN Output (dotted)');
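A numerical check of the fit can accompany the plot; a minimal sketch using the toolbox performance function mse already selected for this network:

```
err = plantout - trainedout;   % pointwise error between plant and network
perf = mse(err)                % mean squared error over all samples
```

A perf value near the training goal (0.0005 above) indicates the network has reproduced the plant response to within the specified tolerance.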