In this chapter, you learned about one of the most powerful neural network algorithms, called backpropagation. The network has no feedback connections; only errors are propagated back, appropriately, to the hidden-layer and input-layer connections. The algorithm uses the so-called generalized delta rule and trains the network with exemplar pairs of patterns. It is difficult to determine how many hidden-layer neurons should be provided, and there can be more than one hidden layer. In general, the size of the hidden layer(s) is related to the features or distinguishing characteristics that should be discerned from the data. Our example in this chapter covers the simple case of a single hidden layer. The outputs of the output neurons, and therefore of the network, are vectors with components between 0 and 1, since the thresholding function is the sigmoid function. These values can be scaled, if necessary, to obtain values in another interval.
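As a minimal sketch of the generalized delta rule described above (not the book's own listing), the following self-contained C++ fragment trains the weights feeding a single sigmoid output neuron from a fixed hidden-layer vector; the names hidden, target, and eta are illustrative assumptions.

// Sketch: generalized delta rule for one sigmoid output neuron.
// The data values and names here are illustrative, not from the book.
#include <cmath>
#include <cstdio>
#include <vector>

// Sigmoid thresholding function: squashes any input into (0, 1).
double sigmoid(double x) { return 1.0 / (1.0 + std::exp(-x)); }

int main() {
    std::vector<double> hidden = {0.2, 0.9, 0.4};  // hidden-layer outputs
    std::vector<double> w      = {0.5, -0.3, 0.8}; // hidden-to-output weights
    double target = 0.7;   // desired output for this exemplar
    double eta    = 0.25;  // learning-rate parameter

    for (int epoch = 0; epoch < 50; ++epoch) {
        double net = 0.0;
        for (std::size_t i = 0; i < w.size(); ++i) net += w[i] * hidden[i];
        double out = sigmoid(net);

        // Generalized delta rule: the error is scaled by the sigmoid's
        // derivative, out * (1 - out), before adjusting each weight.
        double delta = (target - out) * out * (1.0 - out);
        for (std::size_t i = 0; i < w.size(); ++i)
            w[i] += eta * delta * hidden[i];

        if (epoch % 10 == 0) std::printf("epoch %2d  output %.4f\n", epoch, out);
    }
    return 0;
}

Note also that mapping an output y in (0, 1) onto another interval (a, b) needs only the linear rescale a + (b - a) * y.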
Our example does not relate to any particular function to be computed by the network; the inputs and outputs were chosen randomly. What this tells you is that, even if you do not know the functional relationship between two sets of vectors, the feedforward backpropagation network can find the mapping for any vector in the domain, even though the functional equation itself is never found. For all we know, that function could be nonlinear as well.
There is one important fact you need to remember about the backpropagation algorithm: its steepest descent procedure in training does not guarantee finding a global, or overall, minimum; it can find only a local minimum of the energy surface.
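To make this caveat concrete, here is a small C++ sketch; the function f(w) = w^4 - 3w^2 + w is a made-up example, not from the book. Steepest descent started from two different points settles into two different minima, and the start at 2.0 ends in the shallower, merely local one.

// Steepest descent on f(w) = w^4 - 3w^2 + w, which has two minima.
// Which minimum the descent finds depends entirely on the starting point.
#include <cstdio>

double grad(double w) { return 4 * w * w * w - 6 * w + 1; } // f'(w)

double descend(double w, double eta, int steps) {
    for (int i = 0; i < steps; ++i) w -= eta * grad(w); // steepest descent step
    return w;
}

int main() {
    // Same update rule, same step size -- different minima.
    std::printf("start  2.0 -> w = %.4f\n", descend( 2.0, 0.01, 1000)); // local minimum near 1.18
    std::printf("start -2.0 -> w = %.4f\n", descend(-2.0, 0.01, 1000)); // global minimum near -1.35
    return 0;
}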