|Volume 1: No. 24|
Maureen Caudill [AI Expert, 7/91] gives some hints on selecting the right neural network architecture for your application. (I've added a bit, so don't blame her for errors.) First, of course, make sure that rules or algorithmic solutions are not sufficient. Try to break your application into pieces and analyze each separately.
Mapping problems convert input vectors to a set of known/trained output vectors. The network should interpolate a solution for vectors not previously seen. (This property is especially useful in fuzzy control.) Consider using backpropagation, counterpropagation, or functional-link nets. Backpropagation may be too slow when learning nonmonotonic functions, or when input vectors or training sets are large.
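To make the mapping idea concrete, here is a minimal sketch (mine, not Caudill's) of a one-input backpropagation net with a sigmoid hidden layer and linear output, trained on a few samples of y = x^2. The function names, network size, and learning rate are illustrative choices, not a prescription.

```python
import math
import random

random.seed(0)

def train_backprop(samples, hidden=4, lr=0.1, epochs=20000):
    """Tiny 1-input, 1-output backpropagation net (sigmoid hidden
    layer, linear output) trained by per-sample gradient descent."""
    w1 = [random.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [random.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [1.0 / (1.0 + math.exp(-(w1[j] * x + b1[j])))
             for j in range(hidden)]
        return h, sum(w2[j] * h[j] for j in range(hidden)) + b2

    for _ in range(epochs):
        for x, t in samples:
            h, y = forward(x)
            err = y - t                                   # d(loss)/d(output)
            for j in range(hidden):
                grad = err * w2[j] * h[j] * (1.0 - h[j])  # back through sigmoid
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad * x
                b1[j] -= lr * grad
            b2 -= lr * err

    return lambda x: forward(x)[1]

# Four samples of y = x**2; the trained net can also be queried at
# inputs it never saw during training, e.g. net(0.5).
samples = [(0.0, 0.0), (0.25, 0.0625), (0.75, 0.5625), (1.0, 1.0)]
net = train_backprop(samples)
```

The interpolation property Caudill mentions falls out of the smooth hidden units: querying the trained net between training points gives a blended answer rather than a lookup failure.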
Temporal mapping adds the notion of sequence, often by using a shift register as the input. Recurrent networks feed previous outputs back to combine with inputs, permitting temporal differencing and some notion of dynamics. Avalanche and backpropagation networks may also be used. Beware, though: recurrent networks tend to take much more memory and processing time than even backpropagation.
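The shift-register trick is easy to show in code. This sketch (illustrative, not from the column) turns a time series into (window, next-value) training pairs the way a tapped delay line feeds a temporal mapping network:

```python
def shift_register_windows(seq, width):
    """Slide a fixed-width window over a sequence, pairing each
    window (the shift-register contents) with the value that
    follows it -- the target a temporal mapping net would learn."""
    return [(tuple(seq[i:i + width]), seq[i + width])
            for i in range(len(seq) - width)]

pairs = shift_register_windows([1, 2, 3, 4, 5], 3)
# window (1, 2, 3) predicts 4; window (2, 3, 4) predicts 5
```

A recurrent network replaces the explicit window with fed-back state, which is why its memory and processing costs grow beyond even backpropagation's.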
Classification is a mapping problem in which interpolated results are forced to the nearest legal class. This is sometimes achieved with a final winner-take-all associative network.
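The winner-take-all step is just a hard argmax over the output units. A minimal sketch (my illustration):

```python
def winner_take_all(activations):
    """Force an interpolated output vector to the nearest legal
    class: the strongest unit wins, all others are zeroed."""
    k = max(range(len(activations)), key=activations.__getitem__)
    return [1.0 if i == k else 0.0 for i in range(len(activations))]

decision = winner_take_all([0.2, 0.7, 0.1])  # unit 1 wins
```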
Associative memory problems require that one of the original training vectors be reproduced. Each training vector is treated as an output class, and interpolation is usually not permitted in the final output. Consider backpropagation, counterpropagation, Kohonen, and crossbar networks.
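For flavor, here is a crossbar associative memory in the Hopfield style, sketched by me as one simple instance of the idea: weights are a sum of outer products of the stored +/-1 vectors, and recall iterates the state until it settles on a stored pattern.

```python
def hebbian_weights(patterns):
    """Crossbar weights: sum of outer products of the stored
    +/-1 patterns, with no self-connections."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, probe, steps=5):
    """Repeatedly threshold the weighted sums; a noisy probe
    settles onto the nearest stored training vector."""
    s = list(probe)
    n = len(s)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

stored = [[1, 1, 1, 1, -1, -1, -1, -1],
          [1, -1, 1, -1, 1, -1, 1, -1]]
w = hebbian_weights(stored)
noisy = [-1] + stored[0][1:]   # first pattern with one bit flipped
```

Note how this matches the definition above: the output is one of the original training vectors, not an interpolation between them.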
Stochastic classification must allow for errors in the training data, so input vectors need not map to their originally assigned classes. Consider using a Kohonen network to model the probability distribution function. The learning vector-quantization form of Kohonen network may also be useful.
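The learning-vector-quantization rule is compact enough to sketch; this is my illustration of a single LVQ1 update, with names and the learning rate chosen for clarity: the prototype nearest the input is pulled toward it when their class labels agree and pushed away when they disagree, so mislabeled training points do only bounded damage.

```python
def lvq_step(prototypes, labels, x, y, lr=0.1):
    """One LVQ1 update: find the prototype nearest to input x;
    pull it toward x if its label matches y, else push it away."""
    dists = [sum((pi - xi) ** 2 for pi, xi in zip(p, x))
             for p in prototypes]
    k = min(range(len(prototypes)), key=dists.__getitem__)
    sign = 1.0 if labels[k] == y else -1.0
    prototypes[k] = [pi + sign * lr * (xi - pi)
                     for pi, xi in zip(prototypes[k], x)]
    return k

prototypes = [[0.0, 0.0], [1.0, 1.0]]
labels = ["a", "b"]
won = lvq_step(prototypes, labels, [0.2, 0.0], "a")  # prototype 0 moves
```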
Categorization or clustering problems have no "correct" output initially; the network's job is to discover cluster centers and use them as output vectors. Consider self-organizing networks such as Kohonen, adaptive resonance (ART), Adaline, and probabilistic nets.
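As a final sketch (again mine, with illustrative parameters), here is the unsupervised competitive-learning core shared by Kohonen-style clustering: each input moves only the nearest of k cluster centers toward itself, so the centers discover the clusters without any "correct" outputs being given.

```python
import random

def competitive_cluster(data, k=2, lr=0.2, epochs=50, seed=0):
    """Winner-take-all competitive learning: for each input, move
    only the nearest of k cluster centers a fraction lr toward it."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(data, k)]  # distinct seeds
    for _ in range(epochs):
        for x in data:
            j = min(range(k),
                    key=lambda c: sum((centers[c][d] - x[d]) ** 2
                                      for d in range(len(x))))
            for d in range(len(x)):
                centers[j][d] += lr * (x[d] - centers[j][d])
    return centers

# Two obvious clusters; the centers should settle near their means.
data = [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]]
centers = competitive_cluster(data)
```

A full Kohonen map adds a neighborhood function so nearby units move together; ART-style nets instead grow a new category when no existing center matches well enough.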