On the Problem of Local Minima in Backpropagation


Authored by: Marco Gori and Alberto Tesi
Paper Title: On the Problem of Local Minima in Backpropagation
In: IEEE Transactions on Pattern Analysis and Machine Intelligence
Vol. 14, No. 1
Publication Date: 1992
Pages: 76-86
Abstract: Supervised learning in Multi-Layered Neural Networks (MLNs) has recently been proposed through the well-known Backpropagation algorithm. Backpropagation is a gradient method, and as such it can get stuck in local minima, as simple examples show. In this paper, conditions on the network architecture and the learning environment are proposed which ensure the convergence of the Backpropagation algorithm. In particular, it is proven that convergence holds if the classes are linearly separable. In this case, experience gained in several experiments shows that MLNs exceed perceptrons in generalization to new examples.
Keywords: Multi-Layered Networks, learning environment, Backpropagation, pattern recognition, linearly separable classes
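The abstract's central point, that a gradient method can converge to a local rather than the global minimum depending on where it starts, can be illustrated with a minimal sketch. This example is not from the paper: it applies plain gradient descent to a hypothetical one-dimensional non-convex loss (standing in for an MLN error surface), with the learning rate and starting points chosen for illustration.

```python
# Illustrative sketch (assumed example, not from the paper): gradient
# descent on a non-convex 1-D loss with one global and one local minimum.

def loss(w):
    # f(w) = w^4 - 3w^2 + w has its global minimum near w ~ -1.30
    # and a strictly worse local minimum near w ~ 1.13.
    return w**4 - 3 * w**2 + w

def grad(w):
    # Derivative of the loss above.
    return 4 * w**3 - 6 * w + 1

def descend(w, lr=0.01, steps=2000):
    # Plain gradient descent: follow the negative gradient from w.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

w_global = descend(-2.0)  # starts in the basin of the global minimum
w_local = descend(2.0)    # starts in the basin of the local minimum

print(w_global, loss(w_global))
print(w_local, loss(w_local))
```

Both runs converge to a stationary point, but the second ends at a higher loss value: the same phenomenon that motivates the paper's search for conditions (such as linearly separable classes) under which Backpropagation provably converges to an optimal solution.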