Convergence of a Neural Network Classifier

Title: Convergence of a Neural Network Classifier
Authors: Baras, John S.; LaVigna, Anthony

Conference: The 29th Conference on Decision and Control (CDC), pp. 1735-1740
Date: December 1990

In this paper, we show that the Learning Vector Quantization (LVQ) learning algorithm converges to locally asymptotically stable equilibria of an ordinary differential equation. We show that the learning algorithm is a stochastic approximation procedure. Convergence of the Voronoi vectors is guaranteed under appropriate conditions on the underlying statistics of the classification problem. We also present a modification to the learning algorithm which, we argue, results in convergence of LVQ for a larger set of initial conditions. Finally, we show that LVQ is a general histogram classifier and that its risk converges to the Bayesian optimal risk as the appropriate parameters go to infinity with the number of past observations.
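The abstract does not reproduce the LVQ recursion itself, so the following is only a minimal sketch of the standard Kohonen LVQ1 update with a decreasing gain sequence of the kind a stochastic-approximation analysis requires. The gain schedule alpha_n = a / (b + n), the function names, and all parameter values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, n_steps=10000, a=1.0, b=100.0, seed=0):
    """Sketch of an LVQ1-style training loop (illustrative, not the paper's exact algorithm).

    prototypes   : (K, d) array of Voronoi (codebook) vectors
    proto_labels : (K,) class label attached to each prototype
    The decreasing gain alpha_n = a / (b + n) satisfies the Robbins-Monro
    conditions (sum alpha_n = inf, sum alpha_n^2 < inf) that underlie the
    ODE / stochastic-approximation view of the recursion.
    """
    rng = np.random.default_rng(seed)
    W = prototypes.astype(float).copy()
    for n in range(n_steps):
        i = rng.integers(len(X))                          # draw a training sample
        x, label = X[i], y[i]
        c = np.argmin(np.linalg.norm(W - x, axis=1))      # winning prototype (Voronoi cell of x)
        alpha = a / (b + n)                               # decreasing learning rate
        if proto_labels[c] == label:
            W[c] += alpha * (x - W[c])                    # attract winner when labels agree
        else:
            W[c] -= alpha * (x - W[c])                    # repel winner when labels disagree
    return W

def lvq_classify(x, W, proto_labels):
    """Assign x the label of its nearest Voronoi vector."""
    return proto_labels[np.argmin(np.linalg.norm(W - x, axis=1))]
```

The nearest-prototype decision rule partitions the input space into Voronoi cells, which is what connects the trained codebook to the histogram-classifier interpretation discussed in the abstract.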
