Convergence of Stochastic Vector Quantization and Learning Vector Quantization under Bregman Divergences

Title: Convergence of Stochastic Vector Quantization and Learning Vector Quantization under Bregman Divergences
Authors:
Baras, John S.
Mavridis, Christos N.

Conference: 21st IFAC World Congress 2020 (IFAC), Berlin, pp. 2244-2249
Date: July 12 - July 17, 2020

Abstract: Stochastic vector quantization methods have been extensively studied as supervised (classification) and unsupervised (clustering) learning algorithms, since they are online, data-driven, interpretable, robust, and fast to train and evaluate. As prototype-based methods, they depend on a dissimilarity measure which, when the mean value is used as the representative of each cluster, can be shown to necessarily and sufficiently belong to the family of Bregman divergences. In this work, we investigate the convergence properties of stochastic vector quantization (VQ) and its supervised counterpart, Learning Vector Quantization (LVQ), under Bregman divergences. We employ the theory of stochastic approximation to study the conditions on the initialization and on the Bregman divergence generating functions under which the algorithms converge to desired configurations. These results formally support the use of Bregman divergences, such as the Kullback-Leibler divergence, in vector quantization algorithms.
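To illustrate the setting the abstract describes, the following is a minimal Python sketch of online (stochastic) vector quantization under a Bregman divergence: each sample is assigned to the prototype with the smallest divergence, and the winner is moved toward the sample with a decreasing step size, which is justified by the fact that the mean remains the optimal cluster representative for any Bregman divergence. This is an illustrative sketch, not the paper's exact algorithm; the function names, step-size schedule, and toy data are assumptions made here for demonstration.

```python
import numpy as np

def bregman_divergence(x, y, phi, grad_phi):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

def squared_euclidean():
    # phi(x) = ||x||^2 recovers (a multiple of) the squared Euclidean distance
    return (lambda v: np.dot(v, v), lambda v: 2.0 * v)

def generalized_kl():
    # phi(x) = sum_i x_i log x_i on positive vectors gives a generalized
    # Kullback-Leibler divergence
    return (lambda v: np.sum(v * np.log(v)), lambda v: np.log(v) + 1.0)

def online_bregman_vq(data_stream, prototypes, phi, grad_phi,
                      step=lambda t: 1.0 / (t + 1)):
    """One pass of online VQ: assign each sample to the prototype with the
    smallest Bregman divergence, then move that prototype toward the sample
    using a decreasing stochastic-approximation step size."""
    mu = np.array(prototypes, dtype=float)
    for t, x in enumerate(data_stream):
        divergences = [bregman_divergence(x, m, phi, grad_phi) for m in mu]
        i = int(np.argmin(divergences))     # winning prototype
        mu[i] += step(t) * (x - mu[i])      # stochastic update toward sample
    return mu

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: two clusters of 2-D points
    samples = rng.random((500, 2)) + rng.integers(0, 2, size=(500, 1)) * 2.0
    phi, grad_phi = squared_euclidean()
    init = rng.random((2, 2))
    print(online_bregman_vq(samples, init, phi, grad_phi))
```

Swapping in `generalized_kl()` (on strictly positive data) changes only the dissimilarity used for the winner assignment; the prototype update itself is unchanged, which is the centroid property the abstract refers to.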

Keywords: learning algorithms, stochastic approximation, convergence proofs
