Convergence of Stochastic Vector Quantization and Learning Vector Quantization under Bregman Divergences

Title : Convergence of Stochastic Vector Quantization and Learning Vector Quantization under Bregman Divergences
Authors :
Mavridis, Christos
Baras, John S.
Conference : International Federation of Automatic Control (IFAC) World Congress 2020
Date : July 12–17, 2020

Stochastic vector quantization methods have been extensively studied as supervised (classification) and unsupervised (clustering) learning algorithms, since they are online, data-driven, interpretable, robust, and fast to train and evaluate. As prototype-based methods, they depend on a dissimilarity measure which, when the mean value is used as the representative of each cluster, can be shown to belong to the family of Bregman divergences, a condition that is both necessary and sufficient. In this work, we investigate the convergence properties of stochastic vector quantization (VQ) and its supervised counterpart, Learning Vector Quantization (LVQ), under Bregman divergences. We employ the theory of stochastic approximation to study the conditions on the initialization and on the Bregman-divergence generating functions under which the algorithms converge to desired configurations. These results formally support the use of Bregman divergences, such as the Kullback-Leibler divergence, in vector quantization algorithms.
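
To make the setting concrete, the following is a minimal Python sketch of stochastic VQ and LVQ1-style updates in which prototype assignment uses a Bregman divergence, here the generalized Kullback-Leibler divergence generated by phi(x) = sum_i x_i log x_i. All function names, the step-size schedule, and the data are illustrative assumptions, not the paper's implementation. The winner-take-all recursion mu <- mu + lr * (x - mu) reflects the fact that, for any Bregman divergence, the optimal cluster representative is the mean, and the decreasing step sizes satisfy the standard Robbins-Monro conditions from stochastic approximation.

```python
# Illustrative sketch only: names (bregman_divergence, vq_step, lvq_step, phi)
# and parameters are hypothetical choices, not taken from the paper.
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Generating function of the generalized KL divergence (positive vectors only):
# phi(x) = sum_i x_i log x_i  =>  D_phi(x, y) = sum_i x_i log(x_i/y_i) - x_i + y_i.
phi = lambda x: np.sum(x * np.log(x))
grad_phi = lambda x: np.log(x) + 1.0

def nearest_prototype(x, prototypes):
    """Index of the prototype mu minimizing D_phi(x, mu)."""
    dists = [bregman_divergence(phi, grad_phi, x, mu) for mu in prototypes]
    return int(np.argmin(dists))

def vq_step(x, prototypes, lr):
    """One stochastic VQ update: move the winner toward the sample.
    Since the Bregman centroid of a cell is its mean, the recursion
    mu <- mu + lr * (x - mu) drives the winner toward that mean."""
    j = nearest_prototype(x, prototypes)
    prototypes[j] += lr * (x - prototypes[j])

def lvq_step(x, label, prototypes, proto_labels, lr):
    """One LVQ1-style update: attract the winner on a label match,
    repel it otherwise (the winner is still chosen via D_phi)."""
    j = nearest_prototype(x, prototypes)
    sign = 1.0 if proto_labels[j] == label else -1.0
    prototypes[j] += sign * lr * (x - prototypes[j])

# Usage: an online stream of positive samples with Robbins-Monro step sizes
# (sum lr_t = infinity, sum lr_t^2 < infinity), e.g. lr_t = 1 / (t + 2).
rng = np.random.default_rng(0)
prototypes = [rng.random(3) + 0.5 for _ in range(4)]  # positive init for KL
for t in range(10_000):
    x = rng.random(3) + 0.5
    vq_step(x, prototypes, lr=1.0 / (t + 2))
```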

Download Full Paper