Online Deterministic Annealing for Classification and Clustering

Title : Online Deterministic Annealing for Classification and Clustering
Authors : Christos Mavridis and John S. Baras
Journal : IEEE Transactions on Neural Networks and Learning Systems, pp. 1-10, DOI: 10.1109/TNNLS.2021.3138676, January 7, 2022

Inherent in virtually every iterative machine learning algorithm is the problem of hyperparameter tuning, which includes three major design parameters: 1) the complexity of the model, e.g., the number of neurons in a neural network; 2) the initial conditions, which heavily affect the behavior of the algorithm; and 3) the dissimilarity measure used to quantify its performance. We introduce an online prototype-based learning algorithm that can be viewed as a progressively growing competitive-learning neural network architecture for classification and clustering. The learning rule of the proposed approach is formulated as an online gradient-free stochastic approximation algorithm that solves a sequence of appropriately defined optimization problems, simulating an annealing process. The annealing nature of the algorithm contributes to avoiding poor local minima, offers robustness with respect to the initial conditions, and provides a means to progressively increase the complexity of the learning model, through an intuitive bifurcation phenomenon. The proposed approach is interpretable, requires minimal hyperparameter tuning, and allows online control over the performance-complexity tradeoff. Finally, we show that Bregman divergences appear naturally as a family of dissimilarity measures that play a central role in both the performance and the computational complexity of the learning algorithm.
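To make the idea concrete, below is a minimal Python sketch of deterministic-annealing clustering in the spirit of the abstract: prototypes are duplicated and slightly perturbed at each temperature level (the bifurcation mechanism), updated on streaming samples by gradient-free stochastic approximation, and merged again if they fail to separate. It uses the squared Euclidean distance, which is a Bregman divergence; the function name `oda_clustering`, the temperature schedule, the constant step size, and all other constants are illustrative assumptions, not the authors' implementation or parameter choices.

```python
import numpy as np

def oda_clustering(stream, T_schedule=(2.0, 1.0, 0.5, 0.25), steps_per_T=2000,
                   lr=0.05, split_eps=1e-2, merge_tol=0.3, seed=0):
    """Minimal online deterministic-annealing clustering sketch (illustrative).

    Squared Euclidean distance (a Bregman divergence), a fixed temperature
    schedule, and a perturb-and-split heuristic to grow the codebook; a
    constant step size is used here for simplicity, whereas a stochastic
    approximation analysis would call for a decreasing one.
    """
    rng = np.random.default_rng(seed)
    x0 = next(stream)
    d = x0.shape[0]
    # Start with a single prototype; rho are prior weights, sigma = rho * mu.
    rho = np.array([1.0])
    sigma = x0.copy().reshape(1, d)

    for T in T_schedule:
        # Duplicate and perturb prototypes: below a critical temperature the
        # copies drift apart (bifurcation); otherwise they stay together and
        # are pruned after this temperature level.
        mu = sigma / rho[:, None]
        rho = np.repeat(rho / 2.0, 2)
        mu = np.repeat(mu, 2, axis=0) + split_eps * rng.standard_normal((len(rho), d))
        sigma = rho[:, None] * mu

        for _ in range(steps_per_T):
            x = next(stream)
            mu = sigma / rho[:, None]
            # Gibbs association probabilities at temperature T.
            dist = np.sum((mu - x) ** 2, axis=1)
            w = rho * np.exp(-(dist - dist.min()) / T)
            p = w / w.sum()
            # Gradient-free stochastic-approximation updates: for Bregman
            # divergences the optimal prototype is a conditional mean, so no
            # gradient of the dissimilarity measure is needed.
            rho += lr * (p - rho)
            sigma += lr * (p[:, None] * x - sigma)

        # Merge prototypes that did not separate at this temperature.
        mu = sigma / rho[:, None]
        keep = []
        for i in range(len(mu)):
            if all(np.linalg.norm(mu[i] - mu[j]) > merge_tol for j in keep):
                keep.append(i)
        rho, sigma = rho[keep], sigma[keep]
        rho /= rho.sum()

    return sigma / rho[:, None]


if __name__ == "__main__":
    # Toy data stream: three well-separated Gaussian blobs in 2-D.
    rng = np.random.default_rng(1)
    centers = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 5.0]])

    def stream():
        while True:
            c = centers[rng.integers(len(centers))]
            yield c + 0.3 * rng.standard_normal(2)

    print(oda_clustering(stream()))
```

In this sketch the number of prototypes is never fixed in advance: lowering the temperature is what lets duplicated prototypes separate, so model complexity grows only as far as the chosen schedule allows, which is one way to read the performance-complexity tradeoff mentioned in the abstract.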

Download Full Paper