Title: Convergence results for the agreement problem on Markovian random topologies
Authors:
Baras, John S.
Matei, Ion

Conference: 2011 IFAC World Congress, pp. 8860-8865
Date: August 28 - September 02, 2011

We study the linear distributed asymptotic agreement (consensus) problem for a network of dynamic agents whose communication network is modeled by a randomly switching graph. The switching is governed by a finite-state Markov process, with each topology corresponding to a state of the process. We address both the case where the agents' dynamics are expressed in continuous time and the case where they are expressed in discrete time. We show that, if the consensus matrices are doubly stochastic, convergence to average consensus is achieved in the mean-square and almost-sure sense if and only if the graph formed by the union of the graphs corresponding to the states of the Markov process is strongly connected. The aim of this paper is to show how techniques from the theory of Markovian jump linear systems, combined with results inspired by matrix and graph theory, can be used to prove convergence results for stochastic consensus problems.
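
For intuition, the following is a minimal Python sketch of the discrete-time setting described above: agents iterate x(k+1) = W_{theta(k)} x(k), where theta(k) is a finite-state Markov chain and each W_i is a doubly stochastic matrix associated with one graph. The two example topologies, the Metropolis-style weights, and the transition matrix are illustrative assumptions, not taken from the paper; they are chosen so that neither graph is connected on its own but their union is, which is the condition under which the states should converge to the average.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # number of agents

# Two example topologies (assumed for illustration); neither is connected
# by itself, but their union forms the cycle 0-1-2-3-0, which is connected.
edges = [
    [(0, 1), (2, 3)],   # Markov state 0
    [(1, 2), (3, 0)],   # Markov state 1
]

def doubly_stochastic_weights(edge_list, n):
    """Build a symmetric, doubly stochastic weight matrix from an edge list."""
    W = np.eye(n)
    for i, j in edge_list:
        w = 1.0 / 3.0            # small enough to keep diagonals nonnegative
        W[i, j] += w
        W[j, i] += w
        W[i, i] -= w
        W[j, j] -= w
    return W

W = [doubly_stochastic_weights(e, n) for e in edges]

# Transition matrix of the switching Markov chain (assumed, irreducible).
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

x = rng.normal(size=n)            # initial agent states
avg = x.mean()                    # average consensus target
theta = 0                         # initial Markov state

for k in range(200):
    x = W[theta] @ x                      # consensus step on current topology
    theta = rng.choice(2, p=P[theta])     # Markovian topology switch

print("initial average:", avg)
print("final states   :", x)      # all entries should be close to avg
```

Running the sketch shows every agent's state approaching the initial average, consistent with the mean-square and almost-sure convergence result stated in the abstract for doubly stochastic consensus matrices and a strongly connected union graph.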

Download Full Paper