Markov Processes with State-Dependent Failure Rates and Application to RED and TCP Window Dynamics
Baras, John S.
Date: June 18 - June 20, 2003
This paper presents a mathematical technique for computing the stationary distribution of Markov processes that evolve deterministically between arbitrarily distributed 'failure events'. The key innovation is a state-dependent time-rescaling technique under which the rescaled process is described by a Poisson-interrupted stochastic differential equation. The technique is first applied to compute the stationary window distribution of a TCP flow performing idealized classical congestion avoidance under variable, state-dependent packet loss, and then to study the distribution of a TCP flow performing generalized congestion avoidance. We show how the stochastic differential equation can be solved by a rapidly convergent numerical technique to obtain the stationary distribution in the rescaled (subjective) time, and present the rescalings needed to recover the distribution of the original Markov process. We demonstrate how this analysis can be used to compute the window distribution of a TCP flow interacting with a RED, ERD, or ECN queue, with or without minimum assured throughput guarantees.
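To make the setting concrete, the following sketch simulates the dynamics the abstract describes: an idealized TCP congestion-avoidance window (additive increase of 1/w per acknowledgement, multiplicative decrease on loss) driven by a state-dependent loss probability, as a RED-style queue would impose. This is a plain Monte Carlo estimate of the stationary window distribution, not the paper's analytical time-rescaling method; the drop profile `red_like` and all its thresholds are illustrative assumptions.

```python
import random

def simulate_tcp_window(p_loss, steps=100_000, w0=1.0, seed=0):
    """Monte Carlo sketch of idealized TCP congestion avoidance.

    The window grows by 1/w per acknowledged packet (additive
    increase) and halves on a loss, where the loss probability
    p_loss(w) depends on the current window -- a stand-in for
    RED-style state-dependent drops. Returns the trajectory of
    window samples, whose histogram approximates the stationary
    window distribution.
    """
    rng = random.Random(seed)
    w = w0
    samples = []
    for _ in range(steps):
        if rng.random() < p_loss(w):
            w = max(1.0, w / 2.0)   # multiplicative decrease on loss
        else:
            w += 1.0 / w            # additive increase per ack
        samples.append(w)
    return samples

def red_like(w, w_min=5.0, w_max=30.0, p_max=0.1):
    """Illustrative RED-like drop profile: the drop probability rises
    linearly with the window between a lower and an upper threshold
    (all numbers are hypothetical, chosen only for the sketch)."""
    if w <= w_min:
        return 0.0
    if w >= w_max:
        return p_max
    return p_max * (w - w_min) / (w_max - w_min)

samples = simulate_tcp_window(red_like)
mean_w = sum(samples) / len(samples)
```

A quick equilibrium check on the sketch: the drift balances where the expected decrease `p(w) * w/2` matches the expected increase `1/w`, i.e. `p(w) ~ 2/w^2`; with the profile above this balance point sits near `w ~ 10`, and the empirical mean of `samples` lands in that neighborhood.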