A simple stability condition for RED using TCP mean-field modeling
Congestion on the Internet is an old problem but still a subject of intensive
research. The TCP protocol, with its AIMD (Additive Increase, Multiplicative
Decrease) behavior, hides very challenging problems; one of them is to
understand the interaction between a large number of users with delayed
feedback. This article focuses on two modeling issues of TCP that proved
important for tackling concrete scenarios when implementing the model
proposed in [Baccelli McDonald Reynier 02]: first, the modeling of the maximum
TCP window size, since this maximum can be reached quickly in many practical
cases; second, the delay structure: the usual Little-like formula behaves
poorly when queuing delays are variable and may dramatically change the
evolution of the predicted queue size, making it useless for studying
drop-tail or RED (Random Early Detection) mechanisms. With the proposed TCP
modeling improvements, we can examine a concrete example where RED
should be used in FIFO routers instead of the default drop-tail behavior.
We study mathematically the fixed points of the window size distribution and
the local stability of RED. An interesting case arises when RED operates at
the onset of congestion: it then avoids unwanted loss of bandwidth and delay
variations.
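The AIMD behavior with a window-size cap discussed above can be sketched as a toy simulation. This is a minimal illustration, not the fluid model of the paper; the parameter names (`alpha`, `beta`, `w_max`) and the Bernoulli loss model are assumptions chosen for clarity.

```python
import random

def aimd_window(rounds, loss_prob, w_max, alpha=1.0, beta=0.5, seed=0):
    """Toy AIMD trace: additive increase by alpha per round,
    multiplicative decrease by factor beta on loss, with the
    window capped at w_max (the maximum TCP window size)."""
    rng = random.Random(seed)
    w, trace = 1.0, []
    for _ in range(rounds):
        if rng.random() < loss_prob:
            w = max(1.0, w * beta)      # multiplicative decrease on loss
        else:
            w = min(w_max, w + alpha)   # additive increase, capped
        trace.append(w)
    return trace

trace = aimd_window(rounds=1000, loss_prob=0.02, w_max=64)
print(sum(trace) / len(trace))  # average window size over the run
```

With a small loss probability the window repeatedly hits the cap, illustrating why modeling the maximum window size matters in practical scenarios.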
Analysis of Multiple Flows using Different High Speed TCP protocols on a General Network
We develop analytical tools for performance analysis of multiple TCP flows
(which could be using TCP CUBIC, TCP Compound, TCP New Reno) passing through a
multi-hop network. We first compute average window size for a single TCP
connection (using CUBIC or Compound TCP) under random losses. We then consider
two techniques to compute steady state throughput for different TCP flows in a
multi-hop network. In the first technique, we approximate the queues as M/G/1
queues. In the second technique, we use an optimization program whose solution
approximates the steady state throughput of the different flows. Our results
match well with ns2 simulations.
Comment: Submitted to Performance Evaluation
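The first technique above approximates the network queues as M/G/1 queues, for which the mean waiting time is given in closed form by the Pollaczek-Khinchine formula. The sketch below is a generic illustration of that formula, not the paper's full throughput computation.

```python
def mg1_mean_wait(lam, es, es2):
    """Pollaczek-Khinchine mean waiting time for an M/G/1 queue:
    W = lam * E[S^2] / (2 * (1 - rho)),  where rho = lam * E[S] < 1.
    lam: Poisson arrival rate; es, es2: first two moments of service time."""
    rho = lam * es
    if rho >= 1:
        raise ValueError("unstable queue: rho >= 1")
    return lam * es2 / (2 * (1 - rho))

# Sanity check against M/M/1 (exponential service, mean 1/mu):
# E[S] = 1/mu, E[S^2] = 2/mu^2, so W should equal rho / (mu - lam).
lam, mu = 0.5, 1.0
w = mg1_mean_wait(lam, 1 / mu, 2 / mu**2)
print(w)  # 1.0
```

The M/M/1 special case gives a quick consistency check on the moments fed into the approximation.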
Some simple but challenging Markov processes
In this note, we present a few examples of Piecewise Deterministic Markov
Processes and their long-time behavior. They share two important features: they
are related to concrete models (in biology, networks, chemistry, ...) and they
are mathematically rich. Their mathematical study relies on coupling methods,
spectral decomposition, PDE techniques, and functional inequalities. We also
relate these simple examples to recent and open problems.
Interacting multi-class transmissions in large stochastic networks
The mean-field limit of a Markovian model describing the interaction of
several classes of permanent connections in a network is analyzed. Each of the
connections has a self-adaptive behavior in that its transmission rate along
its route depends on the level of congestion of the nodes of the route. Since
several classes of connections going through the nodes of the network are
considered, an original mean-field result in a multi-class context is
established. It is shown that, as the number of connections goes to infinity,
the behavior of the different classes of connections can be represented by the
solution of an unusual nonlinear stochastic differential equation depending not
only on the sample paths of the process, but also on its distribution.
Existence and uniqueness results for the solutions of these equations are
derived. Properties of their invariant distributions are investigated and it is
shown that, under some natural assumptions, they are determined by the
solutions of a fixed-point equation in a finite-dimensional space.
Comment: Published in the Annals of Applied Probability
(http://www.imstat.org/aap/) by the Institute of Mathematical Statistics
(http://www.imstat.org) at http://dx.doi.org/10.1214/09-AAP614
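The limiting object above is a nonlinear (distribution-dependent, McKean-Vlasov-type) stochastic differential equation. A standard way to approximate such equations numerically is an interacting-particle Euler scheme, sketched below for an illustrative equation dX_t = -(X_t - E[X_t]) dt + dB_t; this equation and all parameters are assumptions for illustration, not the one derived in the paper.

```python
import math
import random

def mckean_vlasov_particles(n=2000, steps=200, dt=0.01, seed=0):
    """Euler scheme for the illustrative McKean-Vlasov SDE
    dX_t = -(X_t - E[X_t]) dt + dB_t, approximated by n interacting
    particles: the law-dependence E[X_t] is replaced by the
    empirical mean of the particle system."""
    rng = random.Random(seed)
    x = [rng.gauss(0.0, 1.0) for _ in range(n)]
    for _ in range(steps):
        m = sum(x) / n  # empirical mean, standing in for E[X_t]
        x = [xi - (xi - m) * dt + math.sqrt(dt) * rng.gauss(0.0, 1.0)
             for xi in x]
    return x

x = mckean_vlasov_particles()
print(sum(x) / len(x))  # empirical mean stays close to 0
```

As n grows, the empirical distribution of the particles converges to the law of the nonlinear equation, which is the propagation-of-chaos phenomenon underlying mean-field limits of this kind.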
Mean field convergence of a model of multiple TCP connections through a buffer implementing RED
RED (Random Early Detection) has been suggested when multiple TCP sessions
are multiplexed through a bottleneck buffer. The idea is to detect congestion
before the buffer overflows by dropping or marking packets with a probability
that increases with the queue length. The objectives are reduced packet loss,
higher throughput, reduced delay and reduced delay variation achieved through
an equitable distribution of packet loss and reduced synchronization. Baccelli,
McDonald and Reynier [Performance Evaluation 11 (2002) 77--97] have proposed a
fluid model for multiple TCP connections in the congestion avoidance regime
multiplexed through a bottleneck buffer implementing RED. The window sizes of
each TCP session evolve like independent dynamical systems coupled by the queue
length at the buffer. The key idea in [Performance Evaluation 11 (2002) 77--97]
is to consider the histogram of window sizes as a random measure coupled with
the queue. Here we prove the conjecture made in [Performance Evaluation 11
(2002) 77--97] that, as the number of connections tends to infinity, this
system converges to a deterministic mean-field limit comprising the window size
density coupled with a deterministic queue.
Comment: Published in the Annals of Applied Probability
(http://www.imstat.org/aap/) by the Institute of Mathematical Statistics
(http://www.imstat.org) at http://dx.doi.org/10.1214/105051605000000700
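The RED mechanism described above drops or marks packets with a probability increasing in the queue length. A minimal sketch of the classic piecewise-linear RED drop profile follows; the threshold and probability parameters (`min_th`, `max_th`, `p_max`) are illustrative values, not ones taken from the paper.

```python
def red_drop_prob(avg_q, min_th, max_th, p_max):
    """Classic RED drop/mark probability as a function of the
    (averaged) queue length: zero below min_th, rising linearly
    to p_max at max_th, and 1 above max_th."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return p_max * (avg_q - min_th) / (max_th - min_th)

print(red_drop_prob(5, 10, 30, 0.1))   # 0.0  (below min_th)
print(red_drop_prob(20, 10, 30, 0.1))  # 0.05 (halfway between thresholds)
print(red_drop_prob(40, 10, 30, 0.1))  # 1.0  (above max_th)
```

Because each connection's loss probability depends on the shared queue length through a function like this, the window sizes evolve as coupled dynamical systems, which is exactly the coupling driving the mean-field limit.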
Renormalization of radiobiological response functions by energy loss fluctuations and complexities in chromosome aberration induction: deactivation theory for proton therapy from cells to tumor control
We employ a multi-scale mechanistic approach to investigate radiation-induced
cell toxicities and deactivation mechanisms as a function of linear energy
transfer in hadron therapy. Our theoretical model consists of a system of
Markov chains on microscopic and macroscopic spatio-temporal landscapes, i.e.,
stochastic birth-death processes of cells in millimeter-scale colonies that
incorporate a coarse-grained driving force to account for microscopic
radiation-induced damage. The coupling, and hence the driving force in this
process, stems from nanometer-scale radiation-induced DNA damage that
incorporates the enzymatic end-joining repair and mis-repair mechanisms. We use
this model for a global fit of the high-throughput, high-accuracy
clonogenic cell-survival data acquired under exposure to therapeutic
scanned proton beams, an experimental design that considers γ-H2AX as
the biological endpoint and exhibits a maximum observed achievable dose and
LET, beyond which the majority of the cells undergo collective biological
deactivation processes. An estimate of the optimal dose and LET, calculated
from the tumor control probability by extension to cells per -size voxels,
is presented. We attribute the increase in the degree of complexity of
chromosome aberrations to variabilities in the observed biological responses
as the beam linear energy transfer (LET) increases, and verify the consistency
of the predicted cell death probability with the in-vitro cell survival assay
of approximately 100 non-small cell lung cancer (NSCLC) cells.
Analysis of a Reputation System for Mobile Ad-Hoc Networks with Liars
The application of decentralized reputation systems is a promising approach
to ensure cooperation and fairness, as well as to address random failures and
malicious attacks in Mobile Ad-Hoc Networks. However, they are potentially
vulnerable to liars. With our work, we provide a first step toward analyzing
the robustness of a reputation system based on a deviation test. Using a
mean-field approach to our stochastic process model, we show that liars have
no impact unless their number exceeds a certain threshold (a phase
transition). We give precise formulae for the critical values and thus provide
guidelines for an optimal choice of parameters.
Comment: 17 pages, 6 figures