Nonlinear Markov Processes in Big Networks
Big networks encompass various large-scale networks in many practical areas,
such as computer networks, the internet of things, cloud computing,
manufacturing systems, transportation networks, and healthcare systems. This
paper analyzes such big networks, applying mean-field theory and nonlinear
Markov processes to set up a broad class of nonlinear continuous-time
block-structured Markov processes that can model many practical stochastic
systems. Firstly, a nonlinear Markov process is derived from a large number of
interacting big networks with symmetric interactions, each of which is
described as a continuous-time block-structured Markov process. Secondly, some
effective algorithms are given for computing the fixed points of the nonlinear
Markov process by means of the UL-type RG-factorization. Finally, the Birkhoff
center, Lyapunov functions, and relative entropy are used to analyze the
stability or metastability of the big network, and several interesting open
problems are proposed with detailed interpretation. We believe that the results
given in this paper can be useful and effective in the study of big networks.
Comment: 28 pages in Special Matrices; 201
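The fixed-point computation can be illustrated on a toy mean-field model. The two-state chain, its rate function, and the plain fixed-point iteration below are illustrative assumptions, not the paper's UL-type RG-factorization algorithm: the generator Q(pi) depends on the current distribution pi, so the stationary equation pi Q(pi) = 0 becomes a fixed-point problem.

```python
import numpy as np

# Toy nonlinear (mean-field) Markov process on two states: the transition
# rates depend on the current distribution pi, so pi Q(pi) = 0 is a
# fixed-point problem.  The rate function below is a hypothetical
# illustration of a symmetric "imitation" interaction.
def generator(pi):
    a = 1.0 + 2.0 * pi[1]   # rate 0 -> 1 grows with the mass in state 1
    b = 1.5                 # rate 1 -> 0 is constant
    return np.array([[-a, a], [b, -b]])

def stationary(Q):
    # Solve pi Q = 0 with sum(pi) = 1 for a 2-state generator.
    a, b = Q[0, 1], Q[1, 0]
    return np.array([b, a]) / (a + b)

def fixed_point(pi0, tol=1e-12, max_iter=1000):
    # Plain fixed-point iteration: pi_{k+1} = stationary(Q(pi_k)).
    pi = np.asarray(pi0, dtype=float)
    for _ in range(max_iter):
        nxt = stationary(generator(pi))
        if np.max(np.abs(nxt - pi)) < tol:
            return nxt
        pi = nxt
    return pi

pi_star = fixed_point([0.5, 0.5])
```

For this rate function the iteration map is a contraction, so it converges from any starting distribution; the paper's algorithms handle the general block-structured case.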
The Role of Normalization in the Belief Propagation Algorithm
Many important problems in statistical physics and computer science can be
expressed as the computation of marginal probabilities over a Markov Random
Field. The belief propagation algorithm, which is an exact procedure to compute
these marginals when the underlying graph is a tree, has gained popularity
as an efficient way to approximate them in the more general case. In this
paper, we focus on an aspect of the algorithm that has received relatively
little attention in the literature: the effect of the normalization of the
messages. We show in particular that, for a large class of normalization
strategies, it is possible to focus only on belief convergence. Following this,
we express the necessary and sufficient conditions for local stability of a
fixed point in terms of the graph structure and the belief values at the fixed
point. We also make explicit some connections between the normalization
constants and the underlying Bethe free energy.
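The message normalization in question can be seen in a minimal sum-product sketch. The 3-node chain, potentials, and schedule below are illustrative assumptions; the point is that normalizing each message to sum to one leaves the beliefs unchanged, which on a tree are the exact marginals:

```python
import numpy as np
from itertools import product

# Sum-product belief propagation on a 3-node chain (a tree).  Each message
# is normalized to sum to one at every update; on a tree the resulting
# beliefs are the exact marginals regardless of that normalization.
psi = {  # pairwise potentials on edges (0,1) and (1,2), arbitrary values
    (0, 1): np.array([[1.0, 0.5], [0.5, 2.0]]),
    (1, 2): np.array([[1.5, 1.0], [0.2, 1.0]]),
}
phi = [np.array([1.0, 0.6]), np.array([0.8, 1.0]), np.array([1.0, 1.0])]
neighbors = {0: [1], 1: [0, 2], 2: [1]}

def pot(i, j):
    # Potential oriented so rows index x_i and columns index x_j.
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# m[(i, j)][x_j] is the message from i to j, initialized uniform.
m = {(i, j): np.ones(2) / 2 for i in neighbors for j in neighbors[i]}

for _ in range(10):  # synchronous sweeps; a tree converges in a few
    new = {}
    for (i, j) in m:
        prod_in = phi[i].copy()
        for k in neighbors[i]:
            if k != j:
                prod_in = prod_in * m[(k, i)]
        msg = pot(i, j).T @ prod_in
        new[(i, j)] = msg / msg.sum()   # the normalization step
    m = new

def belief(i):
    b = phi[i].copy()
    for k in neighbors[i]:
        b = b * m[(k, i)]
    return b / b.sum()

def exact_marginal(i):
    # Brute-force marginal for comparison.
    p = np.zeros(2)
    for x in product([0, 1], repeat=3):
        w = phi[0][x[0]] * phi[1][x[1]] * phi[2][x[2]]
        w *= psi[(0, 1)][x[0], x[1]] * psi[(1, 2)][x[1], x[2]]
        p[x[i]] += w
    return p / p.sum()
```

The paper's contribution concerns loopy graphs, where different normalization strategies affect message convergence even though, at a fixed point, the beliefs coincide.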
How to Couple from the Past Using a Read-Once Source of Randomness
We give a new method for generating perfectly random samples from the
stationary distribution of a Markov chain. The method is related to coupling
from the past (CFTP), but only runs the Markov chain forwards in time, and
never restarts it at previous times in the past. The method is also related to
an idea known as PASTA (Poisson arrivals see time averages) in the operations
research literature. Because the new algorithm can be run using a read-once
stream of randomness, we call it read-once CFTP. The memory and time
requirements of read-once CFTP are on par with the requirements of the usual
form of CFTP, and for a variety of applications the requirements may be
noticeably less. Some perfect sampling algorithms for point processes are based
on an extension of CFTP known as coupling into and from the past; for
completeness, we give a read-once version of coupling into and from the past,
but it remains impractical. For these point process applications, we give an
alternative coupling method with which read-once CFTP may be used efficiently.
Comment: 28 pages, 2 figures
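For orientation, here is a sketch of the classical CFTP scheme that the abstract contrasts with. The monotone birth-death chain and doubling schedule below are standard illustrative choices, not the paper's read-once construction; note how plain CFTP must store and reuse past randomness, which is exactly what read-once CFTP avoids:

```python
import random

# Classic coupling-from-the-past (Propp-Wilson) for a monotone birth-death
# chain on {0, ..., N}: run coupled copies from the top and bottom states,
# restarting further back in the past with REUSED randomness, until they
# coalesce at time 0.  The returned state is an exact stationary sample.
N = 4
P_UP = 0.3  # probability of a +1 step (else -1), reflecting at 0 and N

def step(x, u):
    # Monotone update rule driven by a shared uniform u.
    if u < P_UP:
        return min(x + 1, N)
    return max(x - 1, 0)

def cftp(rng):
    T = 1
    us = []  # randomness for times -1, -2, ...; reused across restarts
    while True:
        while len(us) < T:
            us.append(rng.random())
        lo, hi = 0, N
        for t in range(T, 0, -1):   # run from time -T up to time 0
            u = us[t - 1]           # us[t-1] drives the step at time -t
            lo, hi = step(lo, u), step(hi, u)
        if lo == hi:
            return lo               # coalesced: exact stationary sample
        T *= 2                      # restart from further in the past

rng = random.Random(0)
samples = [cftp(rng) for _ in range(2000)]
```

The stationary law of this chain is geometric with ratio 0.3/0.7 (detailed balance), so the empirical mean of the samples can be checked against the closed form.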
Stabilizing Randomly Switched Systems
This article is concerned with stability analysis and stabilization of
randomly switched systems under a class of switching signals. The switching
signal is modeled as a jump stochastic (not necessarily Markovian) process
independent of the system state; it selects, at each instant of time, the
active subsystem from a family of systems. Sufficient conditions for stochastic
stability (almost sure, in the mean, and in probability) of the switched system
are established when the subsystems do not possess control inputs, and not
every subsystem is required to be stable. These conditions are employed to
design stabilizing feedback controllers when the subsystems are affine in
control. The analysis is carried out with the aid of multiple Lyapunov-like
functions, and the analysis results together with universal formulae for
feedback stabilization of nonlinear systems constitute our primary tools for
control design.
Comment: 22 pages. Submitted
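The phenomenon that not every subsystem needs to be stable can be seen in a scalar sketch. The two modes, rates, and switching law below are illustrative assumptions, not the article's setup: subsystem 1 (dx/dt = -2x) is stable, subsystem 2 (dx/dt = +x) is not, yet alternating between them with i.i.d. mean-one exponential holding times gives almost-sure decay, because the average growth rate (-2 + 1)/2 = -0.5 is negative.

```python
import random

# Scalar randomly switched system: alternate between a stable and an
# unstable linear mode with exponential holding times, and measure the
# empirical Lyapunov exponent log|x(T)/x(0)| / T along each realization.
A = [-2.0, 1.0]          # growth rates of the two modes (illustrative)

def sample_log_growth(T, rng):
    t, mode, log_x = 0.0, 0, 0.0
    while t < T:
        hold = rng.expovariate(1.0)        # mean-1 holding time
        dt = min(hold, T - t)
        log_x += A[mode] * dt              # exact flow of dx/dt = a*x
        t += dt
        mode = 1 - mode                    # switch to the other subsystem
    return log_x

rng = random.Random(1)
growth = [sample_log_growth(200.0, rng) / 200.0 for _ in range(200)]
avg_rate = sum(growth) / len(growth)       # close to -0.5: a.s. decay
```

Since each mode spends half the time active on average, the realized exponent concentrates near -0.5; this is the averaging effect that the article's multiple Lyapunov-like functions make rigorous for general (non-Markovian) switching signals.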
Ergodicity for SDEs and approximations: Locally Lipschitz vector fields and degenerate noise
The ergodic properties of SDEs, and of various time discretizations of SDEs,
are studied. The ergodicity of SDEs is established using techniques from the
theory of Markov chains on general state spaces, such as those expounded by
Meyn and Tweedie. Application of these Markov chain results leads to
straightforward proofs of geometric ergodicity for a variety of SDEs,
including problems with degenerate noise and problems with locally Lipschitz
vector fields. Applications where this theory is useful include damped-driven
Hamiltonian problems (the Langevin equation), the Lorenz equation with
degenerate noise, and gradient systems. The same Markov chain theory is then
used to study time-discrete approximations of these SDEs. The two primary
ingredients for ergodicity are a minorization condition and a Lyapunov
condition. It is shown that the minorization condition is robust under
approximation. For globally Lipschitz vector fields this is also true of the
Lyapunov condition. However, in the locally Lipschitz case the Lyapunov
condition fails for explicit methods such as Euler-Maruyama; for pathwise
approximations it is, in general, only inherited by specially constructed
implicit discretizations. Examples of such discretizations based on backward
Euler methods are given, and the approximation of the Langevin equation is
studied in some detail.
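The failure of explicit methods for locally Lipschitz drifts can be seen in miniature. The test SDE dX = -X^3 dt + dW, the step size, and the horizon below are illustrative assumptions, not the paper's examples: explicit Euler-Maruyama blows up from large initial data, while a drift-implicit (backward Euler) step stays bounded.

```python
import math
import random

# Compare explicit Euler-Maruyama and drift-implicit (backward) Euler for
# dX = -X^3 dt + dW with a large initial condition and a fixed step size.
H = 0.1   # step size

def em_step(x, dw):
    # Explicit Euler-Maruyama: unstable once |1 - H*x^2| > 1.
    return x - H * x**3 + dw

def backward_euler_step(x, dw):
    # Solve y + H*y^3 = x + dw by Newton's method; the map y + H*y^3 is
    # strictly increasing, so the real root is unique.
    rhs = x + dw
    y = rhs
    for _ in range(50):
        f = y + H * y**3 - rhs
        y -= f / (1.0 + 3.0 * H * y * y)
    return y

rng = random.Random(2)
x_em, x_be = 100.0, 100.0          # large initial condition
for _ in range(100):
    dw = rng.gauss(0.0, math.sqrt(H))
    if abs(x_em) < 1e100:          # freeze EM once it has clearly diverged
        x_em = em_step(x_em, dw)
    x_be = backward_euler_step(x_be, dw)
```

The explicit iterate leaves any bounded set within a few steps, so no Lyapunov condition can hold for it, whereas the implicit iterate is immediately pulled back toward the noise scale; this is the dichotomy the abstract describes.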
A survey of random processes with reinforcement
The models surveyed include generalized P\'{o}lya urns, reinforced random
walks, interacting urn models, and continuous reinforced processes. Emphasis is
on methods and results, with sketches provided of some proofs. Applications are
discussed in statistics, biology, economics, and a number of other areas.
Comment: Published at http://dx.doi.org/10.1214/07-PS094 in the Probability
Surveys (http://www.i-journals.org/ps/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
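The simplest reinforced process in the survey's family can be sketched directly. The 1+1 initial composition and step counts below are illustrative choices: in the classical Polya urn, the fraction of one color is a bounded martingale and converges almost surely to a random limit, which for this start is uniform on (0, 1).

```python
import random

# Classical Polya urn: start with one red and one black ball; repeatedly
# draw a ball uniformly and return it with one more of the same color.
# The red fraction converges a.s. to a random limit (uniform for a 1+1
# start), so independent runs settle at very different values.
def polya_fraction(steps, rng):
    red, black = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1               # reinforcement: drawn color gains a ball
        else:
            black += 1
    return red / (red + black)

rng = random.Random(3)
limits = [polya_fraction(5000, rng) for _ in range(500)]
```

Across many runs the empirical mean of the limits is near 1/2 but the spread matches a uniform law (variance near 1/12), the signature of reinforcement: early draws are never forgotten.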