Large deviations for a stochastic model of heat flow
We investigate a one dimensional chain of harmonic oscillators in which
neighboring sites have their energies redistributed randomly. The sites $-N$
and $N$ are in contact with thermal reservoirs at different temperatures
$\tau_-$ and $\tau_+$. Kipnis, Marchioro, and Presutti \cite{KMP} proved that
this model satisfies Fourier's law and that in the hydrodynamical scaling
limit, when $N \to \infty$, the stationary state has a linear energy density
profile $\bar\theta(u)$, $u \in [-1,1]$. We derive the large deviation
function $S(\theta(u))$ for the probability of finding, in the stationary
state, a profile $\theta(u)$ different from $\bar\theta(u)$. The function
$S(\theta)$ has striking similarities to, but also large differences from, the
corresponding one of the symmetric exclusion process. Like the latter it is
nonlocal and satisfies a variational equation. Unlike the latter it is not
convex and the Gaussian normal fluctuations are enhanced rather than suppressed
compared to the local equilibrium state. We also briefly discuss more general
models and find the features common to these two and other models whose
$S(\theta)$ is known.
Comment: 28 pages, 0 figures
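The linear stationary profile in question is simply the straight-line interpolation between the two reservoir temperatures. Writing the boundary temperatures as $\tau_-$ and $\tau_+$ (notation assumed here, not fixed by the abstract), it reads
\[
\bar\theta(u) \;=\; \frac{\tau_- + \tau_+}{2} \;+\; \frac{\tau_+ - \tau_-}{2}\, u,
\qquad u \in [-1,1],
\]
so that $\bar\theta(-1) = \tau_-$ and $\bar\theta(1) = \tau_+$.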
The large deviation approach to statistical mechanics
The theory of large deviations is concerned with the exponential decay of
probabilities of large fluctuations in random systems. These probabilities are
important in many fields of study, including statistics, finance, and
engineering, as they often yield valuable information about the large
fluctuations of a random system around its most probable state or trajectory.
In the context of equilibrium statistical mechanics, the theory of large
deviations provides exponential-order estimates of probabilities that refine
and generalize Einstein's theory of fluctuations. This review explores this and
other connections between large deviation theory and statistical mechanics, in
an effort to show that the mathematical language of statistical mechanics is
the language of large deviation theory. The first part of the review presents
the basics of large deviation theory, and works out many of its classical
applications related to sums of random variables and Markov processes. The
second part goes through many problems and results of statistical mechanics,
and shows how these can be formulated and derived within the context of large
deviation theory. The problems and results treated cover a wide range of
physical systems, including equilibrium many-particle systems, noise-perturbed
dynamics, nonequilibrium systems, as well as multifractals, disordered systems,
and chaotic systems. This review also covers many fundamental aspects of
statistical mechanics, such as the derivation of variational principles
characterizing equilibrium and nonequilibrium states, the breaking of the
Legendre transform for nonconcave entropies, and the characterization of
nonequilibrium fluctuations through fluctuation relations.
Comment: v1: 89 pages, 18 figures, pdflatex. v2: 95 pages, 20 figures, text,
figures and appendices added, many references cut, close to published version
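The exponential-order estimates the review describes can be made concrete with the simplest classical application it mentions, sums of i.i.d. random variables (Cramér's theorem): for a coin-flip mean, $P(S_n/n \ge a)$ decays like $e^{-n I(a)}$, where $I$ is the rate function. The following minimal sketch (illustrative only, not taken from the review) compares the exact binomial tail against the Bernoulli rate function:

```python
import math

def rate_bernoulli(a, p=0.5):
    """Cramer rate function for the mean of i.i.d. Bernoulli(p) variables."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def tail_prob(n, a, p=0.5):
    """Exact P(S_n >= a*n) for S_n ~ Binomial(n, p)."""
    k0 = math.ceil(a * n)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k0, n + 1))

a = 0.7
for n in (50, 200, 800):
    # Empirical exponential decay rate -log(P)/n approaches I(a) as n grows.
    est = -math.log(tail_prob(n, a)) / n
    print(n, round(est, 4), round(rate_bernoulli(a), 4))
```

The empirical rate $-\log P / n$ converges to $I(a)$ from above (the Chernoff bound $P \le e^{-nI(a)}$ holds for every $n$), with the gap shrinking like $\log n / n$.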
Asymptotic description of stochastic neural networks. I - existence of a Large Deviation Principle
We study the asymptotic law of a network of interacting neurons when the
number of neurons becomes infinite. The dynamics of the neurons is described by
a set of stochastic differential equations in discrete time. The neurons
interact through the synaptic weights which are Gaussian correlated random
variables. We describe the asymptotic law of the network when the number of
neurons goes to infinity. Unlike previous works which made the biologically
unrealistic assumption that the weights were i.i.d. random variables, we assume
that they are correlated. We introduce the process-level empirical measure of
the trajectories of the solutions to the equations of the finite network of
neurons and the averaged law (with respect to the synaptic weights) of the
trajectories of the solutions to the equations of the network of neurons. The
result is that the image law through the empirical measure satisfies a large
deviation principle with a good rate function. We provide an analytical
expression of this rate function in terms of the spectral representation of
certain Gaussian processes.
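In the standard formulation (generic notation here, not the paper's), a sequence of empirical measures $\hat\mu_N$ satisfies a large deviation principle with good rate function $H$ when, informally,
\[
P\big(\hat\mu_N \in A\big) \;\asymp\; \exp\Big(-N \inf_{\mu \in A} H(\mu)\Big),
\]
where $H$ is lower semicontinuous with compact level sets (the "goodness" condition, which guarantees that the infimum is attained on closed sets).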
Random Recurrent Neural Networks Dynamics
This paper is a review dealing with the study of large size random recurrent
neural networks. The connection weights are selected according to a probability
law and it is possible to predict the network dynamics at a macroscopic scale
using an averaging principle. After a first introductory section, section 1
reviews the various models from the points of view of single-neuron
dynamics and of global network dynamics. A summary of notations is
presented, which is quite helpful for the sequel. In section 2, mean-field
dynamics is developed.
The probability distribution characterizing global dynamics is computed. In
section 3, some applications of mean-field theory to the prediction of chaotic
regime for Analog Formal Random Recurrent Neural Networks (AFRRNN) are
displayed. The case of an AFRRNN with a homogeneous population of neurons is
studied in section 4. Then, a two-population model is studied in section 5. The
occurrence of a cyclo-stationary chaos is displayed using the results of
\cite{Dauce01}. In section 6, an insight into the application of mean-field
theory to IF networks is given using the results of \cite{BrunelHakim99}.
Comment: Review paper, 36 pages, 5 figures
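The kind of network these mean-field results concern can be illustrated with a toy discrete-time simulation. The sketch below is a generic random recurrent network with tanh neurons, not the AFRRNN models of the review; the weight scaling $g/\sqrt{N}$ and the gain $g = 1.5$ (above the classical transition at $g = 1$) are assumptions made for illustration:

```python
import math
import random

def simulate(N=100, T=30, g=1.5, seed=0):
    """Iterate x_{t+1} = tanh(W x_t) for a random recurrent network.

    Weights are i.i.d. Gaussian with standard deviation g/sqrt(N), the
    classical scaling under which mean-field limits are taken.
    """
    rng = random.Random(seed)
    W = [[rng.gauss(0.0, g / math.sqrt(N)) for _ in range(N)] for _ in range(N)]
    x = [rng.gauss(0.0, 1.0) for _ in range(N)]
    for _ in range(T):
        x = [math.tanh(sum(W[i][j] * x[j] for j in range(N))) for i in range(N)]
    return x

x = simulate()
# Empirical second moment across the population: a mean-field order parameter
# that stays bounded away from zero when g exceeds the transition value 1.
q = sum(v * v for v in x) / len(x)
print(q)
```

For $g > 1$ the quiescent fixed point is unstable, so the population activity does not die out; mean-field theory predicts the limiting value of such order parameters as $N \to \infty$.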
Sample path large deviations for multiclass feedforward queueing networks in critical loading
We consider multiclass feedforward queueing networks with first-in-first-out
and priority service disciplines at the nodes, and class-dependent
deterministic routing between nodes. The random behavior of the network is
constructed from cumulative arrival and service time processes which are
assumed to satisfy an appropriate sample path large deviation principle. We
establish logarithmic asymptotics of large deviations for waiting time, idle
time, queue length, departure and sojourn-time processes in critical loading.
This transfers similar results from Puhalskii about single class queueing
networks with feedback to multiclass feedforward queueing networks, and
complements diffusion approximation results from Peterson. An example with
renewal inter-arrival and service time processes yields the rate function of a
reflected Brownian motion. The model directly captures stationary situations.
Comment: Published at http://dx.doi.org/10.1214/105051606000000439 in the
Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute
of Mathematical Statistics (http://www.imstat.org)
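The waiting-time processes studied in the paper generalize the textbook single-server case, where waiting times obey the Lindley recursion and their stationary tail decays exponentially. The sketch below is only a single-class M/M/1 illustration of that recursion, not the paper's multiclass feedforward model; the parameter values are arbitrary:

```python
import math
import random

def lindley_waits(n, lam=1.0, mu=1.25, seed=0):
    """Waiting times of an M/M/1 FIFO queue via the Lindley recursion."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n):
        s = rng.expovariate(mu)   # service time of the current customer
        a = rng.expovariate(lam)  # inter-arrival time to the next customer
        w = max(0.0, w + s - a)   # Lindley: W_{k+1} = max(0, W_k + S_k - A_{k+1})
        waits.append(w)
    return waits

waits = lindley_waits(100_000)
frac = sum(1 for w in waits if w > 10.0) / len(waits)
# For M/M/1 the stationary tail is P(W > x) = rho * exp(-(mu - lam) * x),
# so the logarithmic decay rate is mu - lam = 0.25 here (rho = lam/mu = 0.8).
est_rate = -math.log(frac / 0.8) / 10.0
print(frac, est_rate)
```

The empirically estimated decay rate should sit near $\mu - \lambda$; large deviation principles of the kind established in the paper control exactly such exponential tail behavior, but for whole sample paths rather than a single marginal.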